Selective inference using randomized group lasso estimators for general models (2306.13829v3)
Abstract: Selective inference methods are developed for group lasso estimators and apply to a wide class of distributions and loss functions. The method accommodates exponential family distributions as well as, for example, quasi-likelihood modeling for overdispersed count data, and it allows for categorical or grouped covariates in addition to continuous covariates. A randomized group-regularized optimization problem is studied. The added randomization allows us to construct a post-selection likelihood which we show to be adequate for selective inference when conditioning on the event that the group lasso selects the grouped covariates. This likelihood also provides a selective point estimator that accounts for the selection by the group lasso. Confidence regions for the regression parameters in the selected model take the form of Wald-type regions and are shown to have bounded volume. The selective inference method for the group lasso is illustrated on data from the National Health and Nutrition Examination Survey, while simulations showcase its behaviour and its favourable comparison with other methods.
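The key ingredient described in the abstract is the randomized group-regularized optimization problem: a group lasso objective perturbed by an independent Gaussian term whose known distribution makes the post-selection likelihood tractable. The snippet below is a minimal sketch of that idea, not the paper's implementation: it assumes a squared-error loss (the paper covers general losses such as quasi-likelihoods), and the function name, the randomization scale `tau`, and the square-root-of-group-size penalty weights are illustrative choices. It solves the randomized group lasso by proximal gradient descent with block soft-thresholding.

```python
import numpy as np

def randomized_group_lasso(X, y, groups, lam, tau=1.0, n_iter=1000, seed=0):
    """Proximal gradient descent for a randomized group lasso (sketch):
        min_b 0.5*||y - X b||^2 + lam * sum_g sqrt(p_g) * ||b_g||_2 - omega' b,
    where omega ~ N(0, tau^2 I) is the added randomization variable."""
    groups = np.asarray(groups)
    n, p = X.shape
    rng = np.random.default_rng(seed)
    omega = tau * rng.standard_normal(p)     # randomization draw; kept for inference
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the smooth part
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) - omega  # gradient of the smooth (randomized) part
        z = beta - step * grad
        for g in np.unique(groups):          # prox step: block soft-thresholding
            idx = groups == g
            thresh = step * lam * np.sqrt(idx.sum())
            norm_zg = np.linalg.norm(z[idx])
            z[idx] = 0.0 if norm_zg <= thresh else (1.0 - thresh / norm_zg) * z[idx]
        beta = z
    selected = [g for g in np.unique(groups) if np.linalg.norm(beta[groups == g]) > 1e-10]
    return beta, omega, selected

# Toy usage: four groups of three covariates; only the first group is active.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 12))
groups = np.repeat(np.arange(4), 3)
y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + rng.standard_normal(200)
beta_hat, omega, selected = randomized_group_lasso(X, y, groups, lam=10.0)
print("selected groups:", selected)
```

Conditioning on the selected set while exploiting the known Gaussian law of `omega` is what yields the post-selection likelihood, the selective point estimator, and the Wald-type confidence regions described in the abstract.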
- Acosta, A. D. (1992). Moderate deviations and associated Laplace approximations for sums of independent random vectors. Transactions of the American Mathematical Society, 329(1):357–375.
- Bachoc, F., Preinerstorfer, D., and Steinberger, L. (2020). Uniformly valid confidence intervals post-model-selection. The Annals of Statistics, 48(1):440–463.
- Berk, R., Brown, L., Buja, A., Zhang, K., and Zhao, L. (2013). Valid post-selection inference. The Annals of Statistics, 41(2):802–837.
- Buckland, S. T., Burnham, K. P., and Augustin, N. H. (1997). Model selection: an integral part of inference. Biometrics, 53:603–618.
- Dai, C., Lin, B., Xing, X., and Liu, J. S. (2023). A scale-free approach for false discovery rate control in generalized linear models. Journal of the American Statistical Association, 118(543):1551–1565.
- Davison, A. C. (2003). Statistical Models. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press.
- Dembo, A. and Zeitouni, O. (2009). Large Deviations Techniques and Applications. Springer, Berlin, Heidelberg.
- Dharamshi, A., Neufeld, A., Motwani, K., Gao, L. L., Witten, D., and Bien, J. (2023). Generalized data thinning using sufficient statistics. arXiv preprint arXiv:2303.12931.
- Draper, D. (1995). Assessment and propagation of model uncertainty. Journal of the Royal Statistical Society, Series B, 57:45–97. With discussion and a reply by the author.
- Duy, V. N. L. and Takeuchi, I. (2022). More powerful conditional selective inference for generalized lasso by parametric programming. Journal of Machine Learning Research, 23(300):1–37.
- Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. arXiv preprint arXiv:1410.2597.
- Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press.
- Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73.
- Harville, D. A. (1997). Matrix Algebra From a Statistician's Perspective. Springer, New York, NY.
- He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388.
- Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors.
- Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217.
- Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306.
- Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the Stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23.
- Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857.
- La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130.
- Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927.
- Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2nd edition.
- Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, to appear.
- Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367.
- Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv preprint arXiv:2301.06162.
- Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society, Series B, 70(1):53–71.
- Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv preprint arXiv:1904.08018.
- Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341.
- Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940.
- Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49.
- Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, to appear.
- Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear.
- Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824.
- Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614.
- Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society, Series C (Applied Statistics), 54(3):507–554.
- Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series.
- Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th International Conference on Machine Learning, pages 848–855.
- Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695–1742.
- Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and De Bastiani, F. (2020). Flexible Regression and Smoothing: Using GAMLSS in R. Chapman & Hall.
- Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525.
- Taylor, J. and Tibshirani, R. (2018). Post-selection inference for $\ell_1$-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61.
- Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710.
- van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical Science, 27(4):469–480.
- White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge.
- Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67.
- Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255.
- Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
(2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. 
Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). 
Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. 
Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. 
Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). 
Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. 
Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. 
Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. 
(33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. 
Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. 
Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Dembo, A. and Zeitouni, O. (2009). Large Deviations Techniques and Applications. Springer Berlin, Heidelberg. Dharamshi et al., (2023) Dharamshi, A., Neufeld, A., Motwani, K., Gao, L. L., Witten, D., and Bien, J. (2023). Generalized data thinning using sufficient statistics. arXiv preprint arXiv:2303.12931. Draper, (1995) Draper, D. (1995). Assessment and propagation of model uncertainty. Journal of the Royal Statistical Society, Series B, 57:45–97. With discussion and a reply by the author. Duy and Takeuchi, (2022) Duy, V. N. L. and Takeuchi, I. (2022). More powerful conditional selective inference for generalized lasso by parametric programming. Journal of Machine Learning Research, 23(300):1–37. Fithian et al., (2014) Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. 
Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Dharamshi, A., Neufeld, A., Motwani, K., Gao, L. L., Witten, D., and Bien, J. (2023). Generalized data thinning using sufficient statistics. arXiv preprint arXiv:2303.12931. Draper, (1995) Draper, D. (1995). Assessment and propagation of model uncertainty. Journal of the Royal Statistical Society, Series B, 57:45–97. With discussion and a reply by the author. Duy and Takeuchi, (2022) Duy, V. N. L. and Takeuchi, I. (2022). More powerful conditional selective inference for generalized lasso by parametric programming. Journal of Machine Learning Research, 23(300):1–37. Fithian et al., (2014) Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. 
Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. 
Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Draper, D. (1995). Assessment and propagation of model uncertainty. Journal of the Royal Statistical Society, Series B, 57:45–97. With discussion and a reply by the author. Duy and Takeuchi, (2022) Duy, V. N. L. and Takeuchi, I. (2022). More powerful conditional selective inference for generalized lasso by parametric programming. Journal of Machine Learning Research, 23(300):1–37. Fithian et al., (2014) Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. 
Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. 
Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Duy, V. N. L. and Takeuchi, I. (2022). More powerful conditional selective inference for generalized lasso by parametric programming. Journal of Machine Learning Research, 23(300):1–37. Fithian et al., (2014) Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). 
On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. 
Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). 
A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. 
(2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. 
Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. 
Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. 
Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). 
Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. 
Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). 
Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. 
A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. 
and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. 
Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. 
Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). 
Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. 
arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. 
Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. 
The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. 
Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. 
van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Draper, D. (1995). Assessment and propagation of model uncertainty. Journal of the Royal Statistical Society, Series B, 57:45–97. With discussion and a reply by the author. Duy and Takeuchi, (2022) Duy, V. N. L. and Takeuchi, I. (2022). More powerful conditional selective inference for generalized lasso by parametric programming. Journal of Machine Learning Research, 23(300):1–37. Fithian et al., (2014) Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Duy, V. N. L. and Takeuchi, I. (2022). More powerful conditional selective inference for generalized lasso by parametric programming. Journal of Machine Learning Research, 23(300):1–37. Fithian et al., (2014) Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Fithian, W., Sun, D., and Taylor, J. (2014). Optimal inference after model selection. ArXiv.1410.2597. Golub and Van Loan, (1996) Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. 
Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. 
van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. 
Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). 
Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Harville, D. A. (1997). 
Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). 
Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. 
and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. 
In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. 
Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. 
The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). 
Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. 
Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. 
Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. 
(31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. 
Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). 
Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. 
(33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). 
Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. 
(33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. 
Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. 
Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. 
arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. 
Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). 
LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). 
Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). 
The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. 
Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. 
Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. 
Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. 
Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). 
Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. 
Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). 
Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. 
Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). 
Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. 
Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. 
Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. 
Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. 
arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. 
arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. 
Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. 
Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. 
Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. 
(2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. 
S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. 
Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). 
Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. 
Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). 
Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. 
Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. 
Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. 
and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. 
(33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Golub, G. and Van Loan, C. (1996). Matrix Computations. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins University Press. Groll et al., (2019) Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). 
LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). 
Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Groll, A., Hambuckers, J., Kneib, T., and Umlauf, N. (2019). LASSO-type penalization in the framework of generalized additive models for location, scale and shape. Computational Statistics & Data Analysis, 140:59–73. Harville, (1997) Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). 
Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). 
Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. 
(2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. 
Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). 
Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). 
Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. 
Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). 
An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. 
and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. 
and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. 
Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. 
Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). 
Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. 
Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. 
Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. 
and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. 
Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. 
Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. 
Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). 
Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. 
and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. 
White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. 
The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. 
and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. 
Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. 
Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). 
Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. 
Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. 
Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. 
Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. 
and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. 
Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. 
Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. 
Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). 
(1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. 
D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. 
Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. 
van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). 
Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. 
Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. 
The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. 
Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. 
(2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. 
Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. 
(33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. 
Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. 
Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. 
Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). 
Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. 
Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). 
Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. 
Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). 
Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. 
Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. 
Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. 
Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. 
arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. 
- Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer New York, NY. He et al., (2023) He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). 
Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. He, X., Pan, X., Tan, K. M., and Zhou, W.-X. (2023). Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. 
The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). 
Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. 
Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. 
Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). 
Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. 
Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. 
Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). 
Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Smoothed quantile regression with large-scale inference. Journal of Econometrics, 232:367–388. Hjort and Claeskens, (2003) Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. 
(2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hjort, N. L. and Claeskens, G. (2003). Frequentist model average estimators. Journal of the American Statistical Association, 98:879–899. With discussion and a rejoinder by the authors. Hurvich and Tsai, (1990) Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. 
Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. 
Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Hurvich, C. M. and Tsai, C.-L. (1990). The impact of model selection on inference in linear regression. The American Statistician, 44:214–217. Kim et al., (2022) Kim, S. S., Liao, Y.-T., and Ramanan, K. (2022). An asymptotic thin shell condition and large deviations for random multidimensional projections. Advances in Applied Mathematics, 134:102306. Kim and Ramanan, (2023) Kim, S. S. and Ramanan, K. (2023). Large deviation principles induced by the stiefel manifold, and random multidimensional projections. Electronic Journal of Probability, 28:1–23. Kivaranovic and Leeb, (2021) Kivaranovic, D. and Leeb, H. (2021). On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. 
Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). 
Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). 
Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. 
Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. 
Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367.
- Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv preprint arXiv:2301.06162.
- Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society, Series B, 70(1):53–71.
- Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv preprint arXiv:1904.08018.
- Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341.
- Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940.
- Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49.
- Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, to appear.
- Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear.
- Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824.
- Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614.
- Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society, Series C (Applied Statistics), 54(3):507–554.
- Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series.
- Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th International Conference on Machine Learning, pages 848–855.
- Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695–1742.
- Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and De Bastiani, F. (2020). Flexible Regression and Smoothing: Using GAMLSS in R. Chapman & Hall.
- Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525.
- Taylor, J. and Tibshirani, R. (2018). Post-selection inference for $\ell_1$-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61.
- Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710.
- van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical Science, 27(4):469–480.
- White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge.
- Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67.
- Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255.
- Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. 
(2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). 
Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. 
arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). 
Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). 
Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. 
Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. 
(31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. 
Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. 
and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. 
Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). 
Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. 
(2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. 
Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). 
Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. 
Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367.
- Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv preprint arXiv:2301.06162.
- Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society, Series B, 70(1):53–71.
- Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv preprint arXiv:1904.08018.
- Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341.
- Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940.
- Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49.
- Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, to appear.
- Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear, 1–11.
- Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824.
- Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614.
- Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society, Series C (Applied Statistics), 54(3):507–554.
- Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series.
- Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th International Conference on Machine Learning, pages 848–855.
- Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695–1742.
- Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and De Bastiani, F. (2020). Flexible Regression and Smoothing: Using GAMLSS in R. Chapman & Hall.
- Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525.
- Taylor, J. and Tibshirani, R. (2018). Post-selection inference for $\ell_1$-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61.
- Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710.
- van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical Science, 27(4):469–480.
- White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge.
- Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67.
- Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255.
- Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. 
and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. 
Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. 
Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. 
and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. 
Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367.
- Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv preprint arXiv:2301.06162.
- Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society, Series B, 70(1):53–71.
- Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv preprint arXiv:1904.08018.
- Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341.
- Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940.
- Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49.
- Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, to appear.
- Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear.
- Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824.
- Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614.
- Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554.
- Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series.
- Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th International Conference on Machine Learning, pages 848–855.
- Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695–1742.
- Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and De Bastiani, F. (2020). Flexible Regression and Smoothing: Using GAMLSS in R. Chapman & Hall.
- Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525.
- Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61.
- Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710.
- van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical Science, 27(4):469–480.
- White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge.
- Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67.
- Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255.
- Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). 
Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. 
arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. 
Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. 
Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. 
Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. 
Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- On the length of post-model-selection confidence intervals conditional on polyhedral constraints. Journal of the American Statistical Association, 116(534):845–857. La Cour and Schieve, (2015) La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. 
Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. La Cour, B. R. and Schieve, W. C. (2015). A general conditional large deviation principle. Journal of Statistical Physics, 161(1):123–130. Lee et al., (2016) Lee, J., Sun, D., Sun, Y., and Taylor, J. (2016). Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. 
(2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). 
Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. 
and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. 
The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Exact post-selection inference, with application to the lasso. The Annals of Statistics, 44(3):907–927. Lee, (2003) Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. 
Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Lee, J. M. (2003). Introduction to Smooth Manifolds. Springer Science & Business Media, 2 edition. Leiner et al., (2024) Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. 
Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Leiner, J., Duan, B., Wasserman, L., and Ramdas, A. (2024). Data fission: splitting a single data point. Journal of the American Statistical Association, page to appear. Li et al., (2022) Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. 
(2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Li, L., Sun, W., Luo, J., and Huang, H. (2022). Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367. 
Liu and Panigrahi, (2023) Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. 
Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). 
Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. 
and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. 
van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. 
Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). 
The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. 
(2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). 
Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. 
(33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. 
Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. 
Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. 
Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. 
Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. 
In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. 
Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. 
Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Associations between education levels and prevalence of depressive symptoms: NHANES (2005–2018). Journal of Affective Disorders, 301:360–367.
- Liu, S. and Panigrahi, S. (2023). Selective inference with distributed data. arXiv preprint arXiv:2301.06162.
- Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society, Series B, 70(1):53–71.
- Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv preprint arXiv:1904.08018.
- Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341.
- Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940.
- Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49.
- Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, to appear.
- Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear.
- Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824.
- Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614.
- Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society, Series C (Applied Statistics), 54(3):507–554.
- Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series.
- Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th International Conference on Machine Learning, pages 848–855.
- Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695–1742.
- Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and De Bastiani, F. (2020). Flexible Regression and Smoothing: Using GAMLSS in R. Chapman & Hall.
- Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525.
- Taylor, J. and Tibshirani, R. (2018). Post-selection inference for $\ell_1$-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61.
- Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710.
- van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical Science, 27(4):469–480.
- White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge.
- Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67.
- Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255.
- Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. 
Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. 
Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. 
D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. 
and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. 
Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. 
Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. 
and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. 
Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Selective inference with distributed data. arXiv:2301.06162. Meier et al., (2008) Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. 
Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Meier, L., van de Geer, S., and Bühlmann, P. (2008). The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. 
Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. 
- The group lasso for logistic regression. Journal of the Royal Statistical Society Series B: Statistical Methodology, 70(1):53–71. Min and Zhou, (2021) Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. 
Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Min, S. and Zhou, Q. (2021). Constructing confidence sets after lasso selection by randomized estimator augmentation. arXiv.1904.08018. Panigrahi, (2023) Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. 
Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341. (31) Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. 
van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Panigrahi et al., (2023) Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. 
The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., MacDonald, P. W., and Kessler, D. (2023). Approximate post-selective inference for regression with the group lasso. Journal of Machine Learning Research, 24(79):1–49. (33) Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). 
Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Mohammed, S., Rao, A., and Baladandayuthapani, V. (2022b). Integrative Bayesian models using post-selective inference: A case study in radiogenomics. Biometrics, page to appear. Panigrahi and Taylor, (2022) Panigrahi, S. and Taylor, J. (2022). Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). 
Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Panigrahi, S. (2023). Carving model-free inference. The Annals of Statistics, 51(6):2318–2341.
- Panigrahi, S., Fry, K., and Taylor, J. (2022a). Exact selective inference with randomization. arXiv preprint arXiv:2212.12940. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019).
Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. 
White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. 
Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
(2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. 
Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. 
Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. 
White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Approximate selective inference via maximum likelihood. Journal of the American Statistical Association, to appear:1–11. Panigrahi et al., (2021) Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Panigrahi, S., Taylor, J., and Weinstein, A. (2021). Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. 
A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). 
A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. 
Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. 
Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Integrative methods for post-selection inference under convex constraints. The Annals of Statistics, 49(5):2803–2824. Rasines and Young, (2023) Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rasines, D. G. and Young, G. A. (2023). Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. 
In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. 
- Splitting strategies for post-selection inference. Biometrika, 110:597–614. Rigby and Stasinopoulos, (2005) Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A. and Stasinopoulos, D. M. (2005). Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). 
Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. 
arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. 
Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). 
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). 
A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Generalized additive models for location, scale and shape. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(3):507–554. Rigby et al., (2019) Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Rigby, R. A., Stasinopoulos, M. D., Heller, G. Z., and De Bastiani, F. (2019). Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). 
Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. 
Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. 
and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. 
Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Distributions for Modeling Location, Scale, and Shape: Using GAMLSS in R. Chapman & Hall/CRC The R Series. Roth and Fischer, (2008) Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th international conference on Machine learning, pages 848–855. Schultheiss et al., (2021) Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. 
White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695 – 1742. Stasinopoulos et al., (2020) Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing. Using GAMLSS in R. Chapman & Hall. Sur and Candès, (2019) Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. 
Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525. Taylor and Tibshirani, (2018) Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1subscriptℓ1\ell_{1}roman_ℓ start_POSTSUBSCRIPT 1 end_POSTSUBSCRIPT-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61. Tian and Taylor, (2018) Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710. van de Geer and Mueller, (2012) van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). 
Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical science, 27(4):469–480. White, (1994) White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge. Yuan and Lin, (2006) Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67. Zhang et al., (2023) Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255. Zhou and Claeskens, (2024) Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428. Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.
- Roth, V. and Fischer, B. (2008). The group-lasso for generalized linear models: uniqueness of solutions and efficient algorithms. In Proceedings of the 25th International Conference on Machine Learning, pages 848–855.
- Schultheiss, C., Renaux, C., and Bühlmann, P. (2021). Multicarving for high-dimensional post-selection inference. Electronic Journal of Statistics, 15(1):1695–1742.
- Stasinopoulos, M. D., Rigby, R. A., Heller, G. Z., Voudouris, V., and Bastiani, F. D. (2020). Flexible Regression and Smoothing: Using GAMLSS in R. Chapman & Hall.
- Sur, P. and Candès, E. J. (2019). A modern maximum-likelihood theory for high-dimensional logistic regression. Proceedings of the National Academy of Sciences, 116(29):14516–14525.
- Taylor, J. and Tibshirani, R. (2018). Post-selection inference for ℓ1-penalized likelihood models. Canadian Journal of Statistics, 46(1):41–61.
- Tian, X. and Taylor, J. (2018). Selective inference with a randomized response. The Annals of Statistics, 46(2):679–710.
- van de Geer, S. and Mueller, P. (2012). Quasi-likelihood and/or robust estimation in high dimensions. Statistical Science, 27(4):469–480.
- White, H. (1994). Estimation, Inference and Specification Analysis. Cambridge University Press, Cambridge.
- Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B, 68:49–67.
- Zhang, Z., Lee, S., and Dobriban, E. (2023). A framework for statistical inference via randomized algorithms. arXiv preprint arXiv:2307.11255.
- Zhou, J. and Claeskens, G. (2024). A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression. Electronic Journal of Statistics, 18:395–428.