Academic literature on the topic 'High-Dimensional Regression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic 'High-Dimensional Regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "High-Dimensional Regression"

1

Zheng, Qi, Limin Peng, and Xuming He. "High dimensional censored quantile regression." Annals of Statistics 46, no. 1 (February 2018): 308–43. http://dx.doi.org/10.1214/17-aos1551.

2

Izbicki, Rafael, and Ann B. Lee. "Converting high-dimensional regression to high-dimensional conditional density estimation." Electronic Journal of Statistics 11, no. 2 (2017): 2800–2831. http://dx.doi.org/10.1214/17-ejs1302.

3

Lan, Wei, Hansheng Wang, and Chih-Ling Tsai. "Testing covariates in high-dimensional regression." Annals of the Institute of Statistical Mathematics 66, no. 2 (June 18, 2013): 279–301. http://dx.doi.org/10.1007/s10463-013-0414-0.

4

Meinshausen, Nicolai, Lukas Meier, and Peter Bühlmann. "p-Values for High-Dimensional Regression." Journal of the American Statistical Association 104, no. 488 (December 2009): 1671–81. http://dx.doi.org/10.1198/jasa.2009.tm08647.

5

Li, Ker-Chau. "Nonlinear confounding in high-dimensional regression." Annals of Statistics 25, no. 2 (April 1997): 577–612. http://dx.doi.org/10.1214/aos/1031833665.

6

Lin, Wei, and Jinchi Lv. "High-Dimensional Sparse Additive Hazards Regression." Journal of the American Statistical Association 108, no. 501 (March 2013): 247–64. http://dx.doi.org/10.1080/01621459.2012.746068.

7

Giraud, Christophe, Sylvie Huet, and Nicolas Verzelen. "High-Dimensional Regression with Unknown Variance." Statistical Science 27, no. 4 (November 2012): 500–518. http://dx.doi.org/10.1214/12-sts398.

8

Sun, Qiang, Hongtu Zhu, Yufeng Liu, and Joseph G. Ibrahim. "SPReM: Sparse Projection Regression Model For High-Dimensional Linear Regression." Journal of the American Statistical Association 110, no. 509 (January 2, 2015): 289–302. http://dx.doi.org/10.1080/01621459.2014.892008.

9

Wang, Siyang, and Hengjian Cui. "Generalized F test for high dimensional linear regression coefficients." Journal of Multivariate Analysis 117 (May 2013): 134–49. http://dx.doi.org/10.1016/j.jmva.2013.02.010.

10

Shen, Xiaotong, Wei Pan, Yunzhang Zhu, and Hui Zhou. "On constrained and regularized high-dimensional regression." Annals of the Institute of Statistical Mathematics 65, no. 5 (January 12, 2013): 807–32. http://dx.doi.org/10.1007/s10463-012-0396-3.


Dissertations / Theses on the topic "High-Dimensional Regression"

1

Fang, Zhou. "Reweighting methods in high dimensional regression." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:26f8541a-9e2d-466a-84aa-e6850c4baba9.

Abstract:
In this thesis, we focus on the application of covariate reweighting with Lasso-style methods for regression in high dimensions, particularly where p ≥ n. We apply a particular focus to the case of sparse regression under a priori grouping structures. In such problems, even in the linear case, accurate estimation is difficult. Various authors have suggested ideas such as the Group Lasso and the Sparse Group Lasso, based on convex penalties, or alternatively methods like the Group Bridge, which rely on convergence under repetition to some local minimum of a concave penalised likelihood. We propose in this thesis a methodology that uses concave penalties to inspire a procedure whereby we compute weights from an initial estimate and then perform a single second reweighted Lasso. This procedure, the Co-adaptive Lasso, obtains excellent results in empirical experiments, and we present some theoretical prediction and estimation error bounds. Further, several extensions and variants of the procedure are discussed and studied. In particular, we propose a Lasso-style method of doing additive isotonic regression in high dimensions, the Liso algorithm, and enhance it using the Co-adaptive methodology. We also propose a method of producing rule-based regression estimates for high-dimensional non-parametric regression that often outperforms the current leading method, the RuleFit algorithm. We also discuss extensions involving robust statistics applied to weight computation, repeating the algorithm, and online computation.
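
The two-stage procedure this abstract describes (an initial fit, weights computed from it, then a single reweighted Lasso) can be made concrete with a short sketch. This is a generic reweighted Lasso in the adaptive-Lasso style, not Fang's exact Co-adaptive Lasso; the inverse-absolute-coefficient weight rule and all tuning values below are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 200                         # p >= n, as in the thesis setting
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0                         # sparse ground truth
y = X @ beta + rng.standard_normal(n)

# Stage 1: initial Lasso estimate.
init = Lasso(alpha=0.1).fit(X, y)

# Stage 2: weights from the initial estimate (assumed rule: inverse absolute
# coefficients, as in the adaptive Lasso), then one reweighted Lasso fitted
# via the standard column-rescaling trick.
w = 1.0 / (np.abs(init.coef_) + 1e-3)  # small offset keeps weights finite
refit = Lasso(alpha=0.1).fit(X / w, y)
beta_hat = refit.coef_ / w             # undo the rescaling
print("selected predictors:", np.nonzero(beta_hat)[0])
```
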
2

Meier, Lukas Dieter. "High-dimensional regression problems with special structure." Zürich: ETH, 2008. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=18129.

3

Hashem, Hussein Abdulahman. "Regularized and robust regression methods for high dimensional data." Thesis, Brunel University, 2014. http://bura.brunel.ac.uk/handle/2438/9197.

Abstract:
Recently, variable selection in high-dimensional data has attracted much research interest. Classical stepwise subset selection methods are widely used in practice, but when the number of predictors is large these methods are difficult to implement. In these cases, modern regularization methods have become a popular choice, as they perform variable selection and parameter estimation simultaneously. However, the estimation procedure becomes more difficult and challenging when the data suffer from outliers or when the assumption of normality is violated, such as in the case of heavy-tailed errors. In these cases, quantile regression is the most appropriate method to use. In this thesis we combine these two classical approaches to produce regularized quantile regression methods. Chapter 2 presents a comparative simulation study of regularized and robust regression methods when the response variable is continuous. In chapter 3, we develop a quantile regression model with a group lasso penalty for binary response data when the predictors have a grouped structure and the data suffer from outliers. In chapter 4, we extend this method to the case of censored response variables. Numerical examples on simulated and real data are used to evaluate the performance of the proposed methods in comparison with other existing methods.
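
For a concrete starting point, the combination of quantile loss and an L1 penalty that this abstract builds on is available off the shelf. Below is a minimal sketch using scikit-learn's QuantileRegressor, which penalizes the coefficients with an L1 term; the group-lasso, binary-response, and censored extensions developed in the thesis are not shown, and the tuning values are placeholders.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(1)
n, p = 100, 300
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:4] = 2.0
# Heavy-tailed errors: the setting where quantile loss beats least squares.
y = X @ beta + rng.standard_t(df=2, size=n)

# Median regression with an L1 penalty (alpha) on the coefficients.
model = QuantileRegressor(quantile=0.5, alpha=0.05, solver="highs").fit(X, y)
print("nonzero coefficients:", np.count_nonzero(model.coef_))
```
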
4

Aldahmani, Saeed. "High-dimensional linear regression problems via graphical models." Thesis, University of Essex, 2017. http://repository.essex.ac.uk/19207/.

Abstract:
This thesis introduces a new method for solving the linear regression problem where the number of observations n is smaller than the number of variables (predictors) v. In contrast to existing methods such as ridge regression, Lasso and LARS, the proposed method uses the idea of graphical models and provides unbiased parameter estimates under certain conditions. In addition, the new method provides a detailed graphical conditional correlation structure for the predictors, whereby the real causal relationship between predictors can be identified. Furthermore, the proposed method is extended to form a hybridisation with the idea of ridge regression to improve efficiency in terms of computation and model selection. In the extended method, less important variables are regularised by a ridge-type penalty, and a search over the model space is made for the important covariates. This significantly reduces computational cost while giving unbiased estimates for the important variables as well as increasing the efficiency of model selection. Moreover, the extended method is used in dealing with the issue of portfolio selection within the Markowitz mean-variance framework, with n < v. Various simulations and real data analyses were conducted to compare the two novel methods with the aforementioned existing methods. Our experiments indicate that the new methods outperform all the other methods when n < v.
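
The graphical ingredient here, a conditional (partial) correlation structure over the predictors, can be illustrated with a sparse precision-matrix estimate. The following minimal sketch uses the graphical lasso as a stand-in; the thesis's own estimator and its unbiasedness conditions are not reproduced.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
n, v = 80, 30
X = rng.standard_normal((n, v))
X[:, 1] = 0.7 * X[:, 0] + 0.3 * rng.standard_normal(n)  # link predictors 0 and 1

# Sparse precision (inverse covariance) matrix of the predictors.
P = GraphicalLasso(alpha=0.1).fit(X).precision_

# Partial correlation between i and j given all remaining predictors.
d = np.sqrt(np.diag(P))
partial_corr = -P / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)
print("partial corr(X0, X1):", round(partial_corr[0, 1], 3))
```
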
5

Wang, Tao. "Variable selection and dimension reduction in high-dimensional regression." HKBU Institutional Repository, 2013. http://repository.hkbu.edu.hk/etd_ra/1544.

6

Lee, Wai Hong. "Variable selection for high dimensional transformation model." HKBU Institutional Repository, 2010. http://repository.hkbu.edu.hk/etd_ra/1161.

7

Chen, Xiaohui. "Lasso-type sparse regression and high-dimensional Gaussian graphical models." Thesis, University of British Columbia, 2012. http://hdl.handle.net/2429/42271.

Abstract:
High-dimensional datasets, where the number of measured variables is larger than the sample size, are not uncommon in modern real-world applications such as functional Magnetic Resonance Imaging (fMRI) data. Conventional statistical signal processing tools and mathematical models can fail to handle such datasets. Therefore, developing statistically valid models and computationally efficient algorithms for high-dimensional situations is of great importance in tackling practical and scientific problems. This thesis mainly focuses on the following two issues: (1) recovery of sparse regression coefficients in linear systems; (2) estimation of the high-dimensional covariance matrix and its inverse matrix, both subject to additional random noise. In the first part, we focus on Lasso-type sparse linear regression. We propose two improved versions of the Lasso estimator when the signal-to-noise ratio is low: (i) leveraging adaptive robust loss functions; (ii) adopting a fully Bayesian modeling framework. In solution (i), we propose a robust Lasso with a convex combined loss function and study its asymptotic behavior. We further extend the asymptotic analysis to the Huberized Lasso, which is shown to be consistent even if the noise distribution is Cauchy. In solution (ii), we propose a fully Bayesian Lasso by unifying a discrete prior on model size and a continuous prior on regression coefficients in a single modeling framework. Since the proposed Bayesian Lasso has variable model sizes, we propose a reversible-jump MCMC algorithm to obtain its numerical estimates. In the second part, we focus on the estimation of large covariance and precision matrices. In high-dimensional situations, the sample covariance is an inconsistent estimator. To address this concern, regularized estimation is needed. For covariance matrix estimation, we propose a shrinkage-to-tapering estimator and show that it has attractive theoretical properties for estimating general, large covariance matrices. For precision matrix estimation, we propose a computationally efficient algorithm based on the thresholding operator and a Neumann series expansion. We prove that the proposed estimator is consistent in several senses under the spectral norm. Moreover, we show that the proposed estimator is minimax in a class of precision matrices that are approximately inversely closed.
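
The Huberized Lasso mentioned in this abstract (Huber loss for heavy-tailed noise plus an L1 penalty for sparsity) can be sketched with a few lines of proximal gradient descent. This is a minimal sketch, not the thesis's implementation; the step-size rule, tuning constants, and iteration count are assumptions.

```python
import numpy as np

def huberized_lasso(X, y, lam=0.1, delta=1.0, n_iter=500):
    """ISTA for (1/n) * sum huber_delta(y - Xb) + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2         # 1/L; Huber curvature <= 1
    for _ in range(n_iter):
        r = y - X @ b
        psi = np.clip(r, -delta, delta)          # derivative of the Huber loss
        b = b + step * (X.T @ psi) / n           # gradient step on the smooth part
        b = np.sign(b) * np.maximum(np.abs(b) - step * lam, 0.0)  # soft-threshold
    return b

rng = np.random.default_rng(3)
n, p = 60, 150
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = 4.0
y = X @ beta + rng.standard_cauchy(n)            # Cauchy noise, as in the abstract
print("first selected predictors:", np.nonzero(huberized_lasso(X, y))[0][:10])
```
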
8

Chen, Chi. "Variable selection in high dimensional semi-varying coefficient models." HKBU Institutional Repository, 2013. https://repository.hkbu.edu.hk/etd_oa/11.

Abstract:
With the development of computing and sampling technologies, high dimensionality has become an important characteristic of commonly used scientific data, such as data from bioinformatics, information engineering, and the social sciences. The varying coefficient model is a flexible and powerful statistical model for exploring dynamic patterns in many scientific areas. It is a natural extension of classical parametric models with good interpretability, and is becoming increasingly popular in data analysis. The main objective of this thesis is to apply the varying coefficient model to analyze high dimensional data, and to investigate the properties of regularization methods for high-dimensional varying coefficient models. We first discuss how to apply local polynomial smoothing and the smoothly clipped absolute deviation (SCAD) penalized methods to estimate varying coefficient models when the dimension of the model diverges with the sample size. Based on the nonconcave penalized method and local polynomial smoothing, we suggest a regularization method to select significant variables from the model and estimate the corresponding coefficient functions simultaneously. Importantly, our proposed method can also identify constant coefficients at the same time. We investigate the asymptotic properties of our proposed method and show that it has the so-called “oracle property.” We apply the Nonparametric Independence Screening (NIS) method to varying coefficient models with ultra-high-dimensional data. Based on the marginal varying coefficient model estimation, we establish the sure independent screening property under some regularity conditions for our proposed sure screening method. Combined with our proposed regularization method, we can systematically deal with high-dimensional or ultra-high-dimensional data using varying coefficient models. The nonconcave penalized method is a very effective variable selection method. However, maximizing such a penalized likelihood function is computationally challenging, because the objective functions are nondifferentiable and nonconcave. The local linear approximation (LLA) and local quadratic approximation (LQA) are two popular algorithms for dealing with such optimization problems. In this thesis, we revisit these two algorithms. We investigate the convergence rate of LLA and show that the rate is linear. We also study the statistical properties of the one-step estimate based on LLA under a generalized statistical model with a diverging number of dimensions. We suggest a modified version of LQA to overcome its drawback under high dimensional models. Our proposed method avoids calculating the inverse of the Hessian matrix in the modified Newton-Raphson algorithm based on LQA. Our proposed methods are investigated by numerical studies and in a real case study in Chapter 5.
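
The one-step LLA idea for SCAD that this abstract discusses (linearize the concave penalty at a pilot estimate, then solve a single weighted Lasso) is easy to sketch in the ordinary linear model. This is a generic illustration only, not the thesis's algorithm for varying coefficient models; a = 3.7 is the conventional SCAD constant, and the other tuning values are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

def scad_derivative(t, lam, a=3.7):
    """SCAD penalty derivative: lam near zero, decaying to 0 for |t| > a*lam."""
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

rng = np.random.default_rng(4)
n, p = 80, 120
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:4] = 3.0
y = X @ beta + rng.standard_normal(n)

lam = 0.2
pilot = Lasso(alpha=lam).fit(X, y)            # pilot estimate
w = scad_derivative(pilot.coef_, lam) / lam   # LLA weights in [0, 1]
w = np.maximum(w, 1e-4)                       # keep the rescaling finite
fit = Lasso(alpha=lam).fit(X / w, y)          # weighted Lasso via column rescaling
beta_hat = fit.coef_ / w                      # large pilot coefs are near-unpenalized
print("selected predictors:", np.nonzero(beta_hat)[0])
```
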
9

Breheny, Patrick John. "Regularized methods for high-dimensional and bi-level variable selection." Iowa City: University of Iowa, 2009. http://ir.uiowa.edu/etd/325.

10

Villegas, Santamaría Mauricio. "Contributions to High-Dimensional Pattern Recognition." Doctoral thesis, Universitat Politècnica de València, 2011. http://hdl.handle.net/10251/10939.

Abstract:
This thesis gathers some contributions to statistical pattern recognition, particularly targeted at problems in which the feature vectors are high-dimensional. Three pattern recognition scenarios are addressed, namely pattern classification, regression analysis, and score fusion. For each of these, an algorithm for learning a statistical model is presented. In order to address the difficulty encountered when the feature vectors are high-dimensional, adequate models and objective functions are defined. The strategy of simultaneously learning a dimensionality reduction function and the pattern recognition model parameters is shown to be quite effective, making it possible to learn the model without discarding any discriminative information. Another topic addressed in the thesis is the use of tangent vectors as a way to take better advantage of the available training data. Using this idea, two popular discriminative dimensionality reduction techniques are shown to be effectively improved. For each of the algorithms proposed throughout the thesis, several data sets are used to illustrate the properties and performance of the approaches. The empirical results show that the proposed techniques perform well, and furthermore the models learned tend to be computationally very efficient.
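
The central strategy in this abstract, learning the dimensionality reduction together with the predictive model rather than as a separate preprocessing step, has a classical off-the-shelf relative in partial least squares, whose projection directions are chosen using the response. The sketch below is shown only as an illustrative stand-in for the thesis's algorithms.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n, p = 100, 500                      # high-dimensional feature vectors
X = rng.standard_normal((n, p))
y = X[:, :10] @ rng.standard_normal(10) + 0.1 * rng.standard_normal(n)

# Projection directions and the regression fit are learned together using y,
# unlike an unsupervised PCA preprocessing step followed by regression.
pls = PLSRegression(n_components=5).fit(X, y)
print("training R^2:", round(pls.score(X, y), 3))
```
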

Books on the topic "High-Dimensional Regression"

1

Chernozhukov, Victor. L1-Penalized Quantile Regression in High Dimensional Sparse Models. Cambridge, MA: Massachusetts Institute of Technology, Dept. of Economics, 2009.

2

Belloni, Alexandre. Post-ℓ1-penalized estimators in high-dimensional linear regression models. Cambridge, MA: Massachusetts Institute of Technology, Dept. of Economics, 2010.

3

Papay, John P. High-school exit examinations and the schooling decisions of teenagers: A multi-dimensional regression-discontinuity analysis. Cambridge, MA: National Bureau of Economic Research, 2011.

4

Ahmed, S. Ejaz, ed. Perspectives on big data analysis: Methodologies and applications: International Workshop on Perspectives on High-Dimensional Data Analysis II, May 30–June 1, 2012, Centre de Recherches Mathématiques, Université de Montréal, Montréal, Québec, Canada. Providence, Rhode Island: American Mathematical Society, 2014.

5

Li, Longhai. Bayesian classification and regression with high dimensional features. 2007.

6

High Dimensional Econometrics and Identification. Singapore: World Scientific Publishing Co. Pte. Ltd., 2019.

7

Yanagihara, Hirokazu. Consistency of an Information Criterion for High-Dimensional Multivariate Regression. Springer, 2020.

8

Large Sample Covariance Matrices and High-Dimensional Data Analysis. Cambridge University Press, 2015.


Book chapters on the topic "High-Dimensional Regression"

1

Giraud, Christophe. "Multivariate Regression." In Introduction to High-Dimensional Statistics, 159–78. 2nd ed. Boca Raton: Chapman and Hall/CRC, 2021. http://dx.doi.org/10.1201/9781003158745-8.

2

Kooperberg, Charles, and Michael LeBlanc. "Multivariate Nonparametric Regression." In High-Dimensional Data Analysis in Cancer Research, 1–24. New York, NY: Springer New York, 2008. http://dx.doi.org/10.1007/978-0-387-69765-9_3.

3

Barber, Rina Foygel, Mathias Drton, and Kean Ming Tan. "Laplace Approximation in High-Dimensional Bayesian Regression." In Statistical Analysis for High-Dimensional Data, 15–36. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-27099-9_2.

4

van de Geer, Sara, and Benjamin Stucky. "χ²-Confidence Sets in High-Dimensional Regression." In Statistical Analysis for High-Dimensional Data, 279–306. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-27099-9_13.

5

Abramovich, Felix, and Vadim Grinshtein. "Model Selection in Gaussian Regression for High-Dimensional Data." In Inverse Problems and High-Dimensional Estimation, 159–70. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-19989-9_4.

6

Harezlak, Jaroslaw, Eric Tchetgen, and Xiaochun Li. "Variable selection in regression - estimation, prediction, sparsity, inference." In High-Dimensional Data Analysis in Cancer Research, 1–21. New York, NY: Springer New York, 2008. http://dx.doi.org/10.1007/978-0-387-69765-9_2.

7

Kalina, Jan, and Petra Vidnerová. "On Robust Training of Regression Neural Networks." In Functional and High-Dimensional Statistics and Related Fields, 145–52. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-47756-1_20.

8

McConaghy, Trent. "Latent Variable Symbolic Regression for High-Dimensional Inputs." In Genetic Programming Theory and Practice VII, 103–18. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-1-4419-1626-6_7.

9

Reangsephet, Orawan, Supranee Lisawadi, and Syed Ejaz Ahmed. "Weak Signals in High-Dimensional Logistic Regression Models." In Advances in Intelligent Systems and Computing, 121–33. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-21248-3_9.

10

Becker, C., and R. Fried. "Sliced Inverse Regression for High-dimensional Time Series." In Studies in Classification, Data Analysis, and Knowledge Organization, 3–11. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-642-55721-7_1.


Conference papers on the topic "High-Dimensional Regression"

1

Kuleshov, Alexander, and Alexander Bernstein. "Regression on High-Dimensional Inputs." In 2016 IEEE 16th International Conference on Data Mining Workshops (ICDMW). IEEE, 2016. http://dx.doi.org/10.1109/icdmw.2016.0108.

2

Yoo, Youngjoon, Sangdoo Yun, Hyung Jin Chang, Yiannis Demiris, and Jin Young Choi. "Variational Autoencoded Regression: High Dimensional Regression of Visual Data on Complex Manifold." In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2017. http://dx.doi.org/10.1109/cvpr.2017.314.

3

Obozinski, Guillaume, Martin J. Wainwright, and Michael I. Jordan. "Union support recovery in high-dimensional multivariate regression." In 2008 46th Annual Allerton Conference on Communication, Control, and Computing. IEEE, 2008. http://dx.doi.org/10.1109/allerton.2008.4797530.

4

Nurunnabi, A. A. M., and Mohammed Nasser. "Regression diagnostics in large and high dimensional data." In 2008 11th International Conference on Computer and Information Technology (ICCIT). IEEE, 2008. http://dx.doi.org/10.1109/iccitechn.2008.4802969.

5

Drouard, Vincent, Sileye Ba, Georgios Evangelidis, Antoine Deleforge, and Radu Horaud. "Head pose estimation via probabilistic high-dimensional regression." In 2015 IEEE International Conference on Image Processing (ICIP). IEEE, 2015. http://dx.doi.org/10.1109/icip.2015.7351683.

6

Li, Yan, Kevin S. Xu, and Chandan K. Reddy. "Regularized Parametric Regression for High-dimensional Survival Analysis." In Proceedings of the 2016 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2016. http://dx.doi.org/10.1137/1.9781611974348.86.

7

Yuzbasi, Bahadir, S. Ejaz Ahmed, and Yasin Asar. "L1 Correlation-Based Penalty in High-Dimensional Quantile Regression." In 2018 4th International Conference on Big Data and Information Analytics (BigDIA). IEEE, 2018. http://dx.doi.org/10.1109/bigdia.2018.8632795.

8

Raytchev, B., Y. Katamoto, M. Koujiba, T. Tamaki, and K. Kaneda. "Ensemble-based local learning for high-dimensional data regression." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7900033.

9

Tung, Nguyen Thanh, Joshua Zhexue Huang, Thuy Thi Nguyen, and Imran Khan. "Bias-corrected Quantile Regression Forests for high-dimensional data." In 2014 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2014. http://dx.doi.org/10.1109/icmlc.2014.7009082.

10

Salemi, Peter, Barry L. Nelson, and Jeremy Staum. "Moving Least Squares regression for high dimensional simulation metamodeling." In 2012 Winter Simulation Conference - (WSC 2012). IEEE, 2012. http://dx.doi.org/10.1109/wsc.2012.6465122.


Reports on the topic "High-Dimensional Regression"

1

Obozinski, Guillaume, Martin J. Wainwright, and Michael I. Jordan. Union Support Recovery in High-Dimensional Multivariate Regression. Fort Belvoir, VA: Defense Technical Information Center, August 2008. http://dx.doi.org/10.21236/ada487461.

2

Chernozhukov, Victor, and Alexandre Belloni. L1-Penalized quantile regression in high-dimensional sparse models. Institute for Fiscal Studies, May 2009. http://dx.doi.org/10.1920/wp.cem.2009.1009.

3

Chernozhukov, Victor, and Alexandre Belloni. Post-l1-penalized estimators in high-dimensional linear regression models. Institute for Fiscal Studies, June 2010. http://dx.doi.org/10.1920/wp.cem.2010.1310.

4

Belloni, Alexandre, Victor Chernozhukov, and Kengo Kato. Robust inference in high-dimensional approximately sparse quantile regression models. IFS, December 2013. http://dx.doi.org/10.1920/wp.cem.2013.7013.

5

Shin, Youngki, Sokbae (Simon) Lee, and Myung Hwan Seo. The lasso for high-dimensional regression with a possible change-point. Institute for Fiscal Studies, May 2014. http://dx.doi.org/10.1920/wp.cem.2014.2614.

6

Chernozhukov, Victor, Kengo Kato, and Alexandre Belloni. Valid post-selection inference in high-dimensional approximately sparse quantile regression models. IFS, December 2014. http://dx.doi.org/10.1920/wp.cem.2014.5314.

7

Papay, John, John Willett, and Richard Murnane. High-School Exit Examinations and the Schooling Decisions of Teenagers: A Multi-Dimensional Regression-Discontinuity Analysis. Cambridge, MA: National Bureau of Economic Research, June 2011. http://dx.doi.org/10.3386/w17112.
