Academic literature on the topic 'And LASSO regression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'And LASSO regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "And LASSO regression"

1

Ranstam, J., and J. A. Cook. "LASSO regression." British Journal of Surgery 105, no. 10 (2018): 1348. http://dx.doi.org/10.1002/bjs.10895.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hans, C. "Bayesian lasso regression." Biometrika 96, no. 4 (2009): 835–45. http://dx.doi.org/10.1093/biomet/asp047.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ahrens, Achim, Christian B. Hansen, and Mark E. Schaffer. "lassopack: Model selection and prediction with regularized regression in Stata." Stata Journal: Promoting communications on statistics and Stata 20, no. 1 (2020): 176–235. http://dx.doi.org/10.1177/1536867x20909697.

Full text
Abstract:
In this article, we introduce lassopack, a suite of programs for regularized regression in Stata. lassopack implements lasso, square-root lasso, elastic net, ridge regression, adaptive lasso, and postestimation ordinary least squares. The methods are suitable for the high-dimensional setting, where the number of predictors p may be large and possibly greater than the number of observations, n. We offer three approaches for selecting the penalization (“tuning”) parameters: information criteria (implemented in lasso2), K-fold cross-validation and h-step-ahead rolling cross-validation for cross-s
APA, Harvard, Vancouver, ISO, and other styles
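The penalty ("tuning") parameter selection that the lassopack abstract describes can also be illustrated outside Stata. The following minimal Python sketch (scikit-learn, used here purely for illustration; it is not the lassopack package itself) chooses the lasso and elastic-net penalties by K-fold cross-validation on synthetic high-dimensional data with p greater than n:

import numpy as np
from sklearn.linear_model import LassoCV, ElasticNetCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic high-dimensional data: more predictors (p) than observations (n).
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 2.0                      # only the first five predictors are active
y = X @ beta + rng.normal(size=n)

# Lasso with the penalty chosen by 10-fold cross-validation.
lasso = make_pipeline(StandardScaler(), LassoCV(cv=10, random_state=0)).fit(X, y)
print("selected penalty:", lasso[-1].alpha_)
print("nonzero coefficients:", int(np.sum(lasso[-1].coef_ != 0)))

# Elastic net: both the penalty and the L1/L2 mixing weight are cross-validated.
enet = make_pipeline(
    StandardScaler(),
    ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9, 1.0], cv=10, random_state=0),
).fit(X, y)
print("elastic-net mixing weight:", enet[-1].l1_ratio_)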
4

Kadhim Abbas, Haider. "Bayesian Lasso Tobit regression." Journal of Al-Qadisiyah for Computer Science and Mathematics 11, no. 2 (2019): 1–13. http://dx.doi.org/10.29304/jqcm.2019.11.2.553.

Full text
Abstract:
In the present research, we have proposed a new approach for model selection in Tobit regression. The new technique uses Bayesian Lasso in Tobit regression (BLTR). It has many features that give optimum estimation and variable selection property. Specifically, we introduced a new hierarchical model. Then, a new Gibbs sampler is introduced. We also extend the new approach by adding the ridge parameter inside the variance covariance matrix to avoid the singularity in the case of multicollinearity or in case the number of predictors is greater than the number of observations. A comparison was made with
APA, Harvard, Vancouver, ISO, and other styles
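For orientation, the Bayesian lasso used here and in the Hans (2009) article above can be summarized, in a common parameterization, as placing independent Laplace (double-exponential) priors on the regression coefficients:

    y \mid \beta, \sigma^2 \sim N(X\beta, \sigma^2 I_n),
    \qquad
    \pi(\beta_j \mid \lambda, \sigma) = \frac{\lambda}{2\sigma}\exp\!\left(-\frac{\lambda\,\lvert\beta_j\rvert}{\sigma}\right),
    \quad j = 1, \dots, p .

Maximizing the resulting posterior over \beta recovers the classical penalized lasso criterion, so the lasso estimate is the posterior mode; the hierarchical model and Gibbs sampler mentioned in the abstract exploit the representation of the Laplace prior as a scale mixture of normals.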
5

Xu, Huan, Constantine Caramanis, and Shie Mannor. "Robust Regression and Lasso." IEEE Transactions on Information Theory 56, no. 7 (2010): 3561–74. http://dx.doi.org/10.1109/tit.2010.2048503.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Neykov, Matey. "Isotonic regression meets LASSO." Electronic Journal of Statistics 13, no. 1 (2019): 710–46. http://dx.doi.org/10.1214/19-ejs1537.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Hafsa, Fathima, Juveria Soha, Fathima Rida, and Hifsa Naaz Syeda. "An Analysis of Car Price Prediction Using Machine Learning." Research and Reviews: Advancement in Cyber Security 2, no. 2 (2025): 33–40. https://doi.org/10.5281/zenodo.15308198.

Full text
Abstract:
Car price prediction is a critical task in the automotive industry, enabling buyers, sellers, and financial institutions to make informed and objective decisions. This research focuses on applying machine learning techniques, specifically Linear Regression and Lasso Regression to predict used car prices based on multiple factors including fuel type, transmission, seller type, vehicle age, and kilometers driven. The dataset was carefully preprocessed to handle missing values and encode categorical variables, ensuring the data was suitable for model training. Both models were evaluated using
APA, Harvard, Vancouver, ISO, and other styles
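As an illustration of the workflow this abstract describes (categorical encoding followed by linear and lasso fits), here is a minimal Python/scikit-learn sketch; the column names and toy values are hypothetical and are not the paper's dataset or code:

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.metrics import r2_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical used-car records; the study's features include fuel type,
# transmission, seller type, vehicle age, and kilometers driven.
df = pd.DataFrame({
    "fuel_type":    ["petrol", "diesel", "petrol", "cng", "diesel", "petrol"],
    "transmission": ["manual", "manual", "automatic", "manual", "automatic", "manual"],
    "vehicle_age":  [5, 3, 8, 2, 6, 4],
    "km_driven":    [40000, 30000, 90000, 15000, 70000, 52000],
    "price":        [5.2, 7.8, 3.1, 6.5, 4.4, 5.9],
})
X, y = df.drop(columns="price"), df["price"]

# One-hot encode categorical columns, standardize numeric ones.
pre = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["fuel_type", "transmission"]),
    ("num", StandardScaler(), ["vehicle_age", "km_driven"]),
])

for name, model in [("linear", LinearRegression()), ("lasso", Lasso(alpha=0.1))]:
    pipe = Pipeline([("pre", pre), ("model", model)]).fit(X, y)
    print(name, "in-sample R^2:", round(r2_score(y, pipe.predict(X)), 3))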
8

Alhamzawi, Rahim, and Keming Yu. "Bayesian Lasso-mixed quantile regression." Journal of Statistical Computation and Simulation 84, no. 4 (2012): 868–80. http://dx.doi.org/10.1080/00949655.2012.731689.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Alhamzawi, Ahmed. "Tobit regression with Lasso penalty." Journal of Physics: Conference Series 1664 (November 2020): 012046. http://dx.doi.org/10.1088/1742-6596/1664/1/012046.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Alhamzawi, Rahim, and Haithem Taha Mohammad Ali. "The Bayesian adaptive lasso regression." Mathematical Biosciences 303 (September 2018): 75–82. http://dx.doi.org/10.1016/j.mbs.2018.06.004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "And LASSO regression"

1

Mak, Carmen. "Polychotomous logistic regression via the Lasso." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0004/NQ41227.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Olaya, Bucaro Orlando. "Predicting risk of cyberbullying victimization using lasso regression." Thesis, Uppsala universitet, Statistiska institutionen, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-338767.

Full text
Abstract:
The increased online presence and use of technology by today’s adolescents has created new places where bullying can occur. The aim of this thesis is to specify a prediction model that can accurately predict the risk of cyberbullying victimization. The data used is from a survey conducted at five secondary schools in Pereira, Colombia. A logistic regression model with random effects is used to predict cyberbullying exposure. Predictors are selected by lasso, tuned by cross-validation. Covariates included in the study includes demographic variables, dietary habit variables, parental mediation v
APA, Harvard, Vancouver, ISO, and other styles
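The approach described in the abstract above (predictors selected by a lasso penalty tuned by cross-validation inside a logistic model) can be sketched, minus the random effects used in the thesis, as a plain L1-penalized logistic regression. A minimal Python sketch on synthetic data, for illustration only:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic binary outcome with many candidate predictors, few of them informative.
X, y = make_classification(n_samples=500, n_features=40, n_informative=5, random_state=0)

# L1 (lasso) penalty; the penalty strength is chosen by 5-fold cross-validation.
clf = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(Cs=20, cv=5, penalty="l1", solver="saga",
                         scoring="roc_auc", max_iter=5000),
).fit(X, y)

coefs = clf[-1].coef_.ravel()
print("predictors kept by the lasso penalty:", int(np.sum(coefs != 0)), "of", X.shape[1])
print("predicted risk for the first observation:", clf.predict_proba(X[:1])[0, 1])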
3

Patnaik, Kaushik. "Adaptive learning in lasso models." Thesis, Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/54353.

Full text
Abstract:
Regression with L1-regularization, Lasso, is a popular algorithm for recovering the sparsity pattern (also known as model selection) in linear models from observations contaminated by noise. We examine a scenario where a fraction of the zero co-variates are highly correlated with non-zero co-variates making sparsity recovery difficult. We propose two methods that adaptively increment the regularization parameter to prune the Lasso solution set. We prove that the algorithms achieve consistent model selection with high probability while using fewer samples than traditional Lasso. The algorithm c
APA, Harvard, Vancouver, ISO, and other styles
4

Caster, Ola. "Mining the WHO Drug Safety Database Using Lasso Logistic Regression." Thesis, Uppsala University, Department of Mathematics, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-120981.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chen, Xiaohui. "Lasso-type sparse regression and high-dimensional Gaussian graphical models." Thesis, University of British Columbia, 2012. http://hdl.handle.net/2429/42271.

Full text
Abstract:
High-dimensional datasets, where the number of measured variables is larger than the sample size, are not uncommon in modern real-world applications such as functional Magnetic Resonance Imaging (fMRI) data. Conventional statistical signal processing tools and mathematical models could fail at handling those datasets. Therefore, developing statistically valid models and computationally efficient algorithms for high-dimensional situations are of great importance in tackling practical and scientific problems. This thesis mainly focuses on the following two issues: (1) recovery of sparse regressi
APA, Harvard, Vancouver, ISO, and other styles
6

He, Shiquan. "A Review of Linear Regression and some Basic Proofs for Lasso." Digital WPI, 2010. https://digitalcommons.wpi.edu/etd-theses/88.

Full text
Abstract:
The goal of this paper is to do some basic proofs for lasso and have a deep understanding of linear regression. In this paper, firstly I give a review of methods in linear regression, and most concerns with the method of lasso. Lasso for 'least absolute shrinkage and selection operator' is a regularized version of method adds a constraint which uses norm less or equal to a given value t. By doing so, some predictor coefficients would be shrank and some others might be set to 0. We can attain good interpretation and prediction accuracy by using lasso method. Secondly, I provide some basic pro
APA, Harvard, Vancouver, ISO, and other styles
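For reference, the constrained definition this abstract alludes to, and its equivalent penalized (Lagrangian) form, are

    \hat{\beta}^{\text{lasso}} = \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2
    \quad \text{subject to} \quad \lVert \beta \rVert_1 \le t,
    \qquad \text{equivalently} \qquad
    \hat{\beta}^{\text{lasso}} = \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1 ,

where shrinking t (equivalently, increasing \lambda) drives more coefficients exactly to zero, which is the source of the variable selection and interpretability the abstract mentions.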
7

Mahmood, Nozad. "Sparse Ridge Fusion For Linear Regression." Master's thesis, University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5986.

Full text
Abstract:
For a linear regression, the traditional technique deals with a case where the number of observations n more than the number of predictor variables p (n>p). In the case n<p, the classical method fails to estimate the coefficients. A solution of this problem in the case of correlated predictors is provided in this thesis. A new regularization and variable selection is proposed under the name of Sparse Ridge Fusion (SRF). In the case of highly correlated predictor , the simulated examples and a real data show that the SRF always outperforms the lasso, elastic net, and the S-Lasso, and the result
APA, Harvard, Vancouver, ISO, and other styles
8

Mo, Lili. "A class of operator splitting methods for least absolute shrinkage and selection operator (LASSO) models." HKBU Institutional Repository, 2012. https://repository.hkbu.edu.hk/etd_ra/1391.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hashem, Hussein Abdulahman. "Regularized and robust regression methods for high dimensional data." Thesis, Brunel University, 2014. http://bura.brunel.ac.uk/handle/2438/9197.

Full text
Abstract:
Recently, variable selection in high-dimensional data has attracted much research interest. Classical stepwise subset selection methods are widely used in practice, but when the number of predictors is large these methods are difficult to implement. In these cases, modern regularization methods have become a popular choice as they perform variable selection and parameter estimation simultaneously. However, the estimation procedure becomes more difficult and challenging when the data suffer from outliers or when the assumption of normality is violated such as in the case of heavy-tailed errors.
APA, Harvard, Vancouver, ISO, and other styles
10

Al-Kenani, Ali J. Kadhim. "Some statistical methods for dimension reduction." Thesis, Brunel University, 2013. http://bura.brunel.ac.uk/handle/2438/7727.

Full text
Abstract:
The aim of the work in this thesis is to carry out dimension reduction (DR) for high dimensional (HD) data by using statistical methods for variable selection, feature extraction and a combination of the two. In Chapter 2, the DR is carried out through robust feature extraction. Robust canonical correlation (RCCA) methods have been proposed. In the correlation matrix of canonical correlation analysis (CCA), we suggest that the Pearson correlation should be substituted by robust correlation measures in order to obtain robust correlation matrices. These matrices have been employed for producing
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "And LASSO regression"

1

Mak, Carmen. Polychotomous logistic regression via the Lasso. 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Chan-Lau, Jorge A. Lasso Regressions and Forecasting Models in Applied Stress Testing. International Monetary Fund, 2017.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "And LASSO regression"

1

Chien, Peter, Xinwei Deng, and Chunfang Devon Lin. "Efficient Experimental Design for Lasso Regression." In Advances and Innovations in Statistics and Data Science. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08329-7_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Elements for Building Supervised Statistical Machine Learning Models." In Multivariate Statistical Machine Learning Methods for Genomic Prediction. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_3.

Full text
Abstract:
This chapter gives details of the linear multiple regression model including assumptions and some pros and cons, the maximum likelihood. Gradient descendent methods are described for learning the parameters under this model. Penalized linear multiple regression is derived under Ridge and Lasso penalties, which also emphasizes the estimation of the regularization parameter of importance for its successful implementation. Examples are given for both penalties (Ridge and Lasso) and but not for penalized regression multiple regression framework for illustrating the circumstances when the p
APA, Harvard, Vancouver, ISO, and other styles
3

Schmidt, Daniel F., and Enes Makalic. "Robust Lasso Regression with Student-t Residuals." In AI 2017: Advances in Artificial Intelligence. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-63004-5_29.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Roy, Sanjiban Sekhar, Dishant Mittal, Avik Basu, and Ajith Abraham. "Stock Market Forecasting Using LASSO Linear Regression Model." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-13572-4_31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Maciak, Matúš. "Testing Shape Constraints in Lasso Regularized Joinpoint Regression." In Analytical Methods in Statistics. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-51313-3_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Taylan, Pakize, and Gerhard Wilhelm Weber. "CG-Lasso Estimator for Multivariate Adaptive Regression Spline." In Nonlinear Systems and Complexity. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-90972-1_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Feng, Boning Bernice, Avi Giloni, and Jeffrey S. Simonoff. "The Conditional Breakdown Properties of LAD-LASSO Regression." In Statistical Outliers and Related Topics. CRC Press, 2024. http://dx.doi.org/10.1201/9781003379881-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Cleophas, Ton J., and Aeilko H. Zwinderman. "Regularized Regression Analysis, Ridge, Lasso, Elastic Net Coefficients." In Application of Regularized Regressions to Identify Novel Predictors in Clinical Research. Springer Nature Switzerland, 2024. https://doi.org/10.1007/978-3-031-72247-9_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Ng, Pei Yeen, Elayaraja Aruchunan, Fumitaka Furuoka, Samsul Ariffin Abdul Karim, Jackel Vui Lung Chew, and Majid Khan Majahar Ali. "Intelligent LASSO Regression Modelling for Seaweed Drying Analysis." In Studies in Systems, Decision and Control. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-67317-7_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Yüzbaşı, Bahadır, Syed Ejaz Ahmed, Mohammad Arashi, and Mina Norouzirad. "LAD, LASSO and Related Strategies in Regression Models." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-21248-3_32.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "And LASSO regression"

1

Fu, Manxia, and Shuisheng Zhou. "Fused Lasso Additive Quantile Regression." In 2023 International Conference on Computer, Internet of Things and Smart City (CIoTSC). IEEE, 2023. http://dx.doi.org/10.1109/ciotsc60428.2023.00040.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Kou, Xuewei, Qingguo Du, Longting Huang, Honghai Wang, and Zhengying Li. "Highway vehicle detection based on ridge-LASSO regression." In 4th International Conference on Image Processing and Intelligent Control (IPIC 2024), edited by Kelin Du and Azlan bin Mohd Zain. SPIE, 2024. http://dx.doi.org/10.1117/12.3038530.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Riupassa, Ascendiazorg, Putri Ireine Rambi, Muhammad Amien Ibrahim, and Renaldy Fredyan. "Analyzing Sleep Health from Lifestyle Data Using Lasso Regression." In 2024 International Conference on Information Management and Technology (ICIMTech). IEEE, 2024. https://doi.org/10.1109/icimtech63123.2024.10780829.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Dai, Jollen. "Analyzing Factors Influencing Crime Rates in Communities by Lasso Regression." In 2024 IEEE MIT Undergraduate Research Technology Conference (URTC). IEEE, 2024. https://doi.org/10.1109/urtc65039.2024.10937580.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Challapalli, Kausik, Prashanthi Thota, Harsha Garikapati, Veera Manikanta Sai Adusumilli, Satish Anamalamudi, and Reddy Priya Madupuri. "Real Estate Price Prediction: Optimized Ridge and Lasso Regression Analysis." In 2024 OITS International Conference on Information Technology (OCIT). IEEE, 2024. https://doi.org/10.1109/ocit65031.2024.00075.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kumar, Chennupati Charan, and V. Parthipan. "Performance Analysis of Predicting LIC Stock Price using Lasso Regression Compared with Random Forest Regression." In 2024 Second International Conference Computational and Characterization Techniques in Engineering & Sciences (IC3TES). IEEE, 2024. https://doi.org/10.1109/ic3tes62412.2024.10877466.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Sun, Xiaojing. "Lasso Regression with Radial Basis Function based Intelligent Music Teaching System." In 2024 First International Conference on Software, Systems and Information Technology (SSITCON). IEEE, 2024. https://doi.org/10.1109/ssitcon62437.2024.10797056.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Santhoshkumar, M., and V. Divya. "Fake News Detection Through Feature Weight Optimized Lasso Regression (FWO-LAR)." In 2024 International Conference on Expert Clouds and Applications (ICOECA). IEEE, 2024. http://dx.doi.org/10.1109/icoeca62351.2024.00051.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Aelgani, Vivekanand, E. Jothi, Ramy Riad Hussein, Banoth Ramesh, Himanshu Sharma, and Uganya G. "Enhancing Supply Chain Regulatory Compliance with Blockchain and Lasso Regression Integration." In 2024 International Conference on IoT, Communication and Automation Technology (ICICAT). IEEE, 2024. https://doi.org/10.1109/icicat62666.2024.10923268.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Verma, Riddhi. "DMT2 Patient Recognition Using Cascaded Back Propagation Neural Network with Lasso Regression." In 2024 9th International Conference on Communication and Electronics Systems (ICCES). IEEE, 2024. https://doi.org/10.1109/icces63552.2024.10859755.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "And LASSO regression"

1

Wang, Lie, Victor Chernozhukov, and Alexandre Belloni. Pivotal estimation via square-root lasso in nonparametric regression. Institute for Fiscal Studies, 2013. http://dx.doi.org/10.1920/wp.cem.2013.6213.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Chung, Steve, Jaymin Kwon, and Yushin Ahn. Forecasting Commercial Vehicle Miles Traveled (VMT) in Urban California Areas. Mineta Transportation Institute, 2024. http://dx.doi.org/10.31979/mti.2024.2315.

Full text
Abstract:
This study investigates commercial truck vehicle miles traveled (VMT) across six diverse California counties from 2000 to 2020. The counties—Imperial, Los Angeles, Riverside, San Bernardino, San Diego, and San Francisco—represent a broad spectrum of California’s demographics, economies, and landscapes. Using a rich dataset spanning demographics, economics, and pollution variables, we aim to understand the factors influencing commercial VMT. We first visually represent the geographic distribution of the counties, highlighting their unique characteristics. Linear regression models, particularly
APA, Harvard, Vancouver, ISO, and other styles
3

Shin, Youngki, Sokbae (Simon) Lee, and Myung Hwan Seo. The lasso for high-dimensional regression with a possible change-point. Institute for Fiscal Studies, 2014. http://dx.doi.org/10.1920/wp.cem.2014.2614.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

de Luis, Mercedes, Emilio Rodríguez, and Diego Torres. Machine learning applied to active fixed-income portfolio management: a Lasso logit approach. Banco de España, 2023. http://dx.doi.org/10.53479/33560.

Full text
Abstract:
The use of quantitative methods constitutes a standard component of the institutional investors’ portfolio management toolkit. In the last decade, several empirical studies have employed probabilistic or classification models to predict stock market excess returns, model bond ratings and default probabilities, as well as to forecast yield curves. To the authors’ knowledge, little research exists into their application to active fixed-income management. This paper contributes to filling this gap by comparing a machine learning algorithm, the Lasso logit regression, with a passive (buy-and-hold)
APA, Harvard, Vancouver, ISO, and other styles
5

Rossi, Jose Luiz, Carlos Piccioni, Marina Rossi, and Daniel Cuajeiro. Brazilian Exchange Rate Forecasting in High Frequency. Inter-American Development Bank, 2022. http://dx.doi.org/10.18235/0004488.

Full text
Abstract:
We investigated the predictability of the Brazilian exchange rate at High Frequency (1, 5 and 15 minutes), using local and global economic variables as predictors. In addition to the Linear Regression method, we use Machine Learning algorithms such as Ridge, Lasso, Elastic Net, Random Forest and Gradient Boosting. When considering contemporary predictors, it is possible to outperform the Random Walk at all frequencies, with local economic variables having greater predictive power than global ones. Machine Learning methods are also capable of reducing the mean squared error. When we consider on
APA, Harvard, Vancouver, ISO, and other styles
6

Roldán-Ferrín, Felipe, and Julián A. Parra-Polania. Enhancing Inflation Nowcasting with Online Search Data: A Random Forest Application for Colombia. Banco de la República, 2025. https://doi.org/10.32468/be.1318.

Full text
Abstract:
This paper evaluates the predictive capacity of a machine learning model based on Random Forests (RF), combined with Google Trends (GT) data, for nowcasting monthly inflation in Colombia. The proposed RF-GT model is trained using historical inflation data, macroeconomic indicators, and internet search activity. After optimizing the model’s hyperparameters through time series cross-validation, we assess its out-of-sample performance over the period 2023–2024. The results are benchmarked against traditional approaches, including SARIMA, Ridge, and Lasso regressions, as well as professional forec
APA, Harvard, Vancouver, ISO, and other styles