Academic literature on the topic 'Lasso feature selection'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Lasso feature selection.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Lasso feature selection"

1

Muthukrishnan, R., and C. K. James. "The Effect of Multicollinearity on Feature Selection." Indian Journal of Science and Technology 17, no. 35 (2024): 3664–68. http://dx.doi.org/10.17485/ijst/v17i35.1876.

Abstract:
Objectives: To provide a new LASSO-based feature selection technique that aids in selecting important variables for predicting the response variables in case of multicollinearity. Methods: LASSO is a type of regression method employed to select important covariates for predicting a dependent variable. The traditional LASSO method uses the conventional Ordinary Least Squares (OLS) method for this purpose. The use of the OLS-based LASSO approach gives unreliable results if the data deviate from normality. Thus, this study recommends using a Redescending M-estimator-based LASSO approach. …
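To ground the discussion for readers skimming these entries, here is a minimal sketch of plain LASSO-based feature selection with scikit-learn. It illustrates only the conventional squared-error baseline that the paper above improves upon, not its Redescending M-estimator variant, and all data and names are illustrative.

```python
# Minimal LASSO feature-selection sketch (conventional squared-error loss);
# illustrative only, not the robust M-estimator variant proposed above.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.normal(size=(n, p))
# Only the first five features truly drive the response.
y = X[:, :5] @ np.array([3.0, -2.0, 1.5, 1.0, -1.0]) + rng.normal(scale=0.5, size=n)

# Standardize so the L1 penalty treats all coefficients comparably.
X_std = StandardScaler().fit_transform(X)
lasso = LassoCV(cv=5).fit(X_std, y)

selected = np.flatnonzero(lasso.coef_ != 0.0)
print("lambda chosen by CV:", lasso.alpha_)
print("selected feature indices:", selected)
```

Features whose coefficients are shrunk exactly to zero are dropped; multicollinearity among the survivors is precisely the situation the paper targets.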
2

Jain, Rahi, and Wei Xu. "HDSI: High dimensional selection with interactions algorithm on feature selection and testing." PLOS ONE 16, no. 2 (2021): e0246159. http://dx.doi.org/10.1371/journal.pone.0246159.

Abstract:
Feature selection on high-dimensional data along with interaction effects is a critical challenge for classical statistical learning techniques. Existing feature selection algorithms such as random LASSO leverage LASSO's capability to handle high-dimensional data. However, the technique has two main limitations, namely the inability to consider interaction terms and the lack of a statistical test for determining the significance of selected features. This study proposes a High Dimensional Selection with Interactions (HDSI) algorithm, a new feature selection method that can handle high-dimensional …
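For intuition about the random LASSO idea this abstract contrasts with, here is a hedged sketch that bootstraps rows, subsamples feature columns, and averages absolute LASSO coefficients into an importance score. The function name and parameters are invented for illustration, and HDSI itself additionally builds interaction terms and significance tests, which this sketch omits.

```python
# Random-LASSO-style importance sketch: bootstrap samples plus random
# feature subsets, with |coefficients| averaged across rounds. A generic
# illustration of the resampling idea, not the HDSI algorithm.
import numpy as np
from sklearn.linear_model import Lasso

def random_lasso_importance(X, y, n_rounds=200, q=10, alpha=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    importance = np.zeros(p)
    for _ in range(n_rounds):
        rows = rng.integers(0, n, size=n)            # bootstrap the rows
        cols = rng.choice(p, size=q, replace=False)  # random feature subset
        model = Lasso(alpha=alpha).fit(X[np.ix_(rows, cols)], y[rows])
        importance[cols] += np.abs(model.coef_)
    return importance / n_rounds  # near-zero scores suggest irrelevant features
```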
3

Yamada, Makoto, Wittawat Jitkrittum, Leonid Sigal, Eric P. Xing, and Masashi Sugiyama. "High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso." Neural Computation 26, no. 1 (2014): 185–207. http://dx.doi.org/10.1162/neco_a_00537.

Abstract:
The goal of supervised feature selection is to find a subset of input features that are responsible for predicting output values. The least absolute shrinkage and selection operator (Lasso) allows computationally efficient feature selection based on linear dependency between input features and output values. In this letter, we consider a feature-wise kernelized Lasso for capturing nonlinear input-output dependency. We first show that with particular choices of kernel functions, nonredundant features with strong statistical dependence on output values can be found in terms of kernel-based independence measures…
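As a rough illustration of the kernel-based dependence measure behind this line of work, the sketch below scores each feature by a (biased) empirical Hilbert-Schmidt independence criterion against the output using Gaussian kernels. The helper names are assumptions made for this sketch; the full HSIC Lasso additionally solves an L1-penalized problem over the kernelized features to remove redundant ones, which this ranking-only version leaves out.

```python
# Feature-wise HSIC scoring sketch (Gaussian kernels). Ranks features by
# nonlinear dependence on the output; the Lasso-based redundancy removal of
# the actual method is omitted here.
import numpy as np

def gaussian_gram(v, sigma=1.0):
    d2 = (v[:, None] - v[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic_score(x, y):
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    K, L = gaussian_gram(x), gaussian_gram(y)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2  # biased empirical HSIC

def rank_features(X, y):
    scores = np.array([hsic_score(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1]                # most dependent first
```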
4

Emily Esther Rani, K., and S. Baulkani. "Multi Variate Feature Extraction and Feature Selection using LGKFS Algorithm for Detecting Alzheimer's Disease." Indian Journal of Science and Technology 16, no. 22 (2023): 1665–75. https://doi.org/10.17485/IJST/v16i22.707.

Abstract:
Objectives: This study focuses on machine learning techniques to classify various stages of Alzheimer's Disease (AD). Methods: In total, 1,997 PD-weighted Resting State Functional MRI (rsFMRI) images were acquired from the ADNI-3 dataset for the classification of AD. First, input rsFMRI images from the dataset were preprocessed and segmented. After segmentation, we extracted multivariate features. Then, we proposed the Lasso with Graph Kernel Feature Selection (LGKFS) algorithm for selecting the best features. Finally, Random Forest…
5

Huang, Qiang, Tingyu Xia, Huiyan Sun, Makoto Yamada, and Yi Chang. "Unsupervised Nonlinear Feature Selection from High-Dimensional Signed Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 4182–89. http://dx.doi.org/10.1609/aaai.v34i04.5839.

Abstract:
With the rapid development of social media services in recent years, relational data are explosively growing. The signed network, which consists of a mixture of positive and negative links, is an effective way to represent the friendly and hostile relations among nodes, which can represent users or items. Because the features associated with a node of a signed network are usually incomplete, noisy, unlabeled, and high-dimensional, feature selection is an important procedure to eliminate irrelevant features. However, existing network-based feature selection methods are linear methods…
6

Mai, Jifang, Shaohua Zhang, Haiqing Zhao, and Lijun Pan. "Factor Investment or Feature Selection Analysis?" Mathematics 13, no. 1 (2024): 9. https://doi.org/10.3390/math13010009.

Abstract:
This study has made significant findings in A-share market data processing and portfolio management. Firstly, by adopting the Lasso method and CPCA framework, we effectively addressed the problem of multicollinearity among feature indicators, with the Lasso method demonstrating superior performance in handling this issue, thus providing a new method for financial data processing. Secondly, Deep Feedforward Neural Networks (DFN) exhibited exceptional performance in portfolio management, significantly outperforming other evaluated machine learning methods and achieving high levels of out-of-sample…
7

Patil, Abhijeet R., and Sangjin Kim. "Combination of Ensembles of Regularized Regression Models with Resampling-Based Lasso Feature Selection in High Dimensional Data." Mathematics 8, no. 1 (2020): 110. http://dx.doi.org/10.3390/math8010110.

Abstract:
In high-dimensional data, the performances of various classifiers are largely dependent on the selection of important features. Most of the individual classifiers with the existing feature selection (FS) methods do not perform well for highly correlated data. Obtaining important features using the FS method and selecting the best performing classifier is a challenging task in high throughput data. In this article, we propose a combination of resampling-based least absolute shrinkage and selection operator (LASSO) feature selection (RLFS) and ensembles of regularized regression (ERRM)…
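The resampling idea in this abstract can be pictured with a stability-style sketch: refit the LASSO on many random subsamples and keep features that are selected in a large fraction of fits. The helper below is a generic, assumed illustration, not the paper's exact RLFS procedure or its ERRM classifier stage.

```python
# Resampled-LASSO selection-frequency sketch: features that repeatedly
# survive the L1 penalty across subsamples are considered stable. Generic
# illustration only, not the paper's RLFS/ERRM pipeline.
import numpy as np
from sklearn.linear_model import Lasso

def lasso_selection_frequency(X, y, alpha=0.1, n_rounds=100, frac=0.7, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_rounds):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        coef = Lasso(alpha=alpha).fit(X[idx], y[idx]).coef_
        counts += (coef != 0.0)
    return counts / n_rounds

# e.g., keep features selected in at least 80% of the resampled fits:
# stable = np.flatnonzero(lasso_selection_frequency(X, y) >= 0.8)
```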
8

Xie, Zongxia, and Yong Xu. "Sparse group LASSO based uncertain feature selection." International Journal of Machine Learning and Cybernetics 5, no. 2 (2013): 201–10. http://dx.doi.org/10.1007/s13042-013-0156-6.

9

Ming, Di, Chris Ding, and Feiping Nie. "A Probabilistic Derivation of LASSO and L12-Norm Feature Selections." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 4586–93. http://dx.doi.org/10.1609/aaai.v33i01.33014586.

Abstract:
LASSO and ℓ2,1-norm based feature selection have achieved success in many application areas. In this paper, we first derive LASSO and ℓ1,2-norm feature selection from a probabilistic framework, which provides an independent point of view from the usual sparse coding point of view. From here, we further propose a feature selection approach based on the probability-derived ℓ1,2-norm. We point out an inflexibility in standard feature selection: the features selected for all different classes are enforced to be exactly the same when using the widely used ℓ2,1-norm, which enforces joint sparsity…
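For readers comparing the two penalties named in this abstract, their conventional definitions are sketched below for a d × c coefficient matrix W whose rows w^i are indexed by features and whose columns are indexed by classes. The notation follows the common textbook convention and is not taken from the paper itself.

```latex
% l_{2,1}: l_2 within each feature row, summed across rows; zeroes whole
% rows, so the same features are selected for every class (joint sparsity).
\|W\|_{2,1} = \sum_{i=1}^{d} \|w^{i}\|_{2}
            = \sum_{i=1}^{d} \Big( \sum_{j=1}^{c} w_{ij}^{2} \Big)^{1/2}

% Exclusive-lasso-style l_{1,2} penalty: l_1 within each row, then squared
% and summed; weights within a row compete, so different classes may keep
% different features.
\|W\|_{1,2}^{2} = \sum_{i=1}^{d} \Big( \sum_{j=1}^{c} |w_{ij}| \Big)^{2}
```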

Dissertations / Theses on the topic "Lasso feature selection"

1

Hu, Qing. "Predictor Selection in Linear Regression: L1 regularization of a subset of parameters and Comparison of L1 regularization and stepwise selection." Electronic thesis, Worcester Polytechnic Institute, 2007. http://www.wpi.edu/Pubs/ETD/Available/etd-051107-154052/.

2

Ocloo, Isaac Xoese. "Energy Distance Correlation with Extended Bayesian Information Criteria for feature selection in high dimensional models." Bowling Green State University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1625238661031258.

3

Huynh, Bao Tuyen. "Estimation and feature selection in high-dimensional mixtures-of-experts models." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMC237.

Abstract:
This thesis deals with the modeling and estimation of high-dimensional mixture-of-experts models, aimed at efficient density estimation, prediction, and classification for such complex data, which are both heterogeneous and high-dimensional. We propose new strategies based on regularized maximum-likelihood estimation of the models to overcome the limitations of standard methods, including maximum-likelihood estimation with expectation-maximization (EM) algorithms, and to simultaneously perform the selection of relevant variables so as to encourage sparse solutions…
4

Ehrlinger, John M. "Regularization: Stagewise Regression and Bagging." Case Western Reserve University School of Graduate Studies / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=case1300817082.

5

Sanchez Merchante, Luis Francisco. "Learning algorithms for sparse classification." PhD thesis, Université de Technologie de Compiègne, 2013. http://tel.archives-ouvertes.fr/tel-00868847.

Abstract:
This thesis deals with the development of estimation algorithms with embedded feature selection in the context of high-dimensional data, in the supervised and unsupervised frameworks. The contributions of this work are materialized by two algorithms: GLOSS for the supervised domain and Mix-GLOSS for the unsupervised counterpart. Both algorithms are based on the resolution of optimal scoring regression regularized with a quadratic formulation of the group-Lasso penalty, which encourages the removal of uninformative features. The theoretical foundations that prove that a group-Lasso penalized optimal scoring…
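For reference, the group-Lasso penalty this thesis builds on is conventionally written as follows, with coefficients partitioned into predefined groups g of size p_g. This is the standard Yuan-Lin formulation, not notation taken from the thesis itself.

```latex
% Group-Lasso penalty: an l_2 norm within each group, summed (l_1-like)
% across groups, so entire groups of coefficients are dropped together.
\Omega(w) = \lambda \sum_{g \in \mathcal{G}} \sqrt{p_g}\, \|w_g\|_{2}
% p_g is the size of group g; the sqrt(p_g) weighting is a common default.
```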
6

Peterson, Ryan Andrew. "Ranked sparsity: a regularization framework for selecting features in the presence of prior informational asymmetry." Diss., University of Iowa, 2019. https://ir.uiowa.edu/etd/6834.

Abstract:
In this dissertation, we explore and illustrate the concept of ranked sparsity, a phenomenon that often occurs naturally in the presence of derived variables. Ranked sparsity arises in modeling applications when an expected disparity exists in the quality of information between different feature sets. Its presence can cause traditional model selection methods to fail because statisticians commonly presume that each potential parameter is equally worthy of entering into the final model - we call this principle "covariate equipoise". However, this presumption does not always hold, especially in…
7

Su, Huiting. "Optimal Parameter Setting of Single and Multi-Task LASSO." Thesis, 2019.

Abstract:
This thesis considers the problem of feature selection when the number of predictors is larger than the number of samples. The performance of supersaturated design (SSD) working with the least absolute shrinkage and selection operator (LASSO) is studied in this setting. In order to achieve higher feature selection correctness, self-voting LASSO is implemented to select the tuning parameter while approximately optimizing the probability of achieving Sign Correctness. Furthermore, we derive the probability of achieving Direction Correctness and extend the self-voting LASSO to multi-task self-voting LASSO…
8

Noro, Catarina Vieira. "Determinants of households' consumption in Portugal - a machine learning approach." Master's thesis, 2021. http://hdl.handle.net/10362/121884.

Abstract:
Machine Learning has been widely adopted by researchers in several academic fields. Although at a slow pace, the field of economics has also started to acknowledge the possibilities of these algorithm-based methods for complementing or even replacing traditional Econometric approaches. This research aims to apply Machine Learning data-driven variable selection models for assessing the determinants of Portuguese households' consumption using the Household Finance and Consumption Survey. I found that LASSO Regression and Elastic Net have the best performance in this setting and that wealth-related…

Book chapters on the topic "Lasso feature selection"

1

Jiang, Bo, Chris Ding, and Bin Luo. "Covariate-Correlated Lasso for Feature Selection." In Machine Learning and Knowledge Discovery in Databases. Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-662-44848-9_38.

2

Amaral Santos, Paula L., Sultan Imangaliyev, Klamer Schutte, and Evgeni Levin. "Feature Selection via Co-regularized Sparse-Group Lasso." In Lecture Notes in Computer Science. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-51469-7_10.

3

Li, Fuwei, Lifeng Lai, and Shuguang Cui. "On the Adversarial Robustness of LASSO Based Feature Selection." In Machine Learning Algorithms. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16375-3_3.

4

Zhou, Caifa, and Andreas Wieser. "Jaccard Analysis and LASSO-Based Feature Selection for Location Fingerprinting with Limited Computational Complexity." In Lecture Notes in Geoinformation and Cartography. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-71470-7_4.

5

Awasthi, Naimisha, and Prateek Raj Gautam. "Android ransomware network traffic detection using decision tree and L1 LASSO regularization feature selection." In Intelligent Computing and Communication Techniques. CRC Press, 2025. https://doi.org/10.1201/9781003530190-104.

6

Chen, Xiangyu, Yanwu Xu, Shuicheng Yan, et al. "Discriminative Feature Selection for Multiple Ocular Diseases Classification by Sparse Induced Graph Regularized Group Lasso." In Lecture Notes in Computer Science. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-24571-3_2.

7

Dhumane, Amol V., Priyanka Kaldate, Ankita Sawant, Prajwal Kadam, and Vinay Chopade. "Efficient Prediction of Cardiovascular Disease Using Machine Learning Algorithms with Relief and LASSO Feature Selection Techniques." In International Conference on Innovative Computing and Communications. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3315-0_52.

8

Cruz, Jose, Wilson Mamani, Christian Romero, and Ferdinand Pineda. "Multi-parameter Regression of Photovoltaic Systems using Selection of Variables with the Method: Recursive Feature Elimination for Ridge, Lasso and Bayes." In Machine Learning, Optimization, and Data Science. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-64580-9_16.

9

Wu, Weili, Jun Yuan, and Min Lv. "Current Status and Outlook of Artificial Intelligence Education Research in the Last Decade - Knowledge Graph Feature Selection and Visualisation Analysis Based on Lasso Regression and CiteSpace." In Advances in Social Science, Education and Humanities Research. Atlantis Press SARL, 2025. https://doi.org/10.2991/978-2-38476-400-6_57.

10

Ipekten, Funda, Gözde Ertürk Zararsız, Halef Okan Doğan, Vahap Eldem, and Gökmen Zararsız. "Best Practices of Feature Selection in Multi-Omics Data." In Encyclopedia of Data Science and Machine Learning. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-7998-9220-5.ch122.

Abstract:
With the recent advances in molecular biology techniques such as next-generation sequencing, mass spectrometry, etc., large omics data are produced. Using such data, the expression levels of thousands of molecular features (genes, proteins, metabolites, etc.) can be quantified and associated with diseases. The fact that multiple omics data contain different types of data and the number of analyzed variables increases the complexity of the models created with machine learning methods. In addition, due to the many variables, the investigation of molecular variables associated with diseases is very…

Conference papers on the topic "Lasso feature selection"

1

Mojahid, Hafiza Zoya, Jasni Mohamad Zain, Marina Yusoff, Abdul Basit, Abdul Kadir Jumaat, and Mushtaq Ali. "Refining COVID-19 Biomarker Classification through Automated LASSO-PCA Feature Selection." In 2024 IEEE 22nd Student Conference on Research and Development (SCOReD). IEEE, 2024. https://doi.org/10.1109/scored64708.2024.10872762.

2

Peya, Zahrul Jannat, Nurzahan Akter Joly, Md Sharzul Mostafa, Sharmina Al-Azad, Md Shymon Islam, and Sk Ahadul Alam. "Alzheimers Disease Detection through LASSO and RFE based Feature Selection from EEG Data." In 2024 IEEE International Conference on Biomedical Engineering, Computer and Information Technology for Health (BECITHCON). IEEE, 2024. https://doi.org/10.1109/becithcon64160.2024.10962755.

3

Royhan, Wilda, Sutarman, and Amalia Amalia. "Feature Selection Using Ensemble Lasso Regression, Random Forest and Recursive Feature Elimination Methods in Breast Cancer Classification." In 2025 International Conference on Computer Sciences, Engineering, and Technology Innovation (ICoCSETI). IEEE, 2025. https://doi.org/10.1109/icocseti63724.2025.11020560.

4

Cheng, Jiali, Zhiqiang Cai, Chen Shen, and Ting Wang. "LASSO-BN for Selection and Optimization of Product Critical Quality Features." In 2024 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM). IEEE, 2024. https://doi.org/10.1109/ieem62345.2024.10857001.

5

Kim, Yongdai, and Jinseog Kim. "Gradient LASSO for feature selection." In Proceedings of the Twenty-First International Conference on Machine Learning (ICML '04). ACM Press, 2004. http://dx.doi.org/10.1145/1015330.1015364.

6

Gauraha, Niharika. "Stability Feature Selection using Cluster Representative LASSO." In International Conference on Pattern Recognition Applications and Methods. SCITEPRESS - Science and Technology Publications, 2016. http://dx.doi.org/10.5220/0005827003810386.

7

Ming, Di, and Chris Ding. "Robust Flexible Feature Selection via Exclusive L21 Regularization." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/438.

Abstract:
Recently, exclusive lasso has demonstrated its promising results in selecting discriminative features for each class. The sparsity is enforced on each feature across all the classes via the L12-norm. However, the exclusive sparsity of the L12-norm could not screen out a large amount of irrelevant and redundant noise features in high-dimensional data space, since each feature belongs to at least one class. Thus, in this paper, we introduce a novel regularization called "exclusive L21", which is short for "L21 with exclusive lasso", towards robust flexible feature selection. The exclusive L21 regularization…
8

Kumarage, Prabha M., B. Yogarajah, and Nagulan Ratnarajah. "Efficient Feature Selection for Prediction of Diabetic Using LASSO." In 2019 19th International Conference on Advances in ICT for Emerging Regions (ICTer). IEEE, 2019. http://dx.doi.org/10.1109/icter48817.2019.9023720.

9

Li, Chengwen, Jianhui Li, and Jiadong Zhu. "Multi-label feature selection algorithm based on HSIC-Lasso." In International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2022), edited by Ruishi Liang and Jing Wang. SPIE, 2023. http://dx.doi.org/10.1117/12.2674561.

10

Wu, Fei, Ying Yuan, and Yueting Zhuang. "Heterogeneous feature selection by group lasso with logistic regression." In Proceedings of the International Conference on Multimedia (MM '10). ACM Press, 2010. http://dx.doi.org/10.1145/1873951.1874129.


Reports on the topic "Lasso feature selection"

1

Chung, Steve, Jaymin Kwon, and Yushin Ahn. Forecasting Commercial Vehicle Miles Traveled (VMT) in Urban California Areas. Mineta Transportation Institute, 2024. http://dx.doi.org/10.31979/mti.2024.2315.

Abstract:
This study investigates commercial truck vehicle miles traveled (VMT) across six diverse California counties from 2000 to 2020. The counties—Imperial, Los Angeles, Riverside, San Bernardino, San Diego, and San Francisco—represent a broad spectrum of California’s demographics, economies, and landscapes. Using a rich dataset spanning demographics, economics, and pollution variables, we aim to understand the factors influencing commercial VMT. We first visually represent the geographic distribution of the counties, highlighting their unique characteristics. Linear regression models, particularly…
2

Yang, Yu, Hen-Geul Yeh, and Cesar Ortiz. Battery Management System Development for Electric Vehicles and Fast Charging Infrastructure Improvement. Mineta Transportation Institute, 2024. http://dx.doi.org/10.31979/mti.2024.2325.

Abstract:
The electric vehicle (EV) has become increasingly popular due to its being zero-emission. However, a significant challenge faced by EV drivers is the range anxiety associated with battery usage. Addressing this concern, this project develops a more efficient battery management system (BMS) for electric vehicles based on a real-time, state-of-charge (SOC) estimation. The proposed study delivers three modules: (1) a new equivalent circuit model (ECM) for lithium-ion batteries, (2) a new SOC estimator based on the moving horizon method, and (3) an on-board FPGA implementation of the classical Coulomb counting…
We offer discounts on all premium plans for authors whose works are included in thematic literature selections. Contact us to get a unique promo code!