Journal articles on the topic 'Bayesian LASSO'

Consult the top 50 journal articles for your research on the topic 'Bayesian LASSO.'

1. Park, Trevor, and George Casella. "The Bayesian Lasso." Journal of the American Statistical Association 103, no. 482 (June 1, 2008): 681–86. http://dx.doi.org/10.1198/016214508000000337.
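Park and Casella's formulation represents the Laplace prior on the coefficients as a scale mixture of normals, which yields a three-block Gibbs sampler (coefficients, error variance, local scales). The sketch below is an illustrative NumPy implementation under that hierarchy, with the penalty parameter `lam` held fixed rather than estimated as the paper also allows; the function name and all tuning values are our own choices, not the paper's.

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=1500, burn=500, seed=0):
    """Gibbs sampler for the Bayesian lasso (Park & Casella, 2008).

    Hierarchy: beta_j | sigma2, tau2_j ~ N(0, sigma2 * tau2_j),
               tau2_j ~ Exp(lam^2 / 2),
    so that marginally beta_j | sigma2 has a Laplace prior.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    sigma2, inv_tau2 = 1.0, np.ones(p)
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), A = X'X + D_tau^{-1}
        A = XtX + np.diag(inv_tau2)
        A_inv = np.linalg.inv(A)
        cov = sigma2 * (A_inv + A_inv.T) / 2  # symmetrize for numerical safety
        beta = rng.multivariate_normal(A_inv @ Xty, cov)
        # sigma2 | rest ~ Inverse-Gamma((n-1)/2 + p/2, rate)
        resid = y - X @ beta
        shape = (n - 1) / 2 + p / 2
        rate = resid @ resid / 2 + beta @ (inv_tau2 * beta) / 2
        sigma2 = rate / rng.gamma(shape)
        # 1/tau2_j | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        if it >= burn:
            draws.append(beta.copy())
    return np.array(draws)
```

The posterior mean of the retained draws then serves as the shrinkage estimate; credible intervals come directly from the same draws, which is the main practical advantage over the original lasso point estimate.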
2. Hans, C. "Bayesian lasso regression." Biometrika 96, no. 4 (September 24, 2009): 835–45. http://dx.doi.org/10.1093/biomet/asp047.

3. Leng, Chenlei, Minh-Ngoc Tran, and David Nott. "Bayesian adaptive Lasso." Annals of the Institute of Statistical Mathematics 66, no. 2 (September 3, 2013): 221–44. http://dx.doi.org/10.1007/s10463-013-0429-6.
4. Kadhim Abbas, Haider. "Bayesian Lasso Tobit regression." Journal of Al-Qadisiyah for Computer Science and Mathematics 11, no. 2 (August 26, 2019): 1–13. http://dx.doi.org/10.29304/jqcm.2019.11.2.553.

Abstract: This research proposes a new approach to model selection in Tobit regression: Bayesian Lasso Tobit regression (BLTR), which combines efficient estimation with a variable selection property. A new hierarchical model is introduced, along with a corresponding Gibbs sampler. The approach is then extended by adding a ridge parameter inside the variance-covariance matrix, which avoids singularity under multicollinearity or when the number of predictors exceeds the number of observations. Comparisons with previous techniques on simulated examples and real data gave promising results, outperforming the earlier methods.
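The ridge extension described in the abstract can be illustrated with a small sketch: adding a ridge term to the precision matrix of the coefficient update keeps it invertible when predictors are collinear or when p > n, where the plain cross-product matrix would be singular. The function name, the fixed ridge value, and the use of a plain posterior-mean solve in place of a full Gibbs draw are all assumptions for illustration, not the paper's exact sampler.

```python
import numpy as np

def ridge_stabilized_mean(X, y, inv_tau2, ridge=1e-3):
    """Posterior-mean-style coefficient update for a Bayesian-lasso
    conditional, with a ridge term added to the precision matrix.

    A = X'X + diag(1/tau^2) + ridge * I stays positive definite even
    when X'X is rank-deficient (collinear columns, or p > n).
    `ridge` is an illustrative stabilization constant.
    """
    p = X.shape[1]
    A = X.T @ X + np.diag(inv_tau2) + ridge * np.eye(p)
    return np.linalg.solve(A, X.T @ y)
```

Without the `ridge * np.eye(p)` term, duplicated predictor columns combined with vanishing `1/tau^2` values would make the solve ill-posed; with it, the update is always well-defined.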
5. Mallick, Himel, Rahim Alhamzawi, Erina Paul, and Vladimir Svetnik. "The reciprocal Bayesian LASSO." Statistics in Medicine 40, no. 22 (June 14, 2021): 4830–49. http://dx.doi.org/10.1002/sim.9098.

6. Chu, Haitao, Joseph G. Ibrahim, Zakaria S. Khondker, Weili Lin, and Hongtu Zhu. "The Bayesian covariance lasso." Statistics and Its Interface 6, no. 2 (2013): 243–59. http://dx.doi.org/10.4310/sii.2013.v6.n2.a8.

7. Mallick, Himel, and Nengjun Yi. "A new Bayesian lasso." Statistics and Its Interface 7, no. 4 (2014): 571–82. http://dx.doi.org/10.4310/sii.2014.v7.n4.a12.

8. Kawano, Shuichi, Ibuki Hoshina, Kaito Shimamura, and Sadanori Konishi. "Predictive Model Selection Criteria for Bayesian Lasso Regression." Journal of the Japanese Society of Computational Statistics 28, no. 1 (2015): 67–82. http://dx.doi.org/10.5183/jjscs.1501001_220.

9. Alhamzawi, Rahim, and Keming Yu. "Bayesian Lasso-mixed quantile regression." Journal of Statistical Computation and Simulation 84, no. 4 (October 12, 2012): 868–80. http://dx.doi.org/10.1080/00949655.2012.731689.

10. Alhamzawi, Rahim, and Haithem Taha Mohammad Ali. "The Bayesian adaptive lasso regression." Mathematical Biosciences 303 (September 2018): 75–82. http://dx.doi.org/10.1016/j.mbs.2018.06.004.

11. Alhamzawi, Rahim, Keming Yu, and Dries F. Benoit. "Bayesian adaptive Lasso quantile regression." Statistical Modelling: An International Journal 12, no. 3 (May 18, 2012): 279–97. http://dx.doi.org/10.1177/1471082x1101200304.

12. Benoit, Dries F., Rahim Alhamzawi, and Keming Yu. "Bayesian lasso binary quantile regression." Computational Statistics 28, no. 6 (July 28, 2013): 2861–73. http://dx.doi.org/10.1007/s00180-013-0439-0.

13. Alhamzawi, Rahim, and Haithem Taha Mohammad Ali. "Bayesian Iterative Adaptive Lasso Quantile Regression." IOSR Journal of Mathematics 13, no. 03 (June 2017): 38–42. http://dx.doi.org/10.9790/5728-1303033842.

14. Linder, Daniel F., Viral Panchal, Hani Samawi, and Duchwan Ryu. "Balanced Bayesian LASSO for heavy tails." Journal of Statistical Computation and Simulation 86, no. 6 (June 15, 2015): 1115–32. http://dx.doi.org/10.1080/00949655.2015.1053886.

15. Khare, Kshitij, and James P. Hobert. "Geometric ergodicity of the Bayesian lasso." Electronic Journal of Statistics 7 (2013): 2150–63. http://dx.doi.org/10.1214/13-ejs841.

16. Shimamura, Kaito, Shuichi Kawano, and Sadanori Konishi. "Bayesian Lasso Regression Modeling via Model Averaging." Japanese Journal of Applied Statistics 44, no. 3 (2015): 101–17. http://dx.doi.org/10.5023/jappstat.44.101.

17. Alhseeni, Ameer Musa Imran, and Ali Abdulmohsin Abdulraeem Al-rubaye. "New penalized Bayesian adaptive lasso binary regression." Periodicals of Engineering and Natural Sciences (PEN) 9, no. 1 (February 21, 2021): 285. http://dx.doi.org/10.21533/pen.v9i1.1800.

18. Yi, Nengjun, and Shizhong Xu. "Bayesian LASSO for Quantitative Trait Loci Mapping." Genetics 179, no. 2 (May 27, 2008): 1045–55. http://dx.doi.org/10.1534/genetics.107.085589.

19. Guo, Ruixin, Hongtu Zhu, Sy-Miin Chow, and Joseph G. Ibrahim. "Bayesian Lasso for Semiparametric Structural Equation Models." Biometrics 68, no. 2 (February 29, 2012): 567–77. http://dx.doi.org/10.1111/j.1541-0420.2012.01751.x.

20. Chen, Xiaohui, Z. Jane Wang, and Martin J. McKeown. "A Bayesian Lasso via reversible-jump MCMC." Signal Processing 91, no. 8 (August 2011): 1920–32. http://dx.doi.org/10.1016/j.sigpro.2011.02.014.

21. Alhamzawi, Rahim, and Haithem Taha Mohammad Ali. "A new Gibbs sampler for Bayesian lasso." Communications in Statistics - Simulation and Computation 49, no. 7 (November 15, 2018): 1855–71. http://dx.doi.org/10.1080/03610918.2018.1508699.
22. Sami, Zainab, and Taha Alshaybawee. "Bayesian Variable Selection for Semiparametric Logistic Regression." Al-Qadisiyah Journal of Pure Science 26, no. 5 (December 24, 2021): 44–57. http://dx.doi.org/10.29350/qjps.2021.26.5.1460.

Abstract: Lasso variable selection is an attractive approach to improving prediction accuracy. A Bayesian lasso approach is proposed to estimate and select the important variables in a single-index logistic regression model. A Laplace prior is placed on the coefficient vector and a Gaussian-process prior on the unknown link function. A hierarchical Bayesian lasso semiparametric logistic regression model is constructed, and an MCMC algorithm is adopted for posterior inference. The performance of the proposed method (BSLLR) is evaluated by comparing it with three existing methods: BLR, BPR, and BBQR. In both simulation examples and real data, BSLLR attains the smallest bias, SD, MSE, and MAE, performing better than the other methods.
23. Li, Bohan, and Juan Wu. "Bayesian bootstrap adaptive lasso estimators of regression models." Journal of Statistical Computation and Simulation 91, no. 8 (January 11, 2021): 1651–80. http://dx.doi.org/10.1080/00949655.2020.1865959.

24. Hoshina, Ibuki. "Sparse Regression Modeling via the MAP Bayesian Lasso." Bulletin of Informatics and Cybernetics 47 (December 2015): 37–58. http://dx.doi.org/10.5109/1909523.

25. Fu, Shuai. "Hierarchical Bayesian LASSO for a negative binomial regression." Journal of Statistical Computation and Simulation 86, no. 11 (October 29, 2015): 2182–203. http://dx.doi.org/10.1080/00949655.2015.1106541.

26. Dasgupta, Shibasish. "High-dimensional posterior consistency of the Bayesian lasso." Communications in Statistics - Theory and Methods 45, no. 22 (January 25, 2016): 6700–6708. http://dx.doi.org/10.1080/03610926.2014.966840.

27. Kadhim Abbas Hilali, Haider, and Rahim Alhamzawi. "Bayesian Adaptive Lasso binary regression with ridge parameter." Journal of Physics: Conference Series 1294 (September 2019): 032036. http://dx.doi.org/10.1088/1742-6596/1294/3/032036.

28. Češnovar, Rok, and Erik Štrumbelj. "Bayesian Lasso and multinomial logistic regression on GPU." PLOS ONE 12, no. 6 (June 28, 2017): e0180343. http://dx.doi.org/10.1371/journal.pone.0180343.

29. Betancourt, Brenda, Abel Rodríguez, and Naomi Boyd. "Bayesian Fused Lasso Regression for Dynamic Binary Networks." Journal of Computational and Graphical Statistics 26, no. 4 (October 2, 2017): 840–50. http://dx.doi.org/10.1080/10618600.2017.1341323.

30. An, Ziwen, Leah F. South, David J. Nott, and Christopher C. Drovandi. "Accelerating Bayesian Synthetic Likelihood With the Graphical Lasso." Journal of Computational and Graphical Statistics 28, no. 2 (February 19, 2019): 471–75. http://dx.doi.org/10.1080/10618600.2018.1537928.

31. Bedoui, Adel, and Nicole A. Lazar. "Bayesian empirical likelihood for ridge and lasso regressions." Computational Statistics & Data Analysis 145 (May 2020): 106917. http://dx.doi.org/10.1016/j.csda.2020.106917.

32. Xu, Xiaofan, and Malay Ghosh. "Bayesian Variable Selection and Estimation for Group Lasso." Bayesian Analysis 10, no. 4 (December 2015): 909–36. http://dx.doi.org/10.1214/14-ba929.

33. Hefley, Trevor J., Mevin B. Hooten, Ephraim M. Hanks, Robin E. Russell, and Daniel P. Walsh. "The Bayesian Group Lasso for Confounded Spatial Data." Journal of Agricultural, Biological and Environmental Statistics 22, no. 1 (January 5, 2017): 42–59. http://dx.doi.org/10.1007/s13253-016-0274-1.

34. Das, Kiranmoy, and Marc Sobel. "Dirichlet Lasso: A Bayesian approach to variable selection." Statistical Modelling: An International Journal 15, no. 3 (November 27, 2014): 215–32. http://dx.doi.org/10.1177/1471082x14551245.

35. Li, Jiahan, Kiranmoy Das, Guifang Fu, Runze Li, and Rongling Wu. "The Bayesian lasso for genome-wide association studies." Bioinformatics 27, no. 4 (December 14, 2010): 516–23. http://dx.doi.org/10.1093/bioinformatics/btq688.

36. Shimamura, Kaito, Masao Ueki, Shuichi Kawano, and Sadanori Konishi. "Bayesian generalized fused lasso modeling via NEG distribution." Communications in Statistics - Theory and Methods 48, no. 16 (November 17, 2018): 4132–53. http://dx.doi.org/10.1080/03610926.2018.1489056.

37. Wang, Hao. "Bayesian Graphical Lasso Models and Efficient Posterior Computation." Bayesian Analysis 7, no. 4 (December 2012): 867–86. http://dx.doi.org/10.1214/12-ba729.
38. Zhou, Xiaofei, Meng Wang, and Shili Lin. "Detecting rare haplotypes associated with complex diseases using both population and family data: Combined logistic Bayesian Lasso." Statistical Methods in Medical Research 29, no. 11 (June 4, 2020): 3340–50. http://dx.doi.org/10.1177/0962280220927728.

Abstract: Haplotype-based association methods have been developed to understand the genetic architecture of complex diseases. Compared to single-variant-based methods, haplotype methods are thought to be more biologically relevant, since there are typically multiple non-independent genetic variants involved in complex diseases, and the use of haplotypes implicitly accounts for non-independence caused by linkage disequilibrium. In recent years, with the focus moving from common to rare variants, haplotype-based methods have also evolved accordingly to uncover the roles of rare haplotypes. One particular approach is regularization-based, with the use of Bayesian least absolute shrinkage and selection operator (Lasso) as an example. This type of method has been developed for either case-control population data (the logistic Bayesian Lasso, LBL) or family data (the family-triad-based logistic Bayesian Lasso, famLBL). In some situations, both family data and case-control data are available; it would be a waste of resources if only one of them could be analyzed. To make full use of the available data and increase power, we propose a unified approach that can combine both case-control and family data (the combined logistic Bayesian Lasso, cLBL). Through simulations, we characterize the performance of cLBL and show its advantage over existing methods. We further apply cLBL to the Framingham Heart Study data to demonstrate its utility in real data applications.
39. Guo, Ao Tuo. "Research on Bayesian Model Averaging for Lasso Based on Analysis of Scientific Materials." Advanced Materials Research 282-283 (July 2011): 334–37. http://dx.doi.org/10.4028/www.scientific.net/amr.282-283.334.

Abstract: The Lasso (least absolute shrinkage and selection operator) estimates a vector of regression coefficients by minimizing the residual sum of squares subject to a constraint on the ℓ1-norm of the coefficient vector, which has made it an attractive technique for regularization and variable selection. In this paper, we study Bayesian Model Averaging (BMA) for the Lasso, which accounts for uncertainty about the best model to choose by averaging over multiple models. Experimental results on simulated data show that BMA has a significant advantage over model selection based on the Bayesian information criterion (BIC).
40. Cao, Ming, Yue Fan, and Qinke Peng. "Bayesian Gene Selection Based on Pathway Information and Network-Constrained Regularization." Computational and Mathematical Methods in Medicine 2021 (August 4, 2021): 1–9. http://dx.doi.org/10.1155/2021/7471516.

Abstract: High-throughput data make it possible to study the expression levels of thousands of genes simultaneously under a particular condition, yet only a few of the genes are discriminatively expressed. Identifying these biomarkers precisely is significant for disease diagnosis, prognosis, and therapy. Many studies have used pathway information to identify biomarkers, but most incorporate only the group information while ignoring the pathway's structural information. In this paper, we propose a Bayesian gene selection method with network-constrained regularization, which incorporates the pathway structural information as a prior for gene selection. All priors are conjugate, so the parameters can be estimated effectively through Gibbs sampling. We apply our method to 6 microarray datasets, comparing it with the Bayesian Lasso, Bayesian Elastic Net, and Bayesian Fused Lasso. The results show that our method performs better than the other Bayesian methods and that pathway structural information improves the results.
41. Guo, Hongping, Zuguo Yu, Jiyuan An, Guosheng Han, Yuanlin Ma, and Runbin Tang. "A Two-Stage Mutual Information Based Bayesian Lasso Algorithm for Multi-Locus Genome-Wide Association Studies." Entropy 22, no. 3 (March 13, 2020): 329. http://dx.doi.org/10.3390/e22030329.

Abstract: Genome-wide association studies (GWAS) have become an essential technology for exploring the genetic mechanisms of complex traits. To reduce the computational burden, it is common practice to remove unrelated single nucleotide polymorphisms (SNPs) before GWAS, e.g., with the iterative sure independence screening expectation-maximization Bayesian Lasso (ISIS EM-BLASSO) method. In this work, a modified version of ISIS EM-BLASSO is proposed, which reduces the number of SNPs by a screening methodology based on Pearson correlation and mutual information, estimates the effects via EM-Bayesian Lasso (EM-BLASSO), and finally detects the true quantitative trait nucleotides (QTNs) through a likelihood ratio test. We call our method two-stage mutual-information-based Bayesian Lasso (MBLASSO). Under three simulation scenarios, MBLASSO improves statistical power and retains higher effect-estimation accuracy compared with three other algorithms. Moreover, MBLASSO performs best on model fitting, the accuracy of detected associations is the highest, and 21 genes are detected only by MBLASSO in the Arabidopsis thaliana datasets.
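The screening stage that this abstract describes, filtering SNPs by Pearson correlation and mutual information before the Bayesian Lasso step, can be sketched as follows; the thresholds, bin count, and histogram-based MI estimator are illustrative choices, not the paper's.

```python
import numpy as np

def screen_snps(X, y, corr_thresh=0.1, mi_thresh=0.3, bins=8):
    """Toy version of a correlation + mutual-information screening stage:
    keep SNP j if either |Pearson corr(X_j, y)| or a crude histogram-based
    mutual-information estimate clears its threshold.
    All tuning values here are illustrative, not from the paper.
    """
    n, p = X.shape
    kept = []
    for j in range(p):
        r = np.corrcoef(X[:, j], y)[0, 1]
        # crude MI estimate from a 2-D histogram of (SNP, trait)
        joint, _, _ = np.histogram2d(X[:, j], y, bins=bins)
        pxy = joint / n                       # joint distribution estimate
        px = pxy.sum(axis=1, keepdims=True)   # marginal of the SNP
        py = pxy.sum(axis=0, keepdims=True)   # marginal of the trait
        nz = pxy > 0                          # avoid log(0) terms
        mi = float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
        if abs(r) >= corr_thresh or mi >= mi_thresh:
            kept.append(j)
    return kept
```

Only the SNPs surviving this filter would then be passed to the (far more expensive) EM-Bayesian Lasso effect-estimation stage, which is the point of two-stage designs like this.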
42. Khadem bashiri, Zahra, Ali Shadrokh, and Masoud Yarmohammadi. "Bayesian LASSO Regression with Asymmetric Error in High Dimensional." Journal of Statistical Sciences 15, no. 1 (September 1, 2021): 81–96. http://dx.doi.org/10.52547/jss.15.1.5.

43. Yuan, J., and G. Wei. "An efficient Monte Carlo EM algorithm for Bayesian lasso." Journal of Statistical Computation and Simulation 84, no. 10 (April 8, 2013): 2166–86. http://dx.doi.org/10.1080/00949655.2013.786080.

44. Schmidt, Daniel F., and Enes Makalic. "Estimation of stationary autoregressive models with the Bayesian LASSO." Journal of Time Series Analysis 34, no. 5 (August 23, 2013): 517–31. http://dx.doi.org/10.1111/jtsa.12027.
45. Feng, Xiang-Nan, Hao-Tian Wu, and Xin-Yuan Song. "Bayesian Adaptive Lasso for Ordinal Regression With Latent Variables." Sociological Methods & Research 46, no. 4 (October 23, 2015): 926–53. http://dx.doi.org/10.1177/0049124115610349.

Abstract: We consider an ordinal regression model with latent variables to investigate the effects of observable and latent explanatory variables on ordinal responses of interest. Each latent variable is characterized by correlated observed variables through a confirmatory factor analysis model. We develop a Bayesian adaptive lasso procedure to conduct simultaneous estimation and variable selection. The empirical performance of the proposed methodology is demonstrated through simulation studies. The model is applied to a study on happiness and its potential determinants from the Inter-university Consortium for Political and Social Research.
46. Gao, Junbin, Paul W. Kwan, and Daming Shi. "Sparse kernel learning with LASSO and Bayesian inference algorithm." Neural Networks 23, no. 2 (March 2010): 257–64. http://dx.doi.org/10.1016/j.neunet.2009.07.001.

47. Hans, Chris. "Model uncertainty and variable selection in Bayesian lasso regression." Statistics and Computing 20, no. 2 (November 25, 2009): 221–29. http://dx.doi.org/10.1007/s11222-009-9160-9.

48. Cozzini, Alberto, Ajay Jasra, Giovanni Montana, and Adam Persing. "A Bayesian mixture of lasso regressions with t-errors." Computational Statistics & Data Analysis 77 (September 2014): 84–97. http://dx.doi.org/10.1016/j.csda.2014.03.018.

49. Yan, Zhengbing, Yuan Yao, Tsai-Bang Huang, and Yi-Sern Wong. "Reconstruction-Based Multivariate Process Fault Isolation Using Bayesian Lasso." Industrial & Engineering Chemistry Research 57, no. 30 (March 2, 2018): 9779–87. http://dx.doi.org/10.1021/acs.iecr.7b05189.

50. Kang, Kai, Xinyuan Song, X. Joan Hu, and Hongtu Zhu. "Bayesian adaptive group lasso with semiparametric hidden Markov models." Statistics in Medicine 38, no. 9 (November 28, 2018): 1634–50. http://dx.doi.org/10.1002/sim.8051.