Journal articles on the topic 'LASSO regression'

Consult the top 50 journal articles for your research on the topic 'LASSO regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Ranstam, J., and J. A. Cook. "LASSO regression." British Journal of Surgery 105, no. 10 (2018): 1348. http://dx.doi.org/10.1002/bjs.10895.

2

Hans, C. "Bayesian lasso regression." Biometrika 96, no. 4 (2009): 835–45. http://dx.doi.org/10.1093/biomet/asp047.

3

Ahrens, Achim, Christian B. Hansen, and Mark E. Schaffer. "lassopack: Model selection and prediction with regularized regression in Stata." Stata Journal: Promoting communications on statistics and Stata 20, no. 1 (2020): 176–235. http://dx.doi.org/10.1177/1536867x20909697.

Abstract:
In this article, we introduce lassopack, a suite of programs for regularized regression in Stata. lassopack implements lasso, square-root lasso, elastic net, ridge regression, adaptive lasso, and postestimation ordinary least squares. The methods are suitable for the high-dimensional setting, where the number of predictors p may be large and possibly greater than the number of observations, n. We offer three approaches for selecting the penalization (“tuning”) parameters: information criteria (implemented in lasso2), K-fold cross-validation, and h-step-ahead rolling cross-validation for cross-s…
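The tuning-parameter selection described above can be illustrated outside Stata as well; the sketch below assumes scikit-learn and synthetic data, and is an analogue of K-fold cross-validated penalty selection, not a port of lassopack.

```python
# Minimal sketch: choose the lasso penalty by K-fold cross-validation on
# synthetic high-dimensional data (p > n), analogous to one of the three
# tuning approaches named in the abstract above. Assumes scikit-learn.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV

# High-dimensional setting: 200 predictors, 100 observations, 10 truly informative.
X, y = make_regression(n_samples=100, n_features=200, n_informative=10,
                       noise=5.0, random_state=0)

# 5-fold cross-validation over an automatically generated penalty grid.
model = LassoCV(cv=5).fit(X, y)

print("selected penalty (alpha):", model.alpha_)
print("non-zero coefficients:", int(np.sum(model.coef_ != 0)))
```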
4

Kadhim Abbas, Haider. "Bayesian Lasso Tobit regression." Journal of Al-Qadisiyah for Computer Science and Mathematics 11, no. 2 (2019): 1–13. http://dx.doi.org/10.29304/jqcm.2019.11.2.553.

Abstract:
In the present research, we have proposed a new approach for model selection in Tobit regression. The new technique uses the Bayesian Lasso in Tobit regression (BLTR). It has many features that give optimum estimation and the variable selection property. Specifically, we introduced a new hierarchical model. Then, a new Gibbs sampler is introduced. We also extend the new approach by adding a ridge parameter inside the variance-covariance matrix to avoid singularity in the case of multicollinearity, or in case the number of predictors is greater than the number of observations. A comparison was made with…
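The Bayesian route taken above rests on a standard correspondence (not specific to this paper): with independent Laplace (double-exponential) priors on the coefficients and a Gaussian likelihood, the posterior mode is exactly the lasso estimate; hierarchical formulations and Gibbs samplers such as the one above build on this identity.

```latex
% Standard Bayesian reading of the lasso: Gaussian likelihood, Laplace priors.
\[
\hat{\beta}^{\mathrm{lasso}}
  = \arg\min_{\beta}\ \|y - X\beta\|_2^2 + \lambda\|\beta\|_1
  = \arg\max_{\beta}\ p(y \mid X,\beta,\sigma^2)
      \prod_{j=1}^{p} \frac{\tilde{\lambda}}{2}\, e^{-\tilde{\lambda}|\beta_j|},
\qquad \tilde{\lambda} = \frac{\lambda}{2\sigma^2}.
\]
```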
5

Xu, Huan, Constantine Caramanis, and Shie Mannor. "Robust Regression and Lasso." IEEE Transactions on Information Theory 56, no. 7 (2010): 3561–74. http://dx.doi.org/10.1109/tit.2010.2048503.

6

Neykov, Matey. "Isotonic regression meets LASSO." Electronic Journal of Statistics 13, no. 1 (2019): 710–46. http://dx.doi.org/10.1214/19-ejs1537.

7

Hafsa, Fathima, Juveria Soha, Fathima Rida, and Hifsa Naaz Syeda. "An Analysis of Car Price Prediction Using Machine Learning." Research and Reviews: Advancement in Cyber Security 2, no. 2 (2025): 33–40. https://doi.org/10.5281/zenodo.15308198.

Abstract:
Car price prediction is a critical task in the automotive industry, enabling buyers, sellers, and financial institutions to make informed and objective decisions. This research focuses on applying machine learning techniques, specifically Linear Regression and Lasso Regression, to predict used car prices based on multiple factors including fuel type, transmission, seller type, vehicle age, and kilometers driven. The dataset was carefully preprocessed to handle missing values and encode categorical variables, ensuring the data was suitable for model training. Both models were evaluated using…
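The workflow the abstract describes (encode categorical features, then fit and compare Linear Regression and Lasso Regression) can be sketched as follows; the file name, column names, and penalty value are hypothetical placeholders, not the authors' data or code.

```python
# Hypothetical sketch of the workflow described above: one-hot-encode the
# categorical features, then fit and compare Linear Regression and Lasso.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

df = pd.read_csv("used_cars.csv")                            # hypothetical file
categorical = ["fuel_type", "transmission", "seller_type"]   # assumed columns
numeric = ["vehicle_age", "km_driven"]                       # assumed columns

X = df[categorical + numeric]
y = df["selling_price"]                                      # assumed target
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One-hot encode the categoricals; pass numeric columns through unchanged.
encode = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), categorical)],
    remainder="passthrough")

for name, reg in [("linear", LinearRegression()), ("lasso", Lasso(alpha=1.0))]:
    model = make_pipeline(encode, reg).fit(X_train, y_train)
    print(name, "test R^2:", round(model.score(X_test, y_test), 3))
```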
8

Alhamzawi, Rahim, and Keming Yu. "Bayesian Lasso-mixed quantile regression." Journal of Statistical Computation and Simulation 84, no. 4 (2012): 868–80. http://dx.doi.org/10.1080/00949655.2012.731689.

9

Alhamzawi, Ahmed. "Tobit regression with Lasso penalty." Journal of Physics: Conference Series 1664 (November 2020): 012046. http://dx.doi.org/10.1088/1742-6596/1664/1/012046.

10

Alhamzawi, Rahim, and Haithem Taha Mohammad Ali. "The Bayesian adaptive lasso regression." Mathematical Biosciences 303 (September 2018): 75–82. http://dx.doi.org/10.1016/j.mbs.2018.06.004.

11

Lee, Seokho, and Seonhwa Kim. "Marginalized lasso in sparse regression." Journal of the Korean Statistical Society 48, no. 3 (2019): 396–411. http://dx.doi.org/10.1016/j.jkss.2018.12.004.

12

Alhamzawi, Rahim, Keming Yu, and Dries F. Benoit. "Bayesian adaptive Lasso quantile regression." Statistical Modelling: An International Journal 12, no. 3 (2012): 279–97. http://dx.doi.org/10.1177/1471082x1101200304.

13

Vidaurre, Diego, Concha Bielza, and Pedro Larrañaga. "Lazy lasso for local regression." Computational Statistics 27, no. 3 (2011): 531–50. http://dx.doi.org/10.1007/s00180-011-0274-0.

14

Benoit, Dries F., Rahim Alhamzawi, and Keming Yu. "Bayesian lasso binary quantile regression." Computational Statistics 28, no. 6 (2013): 2861–73. http://dx.doi.org/10.1007/s00180-013-0439-0.

15

Kawano, Shuichi, Ibuki Hoshina, Kaito Shimamura, and Sadanori Konishi. "Predictive Model Selection Criteria for Bayesian Lasso Regression." Journal of the Japanese Society of Computational Statistics 28, no. 1 (2015): 67–82. http://dx.doi.org/10.5183/jjscs.1501001_220.

16

Christy Sujatha, D., and J. Gnana Jayanthi. "LASH Tree: LASSO Regression Hoeffding for Streaming Data." International Journal of Psychosocial Rehabilitation 24, no. 04 (2020): 3022–33. http://dx.doi.org/10.37200/ijpr/v24i4/pr201415.

17

Mohammed, Bahr, Saif Raheem, and Tahir Dikheel. "Sliced Inverse Regression (SIR) via group lasso." Journal of Al-Rafidain University College For Sciences (Print ISSN: 1681-6870, Online ISSN: 2790-2293), no. 1 (October 1, 2021): 505–12. http://dx.doi.org/10.55562/jrucs.v46i1.101.

Abstract:
In this paper the authors propose a group lasso for sliced inverse regression (group lasso-SIR). The proposed method can deal with the problem of correlation between predictor variables. Simulation is used to investigate the performance of the proposed method compared with ridge and lasso in sliced inverse regression (lasso-SIR). The results show that the group lasso-SIR method performs well compared with the other methods based on the Mean Squared Error (MSE) criterion.
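For context, the group lasso referred to above replaces the lasso's ℓ1 penalty with a sum of unsquared ℓ2 norms over pre-specified groups of coefficients, so that whole groups are selected or dropped together (standard formulation, stated here only as a reminder):

```latex
% Group lasso objective over G pre-specified coefficient groups.
\[
\min_{\beta}\ \|y - X\beta\|_2^2
  + \lambda \sum_{g=1}^{G} \sqrt{p_g}\, \|\beta_{(g)}\|_2 ,
\qquad p_g = \text{number of coefficients in group } g .
\]
```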
18

Xin, Seng Jia, and Kamil Khalid. "Modelling House Price Using Ridge Regression and Lasso Regression." International Journal of Engineering & Technology 7, no. 4.30 (2018): 498. http://dx.doi.org/10.14419/ijet.v7i4.30.22378.

Abstract:
House price prediction is important for the government, finance companies, the real estate sector and also the house owner. House price data for Ames, Iowa, in the United States, covering the years 2006 to 2010, are used for multivariate analysis. However, multicollinearity commonly occurs in multivariate analysis and has a serious effect on the model. Therefore, this study investigates the performance of the Ridge regression model and the Lasso regression model, as both regressions can deal with multicollinearity. The Ridge regression model and Lasso regression model are constructed and comp…
19

Sinanta P.W.J., Moses. "PREDIKSI HARGA MOBIL MENGGUNAKAN LINEAR REGRESSION, RIDGE REGRESSION DAN LASSO REGRESSION." Jurnal Review Pendidikan dan Pengajaran 8, no. 1 (2025): 3066–72. https://doi.org/10.31004/jrpp.v8i1.43145.

Abstract:
Car price prediction is an important aspect of the automotive industry, helping manufacturers, dealers, and consumers determine the market value of a vehicle objectively. Various factors such as brand, production year, fuel type, and others can influence a car's price, so an accurate predictive model is needed. This study aims to analyse the performance of Linear Regression, Ridge Regression, and Lasso Regression in predicting car prices. The data used were processed with One-Hot Encoding and a Standard Scaler, then split into training and testing data with a ra…
20

Daniel, Adashu Jacob, Musa Dahiru Ibrahim, and Anule Aondulum Josaphat. "Modeling Stock Data Using Multiple Linear Regression and LASSO Regression Analysis." Mikailalsys Journal of Mathematics and Statistics 3, no. 2 (2025): 490–99. https://doi.org/10.58578/mjms.v3i2.5927.

Abstract:
This study evaluates and compares the model fitting and predictive performance of Multiple Linear Regression (MLR) and Least Absolute Shrinkage and Selection Operator (LASSO) regression in the context of stock price prediction for four leading Nigerian companies. A dataset comprising 1,300 observations from 2019 to 2025 was obtained from Yahoo Finance and Investing.com. Multicollinearity assessment using the Variance Inflation Factor (VIF) revealed substantial collinearity among certain predictors, particularly for the variables "Open" (Honeywell: 55.45; Zenith: 920.30) and "Low" (Oando: 621.8…
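The Variance Inflation Factor cited above is the standard collinearity diagnostic: for predictor j it is computed from the R² obtained by regressing that predictor on all the others, so values such as 55.45 or 920.30 correspond to an R² very close to one.

```latex
% Variance Inflation Factor for the j-th predictor.
\[
\mathrm{VIF}_j = \frac{1}{1 - R_j^2},
\qquad R_j^2 = \text{the } R^2 \text{ from regressing } x_j \text{ on the remaining predictors}.
\]
```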
21

Mastourah, Abdulsatar Ahmeedah, A.Sh Abdalla Abdelbaset, and M. Mami Ahmed. "On using the penalized regression estimators to solve the multicollinearity problem." World Journal of Advanced Research and Reviews 24, no. 1 (2024): 2654–64. https://doi.org/10.5281/zenodo.15064625.

Abstract:
The paper compares coefficient parameter estimation efficiency using penalized regression approaches. Five estimators are employed: Ridge regression, LASSO regression, Elastic Net (ENET) regression, Adaptive Lasso (ALASSO) regression, and Adaptive Elastic Net (AENET) regression methods. The study uses a multiple linear regression model to address multicollinearity issues. The comparison is based on average mean square errors (MSE) using simulated data with varying sizes, numbers of independent variables, and correlation coefficients. The results are expected to be useful and will be applied to…
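For reference, the penalized estimators compared above differ only in the penalty added to the least-squares criterion (standard formulations; the adaptive variants reweight the ℓ1 penalty using an initial estimate, and the adaptive elastic net combines that reweighted term with a ridge term):

```latex
% Standard penalized least-squares objectives.
\begin{align*}
\text{Ridge:}          &\quad \min_{\beta}\ \|y - X\beta\|_2^2 + \lambda\|\beta\|_2^2 \\
\text{LASSO:}          &\quad \min_{\beta}\ \|y - X\beta\|_2^2 + \lambda\|\beta\|_1 \\
\text{Elastic Net:}    &\quad \min_{\beta}\ \|y - X\beta\|_2^2 + \lambda_1\|\beta\|_1 + \lambda_2\|\beta\|_2^2 \\
\text{Adaptive LASSO:} &\quad \min_{\beta}\ \|y - X\beta\|_2^2
  + \lambda \sum_{j=1}^{p} \frac{|\beta_j|}{|\tilde{\beta}_j|^{\gamma}}
\end{align*}
```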
22

ZHANG, Lijin, Xiayan WEI, Jiaqi LU, and Junhao PAN. "Lasso regression: From explanation to prediction." Advances in Psychological Science 28, no. 10 (2020): 1777. http://dx.doi.org/10.3724/sp.j.1042.2020.01777.

23

Alhamzawi, Rahim, and Haithem Taha Mohammad Ali. "Bayesian Iterative Adaptive Lasso Quantile Regression." IOSR Journal of Mathematics 13, no. 03 (2017): 38–42. http://dx.doi.org/10.9790/5728-1303033842.

24

Meier, Lukas, Sara Van De Geer, and Peter Bühlmann. "The group lasso for logistic regression." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 70, no. 1 (2008): 53–71. http://dx.doi.org/10.1111/j.1467-9868.2007.00627.x.

25

Kaul, Abhishek. "Lasso with long memory regression errors." Journal of Statistical Planning and Inference 153 (October 2014): 11–26. http://dx.doi.org/10.1016/j.jspi.2014.05.003.

26

Lin, Qian, Zhigen Zhao, and Jun S. Liu. "Sparse Sliced Inverse Regression via Lasso." Journal of the American Statistical Association 114, no. 528 (2019): 1726–39. http://dx.doi.org/10.1080/01621459.2018.1520115.

27

Xu, Mengyu. "Sales Prediction Based on Lasso Regression." Highlights in Science, Engineering and Technology 88 (March 29, 2024): 343–49. http://dx.doi.org/10.54097/p9hyrk70.

Abstract:
Sales prediction is a critical aspect for businesses across diverse fields, providing them with the means to operate efficiently and achieve success. It constitutes an integral component of the decision-making and planning processes within a business. Several forecasting models are available for sales prediction, with most machine learning models performing exceptionally well. However, the suitability of these models can vary depending on the dataset provided. While many datasets are not overly complex and contain a limited number of variables, others are more intricate, featuring numerous var…
28

Lei, Hejie, Xingke Chen, and Ling Jian. "Canal-LASSO: A sparse noise-resilient online linear regression model." Intelligent Data Analysis 24, no. 5 (2020): 993–1010. http://dx.doi.org/10.3233/ida-194672.

Abstract:
Least absolute shrinkage and selection operator (LASSO) is one of the most commonly used methods for shrinkage estimation and variable selection. Robust variable selection methods via penalized regression, such as least absolute deviation LASSO (LAD-LASSO), have gained growing attention in the literature. However, those penalized regression procedures are still sensitive to noisy data. Furthermore, “concept drift” makes learning from streaming data fundamentally different from traditional batch learning. Focusing on the shrinkage estimation and variable selection tasks on noisy s…
29

Zhou, Di Jody, Rajpreet Chahal, Ian H. Gotlib, and Siwei Liu. "Comparison of lasso and stepwise regression in psychological data." Methodology 20, no. 2 (2024): 121–43. http://dx.doi.org/10.5964/meth.11523.

Abstract:
Identifying significant predictors of behavioral outcomes is of great interest in many psychological studies. Lasso regression, as an alternative to stepwise regression for variable selection, has started gaining traction among psychologists. Yet, further investigation is valuable to fully understand its performance across various psychological data conditions. Using a Monte Carlo simulation and an empirical demonstration, we compared Lasso regression to stepwise regression in typical psychological datasets varying in sample size, predictor size, sparsity, and signal-to-noise ratio. We found t…
30

Suruddin, Adzkar Adlu Hasyr, Erfiani Erfiani, and I. Made Sumertajaya. "The Continuum Regression Analysis with Preprocessed Variable Selection LASSO and SIR-LASSO." Inferensi 8, no. 1 (2025): 45. https://doi.org/10.12962/j27213862.v8i1.21658.

31

Freire, W. P. "The NFDA-Nonsmooth Feasible Directions Algorithm applied to construction of Pareto Fronts of Ridge and Lasso Regressions." Trends in Computational and Applied Mathematics 25, no. 1 (2024): e01767. https://doi.org/10.5540/tcam.2024.025.e01767.

Abstract:
Ridge and Lasso regressions are types of linear regression, a machine learning tool for dealing with data. Based on multiobjective optimization theory, we transform Ridge and Lasso regression into bi-objective optimization problems. The Pareto fronts of the resulting problems provide a range of regression models from which the best one can be selected. We employ the NFDA-Nonsmooth Feasible Directions Algorithm devised for solving convex optimization problems to construct the Pareto fronts of Ridge and Lasso when regarded as bi-objective problems.
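The bi-objective reformulation described above amounts to minimising the fit term and the penalty term jointly as a vector-valued objective; for the convex Lasso case, the Pareto front of this problem corresponds to the usual regularisation path, since each properly Pareto-optimal point solves a scalarised problem for some λ ≥ 0 (standard multiobjective notation, not taken from the paper):

```latex
% Lasso viewed as a bi-objective problem: trade fit against the l1 penalty.
\[
\min_{\beta \in \mathbb{R}^p}\ \bigl( \|y - X\beta\|_2^2 ,\ \|\beta\|_1 \bigr),
\qquad \text{scalarisation: } \min_{\beta}\ \|y - X\beta\|_2^2 + \lambda\|\beta\|_1 ,\ \ \lambda \ge 0 .
\]
```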
32

Alshqaq, Shokrya Saleh, and Ali H. Abuzaid. "An Efficient Method for Variable Selection Based on Diagnostic-Lasso Regression." Symmetry 15, no. 12 (2023): 2155. http://dx.doi.org/10.3390/sym15122155.

Abstract:
In contemporary statistical methods, robust regression shrinkage and variable selection have gained paramount significance due to the prevalence of datasets characterized by contamination and an abundance of variables, often categorized as ‘high-dimensional data’. The Least Absolute Shrinkage and Selection Operator (Lasso) is frequently employed in this context for both model estimation and variable selection. However, no one has attempted to apply regression diagnostic measures to Lasso regression, despite its power and widespread practical use. This work introduces a combined Lasso and diagnostic…
33

Shin, Juh, Eun Jeong Min, Sun Ok Jung, and Jung Eun Kim. "Utilization of LASSO and Poisson Regression Dealing with Count Variables in Nursing Home Research." Innovation in Aging 7, Supplement 1 (2023): 906–7. http://dx.doi.org/10.1093/geroni/igad104.2916.

Abstract:
Regression has been used for decades; thus, more empirical studies are necessary to choose appropriate regression methods, as traditional regression has many drawbacks and is limited in analysis and synthesis with large numbers of covariates. This study aims to investigate factors related to pressure ulcers in nursing homes using three regressions: least absolute shrinkage and selection operator (LASSO) Poisson, LASSO linear, and LASSO logistic regression. We represent how each model’s estimating, inferencing, and performance measures work. The datasets accrued from 51 nursing homes t…
34

Pardede, Jasman. "Prediksi Jumlah Target dan Realisasi Wajib Pajak Atas PBB – P2 Menggunakan Multi Regression, Regression Lasso, dan PCR." Jurnal Edukasi dan Penelitian Informatika (JEPIN) 10, no. 1 (2024): 1. http://dx.doi.org/10.26418/jp.v10i1.68890.

Abstract:
Changes in the amount of the Land and Building Tax (Pajak Bumi dan Bangunan) affect several sectors as well as the public in the city of Bandung, because a fairly significant change in the amount of this tax influences public awareness of and concern for paying taxes. Machine learning has been used in several ways in determining this tax, one of which is predicting the Target and Realization amounts of the Land and Building Tax; accordingly, a study was carried out comparing the Multi Regression, Lasso Regression, and PCR (Principal Component R…
35

Alhseeni, Ameer Musa Imran, and Ali Abdulmohsin Abdulraeem Al-rubaye. "New penalized Bayesian adaptive lasso binary regression." Periodicals of Engineering and Natural Sciences (PEN) 9, no. 1 (2021): 285. http://dx.doi.org/10.21533/pen.v9i1.1800.

36

Shimamura, Kaito, Shuichi Kawano, and Sadanori Konishi. "Bayesian Lasso Regression Modeling via Model Averaging." Japanese Journal of Applied Statistics 44, no. 3 (2015): 101–17. http://dx.doi.org/10.5023/jappstat.44.101.

37

Bang, Sungwan, Hyungjun Cho, and Myoungshic Jhun. "Adaptive lasso penalised censored composite quantile regression." International Journal of Data Mining and Bioinformatics 15, no. 1 (2016): 22. http://dx.doi.org/10.1504/ijdmb.2016.076015.

38

Mei, Ziwei, and Zhentao Shi. "On LASSO for high dimensional predictive regression." Journal of Econometrics 242, no. 2 (2024): 105809. http://dx.doi.org/10.1016/j.jeconom.2024.105809.

39

Ciuperca, Gabriela. "Adaptive fused LASSO in grouped quantile regression." Journal of Statistical Theory and Practice 11, no. 1 (2016): 107–25. http://dx.doi.org/10.1080/15598608.2016.1258601.

40

Zhao, Yihong, R. Todd Ogden, and Philip T. Reiss. "Wavelet-Based LASSO in Functional Linear Regression." Journal of Computational and Graphical Statistics 21, no. 3 (2012): 600–617. http://dx.doi.org/10.1080/10618600.2012.679241.

41

Rajaratnam, Bala, Steven Roberts, Doug Sparks, and Honglin Yu. "Influence Diagnostics for High-Dimensional Lasso Regression." Journal of Computational and Graphical Statistics 28, no. 4 (2019): 877–90. http://dx.doi.org/10.1080/10618600.2019.1598869.

42

Chang, Le, Steven Roberts, and Alan Welsh. "Robust Lasso Regression Using Tukey's Biweight Criterion." Technometrics 60, no. 1 (2017): 36–47. http://dx.doi.org/10.1080/00401706.2017.1305299.

43

Giurcanu, Mihai, and Brett Presnell. "Bootstrapping LASSO-type estimators in regression models." Journal of Statistical Planning and Inference 199 (March 2019): 114–25. http://dx.doi.org/10.1016/j.jspi.2018.05.007.

44

Kim, Choongrak, Jungsu Lee, Hojin Yang, and Whasoo Bae. "Case influence diagnostics in the lasso regression." Journal of the Korean Statistical Society 44, no. 2 (2015): 271–79. http://dx.doi.org/10.1016/j.jkss.2014.09.003.

45

Hashem, Hussein, Veronica Vinciotti, Rahim Alhamzawi, and Keming Yu. "Quantile regression with group lasso for classification." Advances in Data Analysis and Classification 10, no. 3 (2015): 375–90. http://dx.doi.org/10.1007/s11634-015-0206-x.

46

Kim, Hyojoong, and Heeyoung Kim. "Functional logistic regression with fused lasso penalty." Journal of Statistical Computation and Simulation 88, no. 15 (2018): 2982–99. http://dx.doi.org/10.1080/00949655.2018.1491975.

47

Wu, Tong Tong, and Kenneth Lange. "Coordinate descent algorithms for lasso penalized regression." Annals of Applied Statistics 2, no. 1 (2008): 224–44. http://dx.doi.org/10.1214/07-aoas147.

48

Hastie, Trevor, Jonathan Taylor, Robert Tibshirani, and Guenther Walther. "Forward stagewise regression and the monotone lasso." Electronic Journal of Statistics 1 (2007): 1–29. http://dx.doi.org/10.1214/07-ejs004.

49

Guo, Jianhua, Jianchang Hu, Bing-Yi Jing, and Zhen Zhang. "Spline-Lasso in High-Dimensional Linear Regression." Journal of the American Statistical Association 111, no. 513 (2016): 288–97. http://dx.doi.org/10.1080/01621459.2015.1005839.

50

Tibshirani, Robert. "Regression Shrinkage and Selection Via the Lasso." Journal of the Royal Statistical Society: Series B (Methodological) 58, no. 1 (1996): 267–88. http://dx.doi.org/10.1111/j.2517-6161.1996.tb02080.x.
