
Journal articles on the topic 'Multicollinearity'



Consult the top 50 journal articles for your research on the topic 'Multicollinearity.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Tsagris, Michail, and Nikolaos Pandis. "Multicollinearity." American Journal of Orthodontics and Dentofacial Orthopedics 159, no. 5 (May 2021): 695–96. http://dx.doi.org/10.1016/j.ajodo.2021.02.005.

2

Alin, Aylin. "Multicollinearity." Wiley Interdisciplinary Reviews: Computational Statistics 2, no. 3 (March 8, 2010): 370–74. http://dx.doi.org/10.1002/wics.84.

3

Ohyver, Margaretha, and Herena Pudjihastuti. "Pemodelan Tingkat Penghunian Kamar Hotel di Kendari dengan Transformasi Wavelet Kontinu dan Partial Least Squares." ComTech: Computer, Mathematics and Engineering Applications 5, no. 2 (December 1, 2014): 1178. http://dx.doi.org/10.21512/comtech.v5i2.2435.

Abstract:
Multicollinearity and outliers are common problems when estimating a regression model. Multicollinearity occurs when there are high correlations among predictor variables, making it difficult to separate the effect of each independent variable on the response variable. If outliers are present in the data to be analyzed, the normality assumption of the regression is violated and the results of the analysis may be incorrect or misleading. Both of these problems occurred in the data on the room occupancy rate of hotels in Kendari. The purpose of this study is to find a model for the data that is free of multicollinearity and outliers, and to determine the factors that affect the room occupancy rate of hotels in Kendari. The methods used are the continuous wavelet transform and partial least squares. The result of this research is a regression model that is free of multicollinearity and a data pattern in which the presence of outliers is resolved.
4

Higgins, J., and J. Gruber. "Multicollinearity and Biased Estimation." Statistician 35, no. 3 (1986): 401. http://dx.doi.org/10.2307/2987767.

5

Jurczyk, Tomáš. "Outlier detection under multicollinearity." Journal of Statistical Computation and Simulation 82, no. 2 (February 2012): 261–78. http://dx.doi.org/10.1080/00949655.2011.638634.

6

Kelava, Augustin, Helfried Moosbrugger, Polina Dimitruk, and Karin Schermelleh-Engel. "Multicollinearity and Missing Constraints." Methodology 4, no. 2 (January 2008): 51–66. http://dx.doi.org/10.1027/1614-2241.4.2.51.

Abstract:
Multicollinearity complicates the simultaneous estimation of interaction and quadratic effects in structural equation modeling (SEM). So far, approaches developed within the Kenny-Judd (1984) tradition have failed to specify additional and necessary constraints on the measurement error covariances of the nonlinear indicators. Given that the constraints comprise, in part, latent linear predictor correlations, multicollinearity poses a problem for such approaches. Klein and Moosbrugger's (2000) latent moderated structural equations (LMS) approach does not utilize nonlinear indicators and should therefore not be affected by this problem. In a simulation study, we varied predictor correlation and the number of nonlinear effects in order to compare the performance of three approaches developed for the simultaneous estimation of nonlinear effects: Ping's (1996) two-step approach, a correctly extended Jöreskog-Yang (1996) approach, and LMS. Results show that, in contrast to the Jöreskog-Yang approach and LMS, the two-step approach produces biased parameter estimates and false inferences under heightened multicollinearity: Ping's approach overestimated interaction effects and underestimated quadratic effects.
7

Öztürk, Fikri, and Fikri Akdeniz. "Ill-conditioning and multicollinearity." Linear Algebra and its Applications 321, no. 1-3 (December 2000): 295–305. http://dx.doi.org/10.1016/s0024-3795(00)00147-6.

8

Winship, Christopher, and Bruce Western. "Multicollinearity and Model Misspecification." Sociological Science 3 (2016): 627–49. http://dx.doi.org/10.15195/v3.a27.

9

Daoud, Jamal I. "Multicollinearity and Regression Analysis." Journal of Physics: Conference Series 949 (December 2017): 012009. http://dx.doi.org/10.1088/1742-6596/949/1/012009.

10

GILBERT, C. L. "THE DIAGNOSIS OF MULTICOLLINEARITY." Oxford Bulletin of Economics and Statistics 40, no. 2 (May 1, 2009): 87–91. http://dx.doi.org/10.1111/j.1468-0084.1978.mp40002001.x.

11

Markovitz, Barry P. "The Principle of Multicollinearity." Pediatric Critical Care Medicine 6, no. 1 (January 2005): 94. http://dx.doi.org/10.1097/01.pcc.0000149232.77830.e0.

12

Sari, Bruno Giacomini, Alessandro Dal’Col Lúcio, Tiago Olivoto, Dionatan Ketzer Krysczun, André Luís Tischler, and Lucas Drebes. "Interference of sample size on multicollinearity diagnosis in path analysis." Pesquisa Agropecuária Brasileira 53, no. 6 (June 2018): 769–73. http://dx.doi.org/10.1590/s0100-204x2018000600014.

Abstract:
The objective of this work was to evaluate the interference of sample size on multicollinearity diagnosis in path analysis. From the analyses of productive traits of cherry tomato, two Pearson correlation matrices were obtained, one with severe multicollinearity and the other with weak multicollinearity. Sixty-six sample sizes were designed, and from the amplitude of the bootstrap confidence intervals, it was observed that sample size interfered with multicollinearity diagnosis. When the sample size was small, the imprecision of the diagnostic criteria estimates interfered with multicollinearity diagnosis in the matrix with weak multicollinearity.
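
The bootstrap-interval idea in this abstract can be sketched generically. The following is an illustrative toy example (synthetic bivariate data, with the condition number of the correlation matrix as the diagnostic; nothing here is from the cherry-tomato study):

```python
import numpy as np

rng = np.random.default_rng(1)

def boot_condition_numbers(X, n_boot=500):
    """95% bootstrap percentile interval for the condition number of the
    predictor correlation matrix (one common multicollinearity diagnostic)."""
    n = len(X)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resample rows with replacement
        R = np.corrcoef(X[idx], rowvar=False)
        stats.append(np.linalg.cond(R))
    return np.percentile(stats, [2.5, 97.5])

# Weakly collinear pair of predictors (population correlation 0.3).
X_full = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=400)
for n in (30, 400):
    lo, hi = boot_condition_numbers(X_full[:n])
    print(f"n={n}: interval width {hi - lo:.2f}")
```

With the small sample the interval is much wider, mirroring the paper's finding that imprecise diagnostic estimates at small n blur the diagnosis when multicollinearity is weak.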
13

Bayman, Emine Ozgur, and Franklin Dexter. "Multicollinearity in Logistic Regression Models." Anesthesia & Analgesia 133, no. 2 (July 14, 2021): 362–65. http://dx.doi.org/10.1213/ane.0000000000005593.

14

Kim, Jong Hae. "Multicollinearity and misleading statistical results." Korean Journal of Anesthesiology 72, no. 6 (December 1, 2019): 558–69. http://dx.doi.org/10.4097/kja.19087.

15

Schroeder, Mary Ann, Janice Lander, and Stacey Levine-Silverman. "Diagnosing and Dealing with Multicollinearity." Western Journal of Nursing Research 12, no. 2 (April 1990): 175–87. http://dx.doi.org/10.1177/019394599001200204.

16

Erfle, Stephen. "A geometric approach to multicollinearity." Journal of Economic Education 50, no. 2 (March 21, 2019): 213. http://dx.doi.org/10.1080/00220485.2019.1582385.

17

Stratmann, William C., Thomas R. Zastowny, Leonard R. Bayer, Edgar H. Adams, Gordon S. Black, and Polly A. Fry. "Patient satisfaction surveys and multicollinearity." Quality Management in Health Care 2, no. 2 (1994): 1–12. http://dx.doi.org/10.1097/00019514-199402020-00005.

18

Stratmann, William C., Thomas R. Zastowny, Leonard R. Bayer, Edgar H. Adams, Gordon S. Black, and Polly A. Fry. "Patient satisfaction surveys and multicollinearity." Quality Management in Health Care 2, no. 2 (1994): 1–12. http://dx.doi.org/10.1097/00019514-199424000-00005.

19

Shrestha, Noora. "Detecting Multicollinearity in Regression Analysis." American Journal of Applied Mathematics and Statistics 8, no. 2 (June 15, 2020): 39–42. http://dx.doi.org/10.12691/ajams-8-2-1.

20

Willis, Cleve E., and Bruce E. Lindsay. "LAND VALUE RESEARCH AND MULTICOLLINEARITY." Canadian Journal of Agricultural Economics/Revue canadienne d'agroeconomie 23, no. 1 (November 13, 2008): 75–78. http://dx.doi.org/10.1111/j.1744-7976.1975.tb00945.x.

21

Reed, D., D. McGee, K. Yano, and J. Hankin. "Diet, blood pressure, and multicollinearity." Hypertension 7, no. 3_pt_1 (May 1985): 405–10. http://dx.doi.org/10.1161/01.hyp.7.3.405.

22

Reed, D., D. McGee, K. Yano, and J. Hankin. "Diet, blood pressure, and multicollinearity." Hypertension 7, no. 3_Pt_1 (May 1, 1985): 405–10. http://dx.doi.org/10.1161/01.hyp.7.3_pt_1.405.

23

Yu, Han, Shanhe Jiang, and Kenneth C. Land. "Multicollinearity in hierarchical linear models." Social Science Research 53 (September 2015): 118–36. http://dx.doi.org/10.1016/j.ssresearch.2015.04.008.

24

Lauridsen, Jørgen, and Jesús Mur. "Multicollinearity in cross-sectional regressions." Journal of Geographical Systems 8, no. 4 (September 5, 2006): 317–33. http://dx.doi.org/10.1007/s10109-006-0031-z.

25

Srinivasan, Vijay, and Vinay Nadkarni. "Reply: The Principle of Multicollinearity." Pediatric Critical Care Medicine 6, no. 1 (January 2005): 94–95. http://dx.doi.org/10.1097/01.pcc.0000149315.11984.f4.

26

Chandrasekhar, C. K., H. Bagyalakshmi, M. R. Srinivasan, and M. Gallo. "Partial ridge regression under multicollinearity." Journal of Applied Statistics 43, no. 13 (May 8, 2016): 2462–73. http://dx.doi.org/10.1080/02664763.2016.1181726.

27

Chang, Yu-Ching, and Christina Mastrangelo. "Addressing multicollinearity in semiconductor manufacturing." Quality and Reliability Engineering International 27, no. 6 (January 14, 2011): 843–54. http://dx.doi.org/10.1002/qre.1173.

28

Sallas, M., and L. Thalassinos. "A Test for Detecting Multicollinearity." Indian Economic Journal 38, no. 4 (June 1991): 109–11. http://dx.doi.org/10.1177/0019466219910408.

29

Ayinde, Kayode, Olusegun O. Alabi, and Ugochinyere Ihuoma Nwosu. "Solving Multicollinearity Problem in Linear Regression Model: The Review Suggests New Idea of Partitioning and Extraction of the Explanatory Variables." Journal of Mathematics and Statistics Studies 2, no. 1 (February 18, 2021): 12–20. http://dx.doi.org/10.32996/jmss.2021.2.1.2.

Abstract:
Multicollinearity has remained a major problem in regression analysis and should be sustainably addressed. The problems associated with multicollinearity are worse when it occurs at a high level among regressors. This review revealed that studies on the subject have focused on developing estimators regardless of the effect of differences in the levels of multicollinearity among regressors. Studies have considered single-estimator and combined-estimator approaches without a sustainable solution to multicollinearity problems. The possible influence of partitioning the regressors according to multicollinearity levels and extracting from each group to develop estimators for the parameters of a linear regression model under multicollinearity is a new econometric idea and therefore requires attention. The results of new studies should be compared with existing methods, namely the principal components estimator, the partial least squares estimator, the ridge regression estimator, and the ordinary least squares estimator, using a wide range of criteria and ranking their performance at each level of the multicollinearity parameter and sample size. Based on a recent clue in the literature, it is possible to develop an innovative estimator that sustainably solves the problem of multicollinearity through partitioning and extraction of the explanatory variables, and to identify situations where the innovative estimator produces the most efficient estimates of the model parameters. The new estimator should be applied to real data and popularized for use.
30

Kusuma, Fita Mega, and Arief Wibowo. "Principal Component Analysis (PCA) untuk Mengatasi Multikolinieritas terhadap Faktor Angka Kejadian Pneumonia Balita di Jawa Timur Tahun 2014." Jurnal Biometrika dan Kependudukan 6, no. 2 (October 30, 2018): 89. http://dx.doi.org/10.20473/jbk.v6i2.2017.89-97.

Abstract:
Correlation between the independent variables in a multiple linear regression model is called multicollinearity. One of the assumptions of multiple linear regression is freedom from the multicollinearity problem. The Principal Component Analysis (PCA) method in this study aims to overcome multicollinearity in multiple linear regression and to identify the dominant factors in the research. The PCA method has the advantage of removing the correlation without losing the original variables. The case study concerns risk factors that affect the incidence of pneumonia among infants in East Java in 2014. This is non-reactive research because it uses published data from the East Java health profile. Multicollinearity was detected in the research data by the VIF/tolerance method. Vitamin A coverage, measles immunization coverage, and health service coverage were the variables in which multicollinearity was observed. The multicollinearity solution produced a new variable (F1), combining coverage of vitamin A, measles immunization, and health services, reducing the three collinear variables to a single variable free of multicollinearity, with a VIF value of 1.608 < 10. The results of this study also demonstrate the weakness of the PCA method in analyzing significance. The PCA method shows the factors that most influence the incidence of pneumonia in children under five years of age. The dominant factors in this research were coverage of infant health services, coverage of vitamin A, and coverage of measles immunization. The health service coverage factor has a coefficient of 0.890, or 89%, more influential than the other factors.
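
The VIF-then-PCA workflow described in this abstract can be sketched in a few lines. This is a hypothetical illustration on synthetic data; the variable names merely echo the abstract and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
vit_a = rng.normal(size=n)                        # illustrative names only,
measles = vit_a + rng.normal(scale=0.1, size=n)   # echoing the abstract;
services = vit_a + rng.normal(scale=0.1, size=n)  # not the study's data
X = np.column_stack([vit_a, measles, services])

def vif(X):
    """VIF_j = 1 / (1 - R^2_j), with R^2_j from regressing column j on the rest."""
    out = []
    for j in range(X.shape[1]):
        A = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ beta
        r2 = 1 - resid.var() / X[:, j].var()
        out.append(1 / (1 - r2))
    return np.array(out)

print(vif(X))    # all far above the usual cutoff of 10

# Replace the collinear trio with its first principal component ("F1").
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
f1 = Xc @ Vt[0]  # single composite regressor in place of three collinear ones
```

The composite f1 can then enter the regression in place of the three original variables, which is the substitution the abstract describes.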
31

Ismaeel, Shelan Saied, Habshah Midi, and Muhammed Sani. "Robust Multicollinearity Diagnostic Measure For Fixed Effect Panel Data Model." Malaysian Journal of Fundamental and Applied Sciences 17, no. 5 (October 30, 2021): 636–46. http://dx.doi.org/10.11113/mjfas.v17n5.2391.

Abstract:
It is now evident that high leverage points (HLPs) can induce a multicollinearity pattern in data for the fixed effect panel data model. The observations responsible for this phenomenon are called high leverage collinearity-enhancing observations (HLCEOs). The commonly used within-group ordinary least squares (WOLS) estimator for estimating the parameters of the fixed effect panel data model is easily affected by HLCEOs. In their presence, the WOLS estimates may have large variances, which would lead to erroneous interpretation. It is therefore imperative to detect the multicollinearity caused by HLCEOs. The classical variance inflation factor (CVIF) is the commonly used method for diagnosing multicollinearity in panel data. However, it does not correctly diagnose multicollinearity in the presence of HLCEOs. Hence, in this paper three new robust methods of diagnosing multicollinearity in panel data are proposed, namely RVIF (WGM-FIMGT), RVIF (WGM-DRGP), and RVIF (WMM), and their performance is compared with the CVIF. The numerical evidence shows that the CVIF incorrectly diagnosed multicollinearity, whereas the proposed methods correctly diagnosed no multicollinearity in the presence of HLCEOs, with RVIF (WGM-FIMGT) being the best method as it has the lowest computational running time.
32

Purnama Sari, Kusman Sadik, and Mulianto Raharjo. "Perbandingan Performa Metode Pohon Model Logistik dan Random Forest pada Pengklasifikasian Data." Xplore: Journal of Statistics 12, no. 1 (January 15, 2023): 36–49. http://dx.doi.org/10.29244/xplore.v12i1.858.

Abstract:
Multicollinearity and missing data are two common problems in big data. Missing data can decrease prediction accuracy. The logistic model tree (LMT) is used to handle multicollinearity because multicollinearity does not affect the decision tree. Random forest can be used to decrease variance in prediction. This study compared the two methods, LMT and random forest, under multicollinearity and missing data in various cases, using a simulation study and real data. Models were evaluated on classification accuracy and AUC. The results showed that random forest performed better when the multicollinearity level was moderate. LMT with missing data omitted was shown to perform better for big data with a high percentage of missing data and a severe level of multicollinearity. Real data with different sample sizes were then analyzed; on these, random forest performed better. Omitting missing data performed better in classifying the breast cancer data, which contain 0.3% missing data.
33

Yanke, Aldino, Nofrida Elly Zendrato, and Agus M. Soleh. "Handling Multicollinearity Problems in Indonesia's Economic Growth Regression Modeling Based on Endogenous Economic Growth Theory." Indonesian Journal of Statistics and Its Applications 6, no. 2 (August 31, 2022): 228–44. http://dx.doi.org/10.29244/ijsa.v6i2p214-230.

Abstract:
One of the applications of multiple linear regression in economics is modeling Indonesia's economic growth based on the theory of endogenous economic growth. Endogenous growth theory is a development of classical theory, which cannot explain how the economy grows in the long run. The regression model based on the theory of endogenous economic growth uses many independent variables, which causes multicollinearity problems. In this study, the multiple linear regression model using the least squares estimation method and several methods for handling the multicollinearity problem were implemented. Variable selection methods (backward, forward, and stepwise), principal component regression (PCR), partial least squares (PLS), and regularization methods (ridge, lasso, and elastic net) were applied to solve the multicollinearity problem. The variable selection methods (backward, forward, and stepwise) were not able to overcome the multicollinearity problem, whereas principal component regression, PLS regression, and the regularization methods were. We used leave-one-out cross-validation (LOOCV) to determine the best method for handling the multicollinearity problem, judged by the smallest mean squared error (MSE). Based on the MSE value, the best method for overcoming multicollinearity in the economic growth model based on endogenous growth theory was the lasso regression method.
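
A generic sketch of the LOOCV comparison this abstract describes, using synthetic collinear data and scikit-learn estimators in place of the paper's economic series (the alpha values are arbitrary assumptions):

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(3)
n, p = 40, 8
Z = rng.normal(size=(n, 2))                        # two latent drivers
X = Z @ rng.normal(size=(2, p)) + 0.05 * rng.normal(size=(n, p))
y = X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=n)

results = {}
for name, model in [("OLS", LinearRegression()),
                    ("Ridge", Ridge(alpha=1.0)),
                    ("Lasso", Lasso(alpha=0.1, max_iter=50_000))]:
    # LOOCV mean squared prediction error, as in the abstract's comparison
    scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    results[name] = -scores.mean()
    print(name, round(results[name], 3))
```

Whichever model attains the smallest LOOCV MSE would be preferred; which one wins depends on the data, so no outcome is asserted here.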
34

Shieh, Gwowen. "On the Misconception of Multicollinearity in Detection of Moderating Effects: Multicollinearity Is Not Always Detrimental." Multivariate Behavioral Research 45, no. 3 (May 28, 2010): 483–507. http://dx.doi.org/10.1080/00273171.2010.483393.

35

Thomas, Sabu K., and K. T. Thomachen. "Biodiversity Studies and Multicollinearity in Multivariate Data Analysis." Mapana - Journal of Sciences 6, no. 1 (May 31, 2007): 27–35. http://dx.doi.org/10.12723/mjs.10.2.

Abstract:
Multicollinearity among explanatory variables often threatens the statistical interpretation of ecological data analysis in biodiversity studies. Using litter ants as an example, the impact of multicollinearity on ecological multiple regression and the complications arising from collinearity are explained. We list the various statistical techniques available for enhancing the reliability and interpretation of ecological multiple regressions in the presence of multicollinearity.
36

UTAMI, NI KETUT TRI, and I. KOMANG GDE SUKARSA. "PENERAPAN METODE GENERALIZED RIDGE REGRESSION DALAM MENGATASI MASALAH MULTIKOLINEARITAS." E-Jurnal Matematika 2, no. 1 (January 30, 2013): 54. http://dx.doi.org/10.24843/mtk.2013.v02.i01.p029.

Abstract:
Ordinary least squares is a parameter estimation method for linear regression analysis that minimizes the residual sum of squares. In the presence of multicollinearity, estimators that are unbiased and have minimum variance cannot be obtained. Multicollinearity refers to a situation in which regressor variables are highly correlated. Generalized ridge regression is an alternative method for dealing with the multicollinearity problem. In generalized ridge regression, a different biasing parameter for each regressor variable is added to the least squares equation after transforming the data to the space of orthogonal regressors. The analysis showed that generalized ridge regression was satisfactory in overcoming multicollinearity.
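
Ordinary ridge regression, the special case of the generalized ridge estimator with a single shared biasing parameter k, can be sketched directly from its closed form. A toy example on synthetic collinear data (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)    # severe multicollinearity
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.1, size=n)

def ridge(X, y, k):
    # beta(k) = (X'X + k I)^{-1} X'y : a small k > 0 trades a little bias
    # for a large reduction in variance when X'X is near-singular
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print(ridge(X, y, 0.0))   # k = 0 is OLS: coefficients unstable here
print(ridge(X, y, 1.0))   # shrunken, stable estimates near (1, 1)
```

Generalized ridge, as in the paper, would instead assign a separate biasing parameter to each orthogonalized regressor; the sketch above shows only the shared-k special case.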
37

Chan, Jireh Yi-Le, Steven Mun Hong Leow, Khean Thye Bea, Wai Khuen Cheng, Seuk Wai Phoong, Zeng-Wei Hong, and Yen-Lin Chen. "Mitigating the Multicollinearity Problem and Its Machine Learning Approach: A Review." Mathematics 10, no. 8 (April 12, 2022): 1283. http://dx.doi.org/10.3390/math10081283.

Abstract:
Technologies have driven big data collection across many fields, such as genomics and business intelligence. This results in a significant increase in variables and data points (observations) collected and stored. Although this presents opportunities to better model the relationship between predictors and the response variables, this also causes serious problems during data analysis, one of which is the multicollinearity problem. The two main approaches used to mitigate multicollinearity are variable selection methods and modified estimator methods. However, variable selection methods may negate efforts to collect more data as new data may eventually be dropped from modeling, while recent studies suggest that optimization approaches via machine learning handle data with multicollinearity better than statistical estimators. Therefore, this study details the chronological developments to mitigate the effects of multicollinearity and up-to-date recommendations to better mitigate multicollinearity.
38

RIYANTINI, DWI LARAS, MADE SUSILAWATI, and KARTIKA SARI. "PENERAPAN REGRESI AKAR LATEN DALAM MENANGANI MULTIKOLINEARITAS PADA MODEL REGRESI LINIER BERGANDA." E-Jurnal Matematika 3, no. 1 (January 31, 2014): 8. http://dx.doi.org/10.24843/mtk.2014.v03.i01.p060.

Abstract:
Multicollinearity is a problem that often occurs in multiple linear regression. The existence of multicollinearity among the independent variables results in a regression model that is far from accurate. Latent root regression is an alternative for dealing with multicollinearity in multiple linear regression. In latent root regression, multicollinearity is overcome by reducing the original variables into new variables through principal component analysis techniques. In this regression, the parameters are estimated by a modified least squares method. In this study, the data used are eleven groups of simulated data with varying numbers of independent variables. Based on the VIF values and the correlation values, latent root regression is capable of handling multicollinearity completely. Moreover, the regression model obtained by latent root regression has a value of 0.99, which indicates that the independent variables can explain the variation in the response variables accurately.
39

Gwelo, Abubakari S. "PRINCIPAL COMPONENTS TO OVERCOME MULTICOLLINEARITY PROBLEM." Oradea Journal of Business and Economics 4, no. 1 (March 2019): 79–91. http://dx.doi.org/10.47535/1991ojbe062.

Abstract:
The impact of ignoring collinearity among predictors is well documented in the statistical literature. An attempt has been made in this study to document the application of principal components as a remedial solution to this problem. Using a sample of six hundred participants, a linear regression model was fitted and collinearity between predictors was detected using the variance inflation factor (VIF). After confirming the existence of a high relationship between independent variables, principal components were utilized to find the possible linear combinations of variables that can produce large variance without much loss of information. Thus, the set of correlated variables was reduced to a new minimum number of variables that are independent of each other but contain linear combinations of the related variables. In order to check for the presence of relationships between predictors, the dependent variables were regressed on these five principal components. The results show that the VIF values for each predictor ranged from 1 to 3, which indicates that the multicollinearity problem was eliminated. Finally, another linear regression model was fitted using the principal components as predictors, and the assessment of the relationship between predictors indicated no symptoms of multicollinearity. The study revealed that principal component analysis is an appropriate method of solving collinearity among variables, and that this technique produces better estimation and prediction than ordinary least squares when predictors are related.
40

Tamura, Ryuta, Ken Kobayashi, Yuichi Takano, Ryuhei Miyashiro, Kazuhide Nakata, and Tomomi Matsui. "BEST SUBSET SELECTION FOR ELIMINATING MULTICOLLINEARITY." Journal of the Operations Research Society of Japan 60, no. 3 (2017): 321–36. http://dx.doi.org/10.15807/jorsj.60.321.

41

Graham, Michael H. "CONFRONTING MULTICOLLINEARITY IN ECOLOGICAL MULTIPLE REGRESSION." Ecology 84, no. 11 (November 2003): 2809–15. http://dx.doi.org/10.1890/02-3114.

42

Bizeti, Henrique Stoco, Claudio Guilherme Portela de Carvalho, José Roberto Pinto de Souza, and Deonisio Destro. "Path analysis under multicollinearity in soybean." Brazilian Archives of Biology and Technology 47, no. 5 (September 2004): 669–76. http://dx.doi.org/10.1590/s1516-89132004000500001.

Abstract:
This study aimed to establish the phenotypic correlations of several soybean traits with grain yield, in terms of direct and indirect effects, using path analysis, and to compare alternative methods for minimizing the adverse effects of multicollinearity in estimating path coefficients. The experiment was conducted in a greenhouse in a randomized complete block design with four replications. Nine soybean genotypes belonging to three seed size categories were used. The correlation studies and the path analysis showed that seed size was not important for increased yield. The number of nodes and plant height at maturity showed significant correlations with grain yield. Using the least squares methodology, the results obtained by path analysis under multicollinearity were not satisfactory. Ridge path analysis and trait culling were efficient in reducing the adverse effects of multicollinearity. Both methods showed that only the number of nodes at maturity had a high direct effect on grain yield per plant.
43

Yoshioka, Shigeru. "Multicollinearity and Avoidance in Regression Analysis." Behaviormetrika 13, no. 19 (January 1986): 103–20. http://dx.doi.org/10.2333/bhmk.13.19_103.

44

Montgomery, Douglas C., and Sheila R. Voth. "Multicollinearity and Leverage in Mixture Experiments." Journal of Quality Technology 26, no. 2 (April 1994): 96–108. http://dx.doi.org/10.1080/00224065.1994.11979510.

45

Maxwell, Obubu, George Amaeze Osuji, Ibeakuzie Precious Onyedikachi, Chinelo Ijeoma Obi-Okpala, Ikediuwa Udoka Chinedu, and Okpala Ikenna Frank. "Handling Critical Multicollinearity Using Parametric Approach." Academic Journal of Applied Mathematical Sciences, no. 511 (November 5, 2019): 150–63. http://dx.doi.org/10.32861/ajams.511.150.163.

Abstract:
In regression analysis, it is necessary to have correlation between the response and explanatory variables, but having correlations among the explanatory variables is undesirable. This paper focuses on five methodologies for handling critical multicollinearity: partial least squares regression (PLSR), ridge regression (RR), ordinary least squares regression (OLS), least absolute shrinkage and selection operator (LASSO) regression, and principal component analysis (PCA). Monte Carlo simulations comparing the methods were carried out, with the sample size greater than or equal to the number of levels (n > p) considered in most cases, and the average mean squared error (AMSE) and Akaike information criterion (AIC) values were computed. The results show that PCR is the most efficient at handling critical multicollinearity problems, having the lowest AMSE and AIC values for all the sample sizes and levels considered.
46

Dias Curto, José, and José Castro Pinto. "Correction Note on New Multicollinearity Indicators." International Statistical Review 76, no. 2 (August 2008): 298–99. http://dx.doi.org/10.1111/j.1751-5823.2008.00052.x.

47

Guan, Ying, and Guang-Hui Fu. "A Double-Penalized Estimator to Combat Separation and Multicollinearity in Logistic Regression." Mathematics 10, no. 20 (October 16, 2022): 3824. http://dx.doi.org/10.3390/math10203824.

Abstract:
When developing prediction models for small or sparse binary data with many highly correlated covariates, logistic regression often encounters separation or multicollinearity problems, resulting in serious bias and even the nonexistence of standard maximum likelihood estimates. The combination of separation and multicollinearity makes the task of logistic regression more difficult, and few studies have addressed separation and multicollinearity simultaneously. In this paper, we propose a double-penalized method called lFRE to combat separation and multicollinearity in logistic regression. lFRE combines the logF-type penalty with the ridge penalty. The results indicate that, compared with other penalty methods, lFRE can not only effectively remove bias from predicted probabilities but also provide the minimum mean squared prediction error. In addition, a real dataset is employed to test the performance of the lFRE algorithm against several existing methods. The results show that lFRE is strongly competitive with them and can be used as an alternative algorithm in logistic regression to solve separation and multicollinearity problems.
APA, Harvard, Vancouver, ISO, and other styles
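Why penalization keeps the estimate finite under complete separation can be shown with a plain ridge-penalized logistic fit. This is a generic ridge penalty as a stand-in, not the paper's logF + ridge ("lFRE") penalty:

```python
import numpy as np

def ridge_logistic(X, y, lam, n_iter=50):
    """Ridge-penalized logistic regression fitted by Newton's method.
    The penalty lam * ||b||^2 keeps the Hessian invertible and the
    estimate finite even when the classes are completely separated."""
    p = X.shape[1]
    b = np.zeros(p)
    for _ in range(n_iter):
        mu = 1.0 / (1.0 + np.exp(-X @ b))          # fitted probabilities
        W = mu * (1 - mu)                          # IRLS weights
        grad = X.T @ (y - mu) - 2 * lam * b
        H = X.T @ (W[:, None] * X) + 2 * lam * np.eye(p)
        b = b + np.linalg.solve(H, grad)
    return b

# Completely separated data: the unpenalized MLE diverges to infinity,
# but the penalized estimate stays bounded.
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
b = ridge_logistic(X, y, lam=0.5)
```

With lam = 0 the slope would grow without bound across Newton steps; any strictly positive penalty bounds it.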
48

DEVITA, HANY, I. KOMANG GDE SUKARSA, and I. PUTU EKA N. KENCANA. "KINERJA JACKKNIFE RIDGE REGRESSION DALAM MENGATASI MULTIKOLINEARITAS." E-Jurnal Matematika 3, no. 4 (November 28, 2014): 146. http://dx.doi.org/10.24843/mtk.2014.v03.i04.p077.

Full text
Abstract:
Ordinary least squares (OLS) is a parameter estimation method that minimizes the residual sum of squares. If multicollinearity is present in the data, an unbiased estimator with minimum variance cannot be obtained. Multicollinearity is a linear correlation between the independent variables in a model. Jackknife Ridge Regression (JRR), an extension of Generalized Ridge Regression (GRR), addresses multicollinearity. GRR counters the effect of multicollinearity on the estimators by adding a different bias parameter for each independent variable to the least squares equation after transforming the data into orthogonal form. In addition, JRR can reduce the bias of the ridge estimator. The results show that the JRR model outperforms the GRR model.
APA, Harvard, Vancouver, ISO, and other styles
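The generalized-ridge idea the abstract describes (orthogonalize, then shrink each component with its own bias parameter) can be sketched directly from the SVD. A minimal illustration on hypothetical data, not the paper's JRR estimator:

```python
import numpy as np

def generalized_ridge(X, y, k):
    """Generalized ridge: a separate shrinkage parameter k[j] for each
    orthogonalized component of X, via the singular value decomposition."""
    k = np.asarray(k, dtype=float)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    alpha = (s * (U.T @ y)) / (s**2 + k)   # shrunken coefficients, canonical form
    return Vt.T @ alpha                    # rotate back to the original predictors

rng = np.random.default_rng(1)
z = rng.normal(size=60)
X = np.column_stack([z, z + 0.05 * rng.normal(size=60)])  # strongly collinear pair
y = X @ np.array([1.0, 1.0]) + rng.normal(size=60)

b_ols = generalized_ridge(X, y, [0.0, 0.0])   # k = 0 recovers plain OLS
b_grr = generalized_ridge(X, y, [0.1, 10.0])  # heavier shrinkage on the weak direction
```

Setting every k[j] to the same value recovers ordinary ridge regression; letting them differ is exactly what distinguishes GRR.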
49

Ahmad, Alhassan Umar, et al. "A Study of Multicollinearity Detection and Rectification under Missing Values." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 1S (April 11, 2021): 399–418. http://dx.doi.org/10.17762/turcomat.v12i1s.1880.

Full text
Abstract:
In this paper, the consequences of missing observations for data-based multicollinearity were analyzed. Different patterns of missing values have different effects on multicollinearity in a multiple regression model. Therefore, to clarify the relationship between multicollinearity and missing values, the collinear effects were studied on two types of missingness: monotone and arbitrary. Likewise, the response of multicollinearity to each missing-value pattern was compared on the same data. It was found that tolerance and the variance inflation factor fluctuate as information goes missing from the analyzed sample at different percentages of missing values. It was observed that the more missing values there are in a sample, whether obtained from population statistics or a survey, the more multicollinearity is found in the multiple regression system: as the amount of missingness increases, the tolerance level decreases drastically for both the monotone and arbitrary types.
APA, Harvard, Vancouver, ISO, and other styles
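Tolerance and the variance inflation factor, the diagnostics this abstract tracks, are straightforward to compute. A small NumPy sketch on hypothetical simulated data, with listwise deletion standing in for missing observations:

```python
import numpy as np

def vif(X, j):
    """VIF of column j: 1 / (1 - R^2) from regressing X[:, j] on the others."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])  # add intercept
    resid = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

def tolerance(X, j):
    # Tolerance is simply the reciprocal of the VIF
    return 1.0 / vif(X, j)

rng = np.random.default_rng(2)
z = rng.normal(size=200)
X = np.column_stack([z, z + 0.05 * rng.normal(size=200), rng.normal(size=200)])

vif_full = vif(X, 0)              # columns 0 and 1 are nearly collinear: large VIF
keep = rng.random(200) > 0.3      # listwise deletion: ~30% of rows "missing"
vif_missing = vif(X[keep], 0)     # recompute the diagnostic on the reduced sample
```

Recomputing `vif_missing` at different deletion percentages reproduces the kind of fluctuation in tolerance and VIF that the study reports.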
50

Herawati, Netti, Subian Saidi, and Dorrah Azis. "RIDGE LEAST ABSOLUTE DEVIATION PERFORMANCE IN ADDRESSING MULTICOLLINEARITY AND DIFFERENT LEVELS OF OUTLIER SIMULTANEOUSLY." BAREKENG: Jurnal Ilmu Matematika dan Terapan 16, no. 3 (September 1, 2022): 779–86. http://dx.doi.org/10.30598/barekengvol16iss3pp779-786.

Full text
Abstract:
If there are multicollinearity and outliers in the data, inference about the parameter estimates from the LS method will deviate because the method estimates inefficiently. To overcome these two problems simultaneously, robust regression can be used; one such method is ridge least absolute deviation (RLAD). This study evaluates the performance of RLAD in surmounting multicollinearity across various sample sizes and outlier percentages using simulated data. The Monte Carlo study was designed as a multiple regression model with multicollinearity (ρ = 0.99) between the independent variables and 10%, 20%, or 30% outliers in the response variable, for different sample sizes (n = 25, 50, 75, 100, 200; β₀ = 0 and β = 1 otherwise). Multicollinearity in the data was verified by computing the correlations between the independent variables and the VIF values; outliers were detected using boxplots. Parameters were estimated with both the RLAD and LS methods, and the MSE values of the two methods were then compared to see which better overcomes multicollinearity and outliers. The results show that RLAD has a lower MSE than LS, signifying that RLAD estimates the regression coefficients more precisely for every sample size and outlier level studied.
APA, Harvard, Vancouver, ISO, and other styles
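The general ridge-LAD idea can be sketched with iteratively reweighted least squares: the L1 loss resists outliers while the ridge term tames collinearity. This is an IRLS sketch of the combined objective on hypothetical data mimicking the study's setup (ρ ≈ 0.99, 10% outliers), not the paper's exact RLAD algorithm:

```python
import numpy as np

def ridge_lad(X, y, lam, n_iter=200, eps=1e-6):
    """Approximately minimize sum(|y - Xb|) + lam * ||b||^2 via IRLS:
    each step solves a ridge-weighted least squares problem with
    weights 1/|residual|, which reproduces the L1 loss at convergence."""
    p = X.shape[1]
    b = np.zeros(p)
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(y - X @ b), eps)   # LAD weights, capped near zero
        b_new = np.linalg.solve(X.T @ (w[:, None] * X) + lam * np.eye(p),
                                X.T @ (w * y))
        if np.linalg.norm(b_new - b) < 1e-10:
            return b_new
        b = b_new
    return b

rng = np.random.default_rng(3)
z = rng.normal(size=100)
X = np.column_stack([z, z + 0.05 * rng.normal(size=100)])  # correlation near 0.99
y = X @ np.array([1.0, 1.0]) + rng.normal(size=100)
y[:10] += 50.0                                             # 10% outliers in the response

b_ls = np.linalg.lstsq(X, y, rcond=None)[0]
b_rlad = ridge_lad(X, y, lam=0.1)
```

Comparing the two fits on the contaminated data shows the L1 fit hugging the clean majority while LS is pulled toward the outliers.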