Journal articles on the topic 'Ridge regression estimators'

Consult the top 50 journal articles for your research on the topic 'Ridge regression estimators.'


1. Khalaf, G., Kristofer Månsson, and Ghazi Shukur. "Modified Ridge Regression Estimators." Communications in Statistics - Theory and Methods 42, no. 8 (April 15, 2013): 1476–87. http://dx.doi.org/10.1080/03610926.2011.593285.

2. Yasin, Seyab, Sultan Salem, Hamdi Ayed, Shahid Kamal, Muhammad Suhail, and Yousaf Ali Khan. "Modified Robust Ridge M-Estimators in Two-Parameter Ridge Regression Model." Mathematical Problems in Engineering 2021 (September 22, 2021): 1–24. http://dx.doi.org/10.1155/2021/1845914.
Abstract: The methods of two-parameter ridge and ordinary ridge regression are very sensitive to the presence of the joint problem of multicollinearity and outliers in the y-direction. To overcome this problem, modified robust ridge M-estimators are proposed. The new estimators are then compared with the existing ones by means of extensive Monte Carlo simulations. According to the mean squared error (MSE) criterion, the new estimators outperform the least squares estimator, the ridge regression estimator, and the two-parameter ridge estimator in many of the scenarios considered. Two numerical examples are also presented to illustrate the simulation results.

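Several entries in this list compare ordinary least squares with ridge-type estimators by Monte Carlo MSE, as in the abstract above. As a minimal illustrative sketch (not taken from any of the cited papers; the collinearity structure, the ridge parameter k = 1, and the replication count are arbitrary assumptions), such a comparison typically looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 50, 4, 1.0        # sample size, predictors, ridge parameter (assumed)
beta = np.ones(p)

# Highly collinear design: every column is a shared factor plus small noise.
z = rng.normal(size=(n, 1))
X = z + 0.01 * rng.normal(size=(n, p))

reps = 500
mse_ols = mse_ridge = 0.0
for _ in range(reps):
    y = X @ beta + rng.normal(size=n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)                    # (X'X)^-1 X'y
    b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)  # (X'X + kI)^-1 X'y
    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_ridge += np.sum((b_ridge - beta) ** 2) / reps

print(mse_ols, mse_ridge)  # under strong collinearity the ridge MSE is typically far smaller
```

The shrinkage introduced by kI trades a small bias for a large variance reduction, which is the mechanism the MSE comparisons in these papers quantify.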
3. Cessie, S. Le, and J. C. Van Houwelingen. "Ridge Estimators in Logistic Regression." Applied Statistics 41, no. 1 (1992): 191. http://dx.doi.org/10.2307/2347628.

4. Zinodiny, S. "Bayes minimax ridge regression estimators." Communications in Statistics - Theory and Methods 47, no. 22 (March 7, 2018): 5519–33. http://dx.doi.org/10.1080/03610926.2017.1397167.

5. Wu, Jibo, and Chaolin Liu. "Performance of Some Stochastic Restricted Ridge Estimator in Linear Regression Model." Journal of Applied Mathematics 2014 (2014): 1–7. http://dx.doi.org/10.1155/2014/508793.
Abstract: This paper considers several stochastic restricted ridge regression estimators in the linear regression model. A simulation study has been conducted to compare the performance of the estimators. The results of the simulation study show that the stochastic restricted ridge regression estimators outperform the mixed estimator. A numerical example has also been given to illustrate the performance of the estimators.

6. DEVITA, HANY, I. KOMANG GDE SUKARSA, and I. PUTU EKA N. KENCANA. "KINERJA JACKKNIFE RIDGE REGRESSION DALAM MENGATASI MULTIKOLINEARITAS" [Performance of Jackknife Ridge Regression in Overcoming Multicollinearity]. E-Jurnal Matematika 3, no. 4 (November 28, 2014): 146. http://dx.doi.org/10.24843/mtk.2014.v03.i04.p077.
Abstract: Ordinary least squares is a parameter estimation method that minimizes the residual sum of squares. If multicollinearity is present in the data, an unbiased estimator with minimum variance cannot be obtained. Multicollinearity is a linear correlation between the independent variables in the model. Jackknife Ridge Regression (JRR) is an extension of Generalized Ridge Regression (GRR) for solving multicollinearity. GRR is used to overcome the bias of the estimators caused by the presence of multicollinearity by adding a different bias parameter for each independent variable in the least squares equation after transforming the data into an orthogonal form. In addition, JRR can reduce the bias of the ridge estimator. The results show that the JRR model outperforms the GRR model.

7. Lukman, Adewale F., B. M. Golam Kibria, Kayode Ayinde, and Segun L. Jegede. "Modified One-Parameter Liu Estimator for the Linear Regression Model." Modelling and Simulation in Engineering 2020 (August 19, 2020): 1–17. http://dx.doi.org/10.1155/2020/9574304.
Abstract: Motivated by the ridge regression (Hoerl and Kennard, 1970) and Liu (1993) estimators, this paper proposes a modified Liu estimator to solve the multicollinearity problem for the linear regression model. This modification places this estimator in the class of the ridge and Liu estimators with a single biasing parameter. Theoretical comparisons, a real-life application, and simulation results show that it consistently dominates the usual Liu estimator. Under some conditions, it performs better than the ridge regression estimators in the smaller MSE sense. Two real-life data sets are analyzed to illustrate the findings of the paper, and the performances of the estimators are assessed by the MSE and the mean squared prediction error. The application results agree with the theoretical and simulation results.

8. Arashi, M., S. M. M. Tabatabaey, and M. Hassanzadeh Bashtian. "Shrinkage Ridge Estimators in Linear Regression." Communications in Statistics - Simulation and Computation 43, no. 4 (October 11, 2013): 871–904. http://dx.doi.org/10.1080/03610918.2012.718838.

9. Xu, Jianwen, and Hu Yang. "Preliminary test almost unbiased ridge estimator in a linear regression model with multivariate Student-t errors." Acta et Commentationes Universitatis Tartuensis de Mathematica 15, no. 1 (December 11, 2020): 27–43. http://dx.doi.org/10.12697/acutm.2011.15.03.
Abstract: In this paper, the preliminary test almost unbiased ridge estimators of the regression coefficients based on the conflicting Wald (W), Likelihood ratio (LR) and Lagrangian multiplier (LM) tests in a multiple regression model with multivariate Student-t errors are introduced when it is suspected that the regression coefficients may be restricted to a subspace. The bias and quadratic risks of the proposed estimators are derived and compared. Sufficient conditions on the departure parameter ∆ and the ridge parameter k are derived for the proposed estimators to be superior to the almost unbiased ridge estimator, restricted almost unbiased ridge estimator and preliminary test estimator. Furthermore, some graphical results are provided to illustrate the theoretical results.

10. Bhat, S. S., and R. Vidya. "Performance of Ridge Estimators Based on Weighted Geometric Mean and Harmonic Mean." Journal of Scientific Research 12, no. 1 (January 1, 2020): 1–13. http://dx.doi.org/10.3329/jsr.v12i1.40525.
Abstract: The ordinary least squares (OLS) estimator becomes unstable if there is a linear dependence between any two predictors. When such a situation arises, the ridge estimator yields more stable estimates of the regression coefficients than the OLS estimator. Here we suggest two modified ridge estimators based on weights, the weights being the two largest eigenvalues. We compare their MSE with some of the existing ridge estimators defined in the literature. The performance of the suggested estimators is evaluated empirically for a wide range of degrees of multicollinearity. The simulation study indicates that the performance of the suggested estimators is slightly better and more stable with respect to the degree of multicollinearity, sample size, and error variance.

11. Kelly, R. J. "GDOP, Ridge Regression and the Kalman Filter." Journal of Navigation 43, no. 03 (September 1990): 409–27. http://dx.doi.org/10.1017/s0373463300014041.
Abstract: Multicollinearity and its effect on parameter estimators such as the Kalman filter is analysed using the navigation application as a special example. All position-fix navigation systems suffer loss of accuracy when their navigation landmarks are nearly collinear. Nearly collinear measurement geometry is termed the geometric dilution of position (GDOP). Its presence causes the errors of the position estimates to be highly inflated. In 1970 Hoerl and Kennard developed ridge regression to combat near collinearity when it arises in the predictor matrix of a linear regression model. Since GDOP is mathematically equivalent to a nearly collinear predictor matrix, Kelly suggested using ridge regression techniques in navigation signal processors to reduce the effects of GDOP. The original programme intended to use ridge regression not only to reduce variance inflation but also to reduce bias inflation. Reducing bias inflation is an extension of Hoerl's ridge concept by Kelly. Preliminary results show that ridge regression will reduce the effects of variance inflation caused by GDOP. However, recent results (Kelly) conclude that it will not reduce bias inflation as it arises in the navigation problem, since GDOP is not a mismatched estimator/model problem. Even with an estimator matched to the model, GDOP may inflate the MSE of the ordinary Kalman filter, while the ridge recursive filter chooses a suitable biased estimator that will reduce the MSE. The main goal is obtaining a smaller MSE for the estimator, rather than minimizing the residual sum of squares. This is a different operation from tuning the Kalman filter's dynamic process noise covariance Q in order to compensate for unmodelled errors. Although ridge regression has not yielded a satisfactory solution to the general GDOP problem, it has provided insight into exactly what causes multicollinearity in navigation signal processors such as the Kalman filter and under what conditions an estimator's performance can be improved.

12. Özbay, Nimet, Issam Dawoud, and Selahattin Kaçıranlar. "Feasible Generalized Stein-Rule Restricted Ridge Regression Estimators." Journal of Applied Mathematics, Statistics and Informatics 13, no. 1 (May 24, 2017): 77–97. http://dx.doi.org/10.1515/jamsi-2017-0005.
Abstract: Several versions of the Stein-rule estimators of the coefficient vector in a linear regression model are proposed in the literature. In the present paper, we propose new feasible generalized Stein-rule restricted ridge regression estimators to examine multicollinearity and autocorrelation problems simultaneously for the general linear regression model, when certain additional exact restrictions are placed on these coefficients. Moreover, a Monte Carlo simulation experiment is performed to investigate the performance of the proposed estimator over the others.

13. Tiwari, Manoj, and Amit Sharma. "Predictive efficiency of ridge regression estimator." Yugoslav Journal of Operations Research 27, no. 2 (2017): 243–47. http://dx.doi.org/10.2298/yjor170114014t.
Abstract: In this article we have considered the problem of prediction within and outside the sample for actual and average values of the study variables in the case of ordinary least squares and ridge regression estimators. Finally, the performance properties of the estimators are analyzed.

14. Gorgees, Hazim Mansoor, and Fatimah Assim Mahdi. "The Comparison Between Different Approaches to Overcome the Multicollinearity Problem in Linear Regression Models." Ibn AL- Haitham Journal For Pure and Applied Science 31, no. 1 (May 14, 2018): 212. http://dx.doi.org/10.30526/31.1.1841.
Abstract: In the presence of the multicollinearity problem, parameter estimation based on the ordinary least squares procedure is unsatisfactory. In 1970, Hoerl and Kennard introduced an alternative method known as the ridge regression estimator. In such an estimator, the ridge parameter plays an important role in estimation. Various methods have been proposed by statisticians to select the biasing constant (ridge parameter). Another popular method used to deal with the multicollinearity problem is the principal component method. In this paper, we employ the simulation technique to compare the performance of the principal component estimator with some types of ordinary ridge regression estimators based on the value of the biasing constant (ridge parameter). The mean square error (MSE) is used as a criterion to assess the performance of such estimators.

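Entries such as this one turn on how the biasing constant k is chosen. As a minimal sketch of one classical data-driven choice, a Hoerl–Kennard-style estimate k = p·σ̂²/(β̂'β̂) computed from an initial OLS fit (the data-generating setup below is an arbitrary illustrative assumption, not taken from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 4
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 0.05 * rng.normal(size=n)   # induce collinearity between two columns
y = X @ np.ones(p) + rng.normal(size=n)

# Initial OLS fit supplies sigma^2-hat and beta-hat for the biasing constant.
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
sigma2 = np.sum((y - X @ b_ols) ** 2) / (n - p)

# Hoerl-Kennard style choice: k = p * sigma^2 / (beta' beta).
k_hk = p * sigma2 / (b_ols @ b_ols)

# Ridge estimate using the data-driven biasing constant.
b_ridge = np.linalg.solve(X.T @ X + k_hk * np.eye(p), X.T @ y)
print(k_hk)
```

Many of the estimators surveyed in these papers differ only in how this constant (or a vector of constants) is estimated from the data.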
15. Saleh, A. K. Md E., and B. M. Golam Kibria. "Improved ridge regression estimators for the logistic regression model." Computational Statistics 28, no. 6 (April 19, 2013): 2519–58. http://dx.doi.org/10.1007/s00180-013-0417-6.

16. Roozbeh, Mahdi. "Shrinkage ridge estimators in semiparametric regression models." Journal of Multivariate Analysis 136 (April 2015): 56–74. http://dx.doi.org/10.1016/j.jmva.2015.01.002.

17. Casella, George. "Condition Numbers and Minimax Ridge Regression Estimators." Journal of the American Statistical Association 80, no. 391 (September 1985): 753–58. http://dx.doi.org/10.1080/01621459.1985.10478180.

18. Samkar, Hatice, and Ozlem Alpu. "Ridge Regression Based on Some Robust Estimators." Journal of Modern Applied Statistical Methods 9, no. 2 (November 1, 2010): 495–501. http://dx.doi.org/10.22237/jmasm/1288584960.

19. Kibria, B. M. Golam, and Shipra Banik. "Some Ridge Regression Estimators and Their Performances." Journal of Modern Applied Statistical Methods 15, no. 1 (May 1, 2016): 206–38. http://dx.doi.org/10.22237/jmasm/1462075860.

20. Kibria, B. M. Golam. "Performance of Some New Ridge Regression Estimators." Communications in Statistics - Simulation and Computation 32, no. 2 (January 6, 2003): 419–35. http://dx.doi.org/10.1081/sac-120017499.

21. Kibria, B. M. Golam, Kristofer Månsson, and Ghazi Shukur. "Performance of Some Logistic Ridge Regression Estimators." Computational Economics 40, no. 4 (July 15, 2011): 401–14. http://dx.doi.org/10.1007/s10614-011-9275-x.

22. Harrison Oghenekevwe, Etaga, Aforka Kenechukwu Florence, Awopeju Kabiru Abidemi, and Etaga Njideka Cecilia. "Poisson Ridge Regression Estimators: A Performance Test." American Journal of Theoretical and Applied Statistics 10, no. 2 (2021): 111. http://dx.doi.org/10.11648/j.ajtas.20211002.13.

23. Arashi, M., M. Roozbeh, N. A. Hamzah, and M. Gasparini. "Ridge regression and its applications in genetic studies." PLOS ONE 16, no. 4 (April 8, 2021): e0245376. http://dx.doi.org/10.1371/journal.pone.0245376.
Abstract: With the advancement of technology, analysis of large-scale data of gene expression is feasible and has become very popular in the era of machine learning. This paper develops an improved ridge approach for the genome regression modeling. When multicollinearity exists in the data set with outliers, we consider a robust ridge estimator, namely the rank ridge regression estimator, for parameter estimation and prediction. On the other hand, the efficiency of the rank ridge regression estimator is highly dependent on the ridge parameter. In general, it is difficult to provide a satisfactory answer about the selection for the ridge parameter. Because of the good properties of generalized cross validation (GCV) and its simplicity, we use it to choose the optimum value of the ridge parameter. The GCV function creates a balance between the precision of the estimators and the bias caused by the ridge estimation. It behaves like an improved estimator of risk and can be used when the number of explanatory variables is larger than the sample size in high-dimensional problems. Finally, some numerical illustrations are given to support our findings.

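The abstract above describes choosing the ridge parameter by generalized cross-validation (GCV). A minimal sketch of a GCV grid search for the ordinary ridge estimator (the data, the search grid, and the collinearity structure are illustrative assumptions, not taken from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 40, 6
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)   # near-collinear pair of predictors
y = X @ np.ones(p) + rng.normal(size=n)

def gcv(k):
    # Ridge hat matrix H(k) = X (X'X + kI)^-1 X'.
    H = X @ np.linalg.solve(X.T @ X + k * np.eye(p), X.T)
    rss = np.sum((y - H @ y) ** 2)
    # GCV(k) = n * RSS / (n - tr H)^2 balances fit against effective degrees of freedom.
    return n * rss / (n - np.trace(H)) ** 2

ks = np.linspace(0.01, 10.0, 200)
k_best = ks[int(np.argmin([gcv(k) for k in ks]))]
print(k_best)
```

The trace term plays the role of effective degrees of freedom, which is why GCV penalizes very small k (near-OLS fits) when the design is ill-conditioned.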
24. Aladeitan, Benedicta B., Olukayode Adebimpe, Adewale F. Lukman, Olajumoke Oludoun, and Oluwakemi E. Abiodun. "Modified Kibria-Lukman (MKL) estimator for the Poisson Regression Model: application and simulation." F1000Research 10 (July 8, 2021): 548. http://dx.doi.org/10.12688/f1000research.53987.1.
Abstract: Background: Multicollinearity greatly affects the efficiency of the Maximum Likelihood Estimator (MLE) in both the linear regression model and the generalized linear model. Alternative estimators to the MLE include the ridge estimator, the Liu estimator and the Kibria-Lukman (KL) estimator, though the literature shows that the KL estimator is preferred. Therefore, this study sought to modify the KL estimator to mitigate multicollinearity in the Poisson regression model. Methods: A simulation study and a real-life study were carried out and the performance of the new estimator was compared with some of the existing estimators. Results: The simulation results showed the new estimator performed more efficiently than the MLE, Poisson Ridge Regression Estimator (PRE), Poisson Liu Estimator (PLE) and the Poisson KL (PKL) estimators. The real-life application also agreed with the simulation results. Conclusions: In general, the new estimator performed more efficiently than the MLE, PRE, PLE and the PKL when multicollinearity was present.

25. Li, Yalian, and Hu Yang. "Two Classes of Almost Unbiased Type Principal Component Estimators in Linear Regression Model." Journal of Applied Mathematics 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/639070.
Abstract: This paper is concerned with the parameter estimator in the linear regression model. To overcome the multicollinearity problem, two new classes of estimators called the almost unbiased ridge-type principal component estimator (AURPCE) and the almost unbiased Liu-type principal component estimator (AULPCE) are proposed, respectively. The mean squared error matrices of the proposed estimators are derived and compared, and some properties of the proposed estimators are also discussed. Finally, a Monte Carlo simulation study is given to illustrate the performance of the proposed estimators.

26. Kibria, B. M. Golam, and Adewale F. Lukman. "A New Ridge-Type Estimator for the Linear Regression Model: Simulations and Applications." Scientifica 2020 (April 28, 2020): 1–16. http://dx.doi.org/10.1155/2020/9758378.
Abstract: The ridge regression-type (Hoerl and Kennard, 1970) and Liu-type (Liu, 1993) estimators are consistently attractive shrinkage methods to reduce the effects of multicollinearity for both linear and nonlinear regression models. This paper proposes a new estimator to solve the multicollinearity problem for the linear regression model. Theory and simulation results show that, under some conditions, it performs better than both Liu and ridge regression estimators in the smaller MSE sense. Two real-life (chemical and economic) data sets are analyzed to illustrate the findings of the paper.

27. Kayanan, Manickavasagar, and Pushpakanthie Wijekoon. "Stochastic Restricted Biased Estimators in Misspecified Regression Model with Incomplete Prior Information." Journal of Probability and Statistics 2018 (2018): 1–8. http://dx.doi.org/10.1155/2018/1452181.
Abstract: The analysis of misspecification was extended to the recently introduced stochastic restricted biased estimators when multicollinearity exists among the explanatory variables. The Stochastic Restricted Ridge Estimator (SRRE), Stochastic Restricted Almost Unbiased Ridge Estimator (SRAURE), Stochastic Restricted Liu Estimator (SRLE), Stochastic Restricted Almost Unbiased Liu Estimator (SRAULE), Stochastic Restricted Principal Component Regression Estimator (SRPCRE), Stochastic Restricted r-k (SRrk) class estimator, and Stochastic Restricted r-d (SRrd) class estimator were examined in the misspecified regression model due to missing relevant explanatory variables when incomplete prior information of the regression coefficients is available. Further, the superiority conditions between estimators and their respective predictors were obtained in the mean square error matrix (MSEM) sense. Finally, a numerical example and a Monte Carlo simulation study were used to illustrate the theoretical findings.

28. Aladeitan, Benedicta, Adewale F. Lukman, Esther Davids, Ebele H. Oranye, and B. M. Golam Kibria. "Unbiased K-L estimator for the linear regression model." F1000Research 10 (August 19, 2021): 832. http://dx.doi.org/10.12688/f1000research.54990.1.
Abstract: Background: In the linear regression model, the performance of the ordinary least squares (OLS) estimator drops when multicollinearity is present. According to the Gauss-Markov theorem, the estimator remains unbiased when there is multicollinearity, but the variance of its regression estimates becomes inflated. Estimators such as the ridge regression estimator and the K-L estimator were adopted as substitutes for the OLS estimator to overcome the problem of multicollinearity in the linear regression model. However, these estimators are biased, though they possess a smaller mean squared error when compared to the OLS estimator. Methods: In this study, we developed a new unbiased estimator using the K-L estimator and compared its performance with some existing estimators theoretically, by simulation, and with real-life data. Results: Theoretically, the estimator, even though unbiased, also possesses minimum variance when compared with the other estimators. Results from the simulation and real-life study showed that the new estimator produced a smaller mean square error (MSE) and had the smallest mean square prediction error (MSPE). This further strengthened the findings of the theoretical comparison using both the MSE and the MSPE as criteria. Conclusions: A simulation study and a real-life application, modelling the high heating values from proximate analysis, were conducted to support the theoretical findings. This new method of estimation is recommended for parameter estimation with and without multicollinearity in a linear regression model.

29. Adewoye, Kunle Bayo, Ayinla Bayo Rafiu, Titilope Funmilayo Aminu, and Isaac Oluyemi Onikola. "INVESTIGATING THE IMPACT OF MULTICOLLINEARITY ON LINEAR REGRESSION ESTIMATES." MALAYSIAN JOURNAL OF COMPUTING 6, no. 1 (March 9, 2021): 698. http://dx.doi.org/10.24191/mjoc.v6i1.10540.
Abstract: Multicollinearity is a situation in multiple regression in which the predictor variables are themselves highly correlated. The aim of the study was to investigate the impact of multicollinearity on linear regression estimates. The study was guided by the following specific objectives: (i) to examine the asymptotic properties of estimators and (ii) to compare lasso, ridge, and elastic net with ordinary least squares. The study employed Monte Carlo simulation to generate sets of highly collinear variables with induced multicollinearity at sample sizes of 25, 50, 100, 150, 200, 250, and 1000, and the data were analyzed with lasso, ridge, elastic net, and ordinary least squares using a statistical package. The findings revealed that the absolute bias of ordinary least squares was consistent at all sample sizes, in line with past research on multicollinearity, while the lasso-type estimators fluctuated. The findings also revealed that the mean square error of ridge regression outperformed the other estimators, with minimum variance at small sample sizes, while ordinary least squares was best at large sample sizes. The study concluded that OLS was asymptotically consistent at the sample sizes considered and that ridge regression was efficient at small and moderate sample sizes.

30. Deng, Wen-Shuenn, Chih-Kang Chu, and Ming-Yen Cheng. "A study of local linear ridge regression estimators." Journal of Statistical Planning and Inference 93, no. 1-2 (February 2001): 225–38. http://dx.doi.org/10.1016/s0378-3758(00)00161-0.

31. Kamel, Maie M., and Sarah F. Aboud. "Ridge regression estimators with the problem of multicollinearity." Applied Mathematical Sciences 7 (2013): 2469–80. http://dx.doi.org/10.12988/ams.2013.13223.

32. Batah, Feras Sh M., M. Revan Özkale, and S. D. Gore. "Combining Unbiased Ridge and Principal Component Regression Estimators." Communications in Statistics - Theory and Methods 38, no. 13 (June 16, 2009): 2201–9. http://dx.doi.org/10.1080/03610920802503396.

33. Saleh, A. K. Md E., and B. M. Golam Kibria. "On some ridge regression estimators: a nonparametric approach." Journal of Nonparametric Statistics 23, no. 3 (September 2011): 819–51. http://dx.doi.org/10.1080/10485252.2011.567335.

34. El-Salam. "A Modification of the Ridge Type Regression Estimators." American Journal of Applied Sciences 8, no. 1 (January 1, 2011): 97–102. http://dx.doi.org/10.3844/ajassp.2011.97.102.

35. Kubokawa, Tatsuya, and Muni S. Srivastava. "Improved Empirical Bayes Ridge Regression Estimators Under Multicollinearity." Communications in Statistics - Theory and Methods 33, no. 8 (December 31, 2004): 1943–73. http://dx.doi.org/10.1081/sta-120037452.

36. Buitendag, Sven, Jan Beirlant, and Tertius de Wet. "Ridge regression estimators for the extreme value index." Extremes 22, no. 2 (October 10, 2018): 271–92. http://dx.doi.org/10.1007/s10687-018-0338-4.

37. Muniz, Gisela, and B. M. Golam Kibria. "On Some Ridge Regression Estimators: An Empirical Comparisons." Communications in Statistics - Simulation and Computation 38, no. 3 (February 19, 2009): 621–30. http://dx.doi.org/10.1080/03610910802592838.

38. Liu, Xu-Qing, Feng Gao, and Zhen-Feng Yu. "Improved ridge estimators in a linear regression model." Journal of Applied Statistics 40, no. 1 (November 5, 2012): 209–20. http://dx.doi.org/10.1080/02664763.2012.740623.

39. Tracy, Derrick S., and Anil K. Srivastava. "Selection of biasing parameters in adaptive ridge regression estimators." Econometric Reviews 11, no. 3 (January 1992): 367–77. http://dx.doi.org/10.1080/07474939208800246.

40. Lukman, Adewale F., Issam Dawoud, B. M. Golam Kibria, Zakariya Y. Algamal, and Benedicta Aladeitan. "A New Ridge-Type Estimator for the Gamma Regression Model." Scientifica 2021 (June 18, 2021): 1–8. http://dx.doi.org/10.1155/2021/5545356.
Abstract: The known linear regression model (LRM) is used mostly for modelling the QSAR relationship between the response variable (biological activity) and one or more physiochemical or structural properties which serve as the explanatory variables, mainly when the distribution of the response variable is normal. The gamma regression model is employed often for a skewed dependent variable. The parameters in both models are estimated using the maximum likelihood estimator (MLE). However, the MLE becomes unstable in the presence of multicollinearity for both models. In this study, we propose a new estimator and suggest some biasing parameters to estimate the regression parameter for the gamma regression model when there is multicollinearity. A simulation study and a real-life application were performed to evaluate the estimators' performance via the mean squared error criterion. The results from the simulation and the real-life application revealed that the proposed gamma estimator produced lower MSE values than the other estimators considered.

41. Lukman, Adewale F., Kayode Ayinde, B. M. Golam Kibria, and Segun L. Jegede. "Two-Parameter Modified Ridge-Type M-Estimator for Linear Regression Model." Scientific World Journal 2020 (May 15, 2020): 1–24. http://dx.doi.org/10.1155/2020/3192852.
Abstract: The general linear regression model has been one of the most frequently used models over the years, with the ordinary least squares estimator (OLS) used to estimate its parameters. The problems of the OLS estimator for linear regression analysis include multicollinearity and outliers, which lead to unfavourable results. This study proposed a two-parameter ridge-type modified M-estimator (RTMME) based on the M-estimator to deal with the combined problem resulting from multicollinearity and outliers. Through theoretical proofs, Monte Carlo simulation, and a numerical example, the proposed estimator outperforms the modified ridge-type estimator and some other existing estimators considered.

42. Qasim, Muhammad, Kristofer Månsson, Muhammad Amin, B. M. Golam Kibria, and Pär Sjölander. "Biased Adjusted Poisson Ridge Estimators-Method and Application." Iranian Journal of Science and Technology, Transactions A: Science 44, no. 6 (October 3, 2020): 1775–89. http://dx.doi.org/10.1007/s40995-020-00974-5.
Abstract: Månsson and Shukur (Econ Model 28:1475–1481, 2011) proposed a Poisson ridge regression estimator (PRRE) to reduce the negative effects of multicollinearity. However, a weakness of the PRRE is its relatively large bias. Therefore, as a remedy, Türkan and Özel (J Appl Stat 43:1892–1905, 2016) examined the performance of almost unbiased ridge estimators for the Poisson regression model. These estimators will not only reduce the consequences of multicollinearity but also decrease the bias of the PRRE and thus perform more efficiently. The aim of this paper is twofold. Firstly, to derive the mean square error properties of the Modified Almost Unbiased PRRE (MAUPRRE) and Almost Unbiased PRRE (AUPRRE) and then propose new ridge estimators for the MAUPRRE and AUPRRE. Secondly, to compare the performance of the MAUPRRE with the AUPRRE, PRRE and maximum likelihood estimator. Using both a simulation study and a real-world dataset from the Swedish football league, it is evidenced that one of the proposed estimators, MAUPRRE (k̂q4), performed better than the rest in the presence of high to strong (0.80–0.99) multicollinearity.

43. Lukman, Adewale F., Kayode Ayinde, Olajumoke Oludoun, and Clement A. Onate. "Combining modified ridge-type and principal component regression estimators." Scientific African 9 (September 2020): e00536. http://dx.doi.org/10.1016/j.sciaf.2020.e00536.

44. Blagus, Rok, and Jelle J. Goeman. "Mean squared error of ridge estimators in logistic regression." Statistica Neerlandica 74, no. 2 (May 2020): 159–91. http://dx.doi.org/10.1111/stan.12201.

45. Najarian, S., M. Arashi, and B. M. Golam Kibria. "A Simulation Study on Some Restricted Ridge Regression Estimators." Communications in Statistics - Simulation and Computation 42, no. 4 (April 2013): 871–90. http://dx.doi.org/10.1080/03610918.2012.659953.

46. Månsson, Kristofer. "On ridge estimators for the negative binomial regression model." Economic Modelling 29, no. 2 (March 2012): 178–84. http://dx.doi.org/10.1016/j.econmod.2011.09.009.

47. Pliskin, Jeffrey. "A further theoretical result for generalized ridge regression estimators." Economics Letters 26, no. 2 (January 1988): 133–35. http://dx.doi.org/10.1016/0165-1765(88)90028-6.

48. Ohtani, K. "Generalized ridge regression estimators under the LINEX loss function." Statistical Papers 36, no. 1 (December 1995): 99–110. http://dx.doi.org/10.1007/bf02926024.

49. Chaturvedi, Anoop. "Ridge regression estimators in the linear regression models with non-spherical errors." Communications in Statistics - Theory and Methods 22, no. 8 (January 1993): 2275–84. http://dx.doi.org/10.1080/03610929308831147.

50. LI, R., F. LI, and J. W. HUANG. "THE PREDICTIVE PERFORMANCE EVALUATION AND NUMERICAL EXAMPLE STUDY FOR THE PRINCIPAL COMPONENT TWO-PARAMETERS ESTIMATOR." Latin American Applied Research - An international journal 48, no. 3 (October 8, 2019): 181–86. http://dx.doi.org/10.52292/j.laar.2018.223.
Abstract: In this paper, detailed comparisons are given between those estimators that can be derived from the principal component two-parameter estimator, such as the ordinary least squares estimator, the principal components regression estimator, the ridge regression estimator, the Liu estimator, the r-k estimator and the r-d estimator, by the prediction mean square error criterion. In addition, conditions for the superiority of the principal component two-parameter estimator over the others are obtained. Furthermore, a numerical example study is conducted to compare these estimators under the prediction mean squared error criterion.
