To see the other types of publications on this topic, follow the link: OLS Regression Method.

Journal articles on the topic 'OLS Regression Method'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'OLS Regression Method.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Usman, M., S. I. S. Doguwa, and B. B. Alhaji. "Comparing the Prediction Accuracy of Ridge, Lasso and Elastic Net Regression Models with Linear Regression Using Breast Cancer Data." Bayero Journal of Pure and Applied Sciences 14, no. 2 (2022): 134–49. http://dx.doi.org/10.4314/bajopas.v14i2.16.

Full text
Abstract:
Regularised regression methods were developed to overcome the shortcomings of ordinary least squares (OLS) regression with respect to both prediction accuracy and model complexity. The OLS method may fail, or produce regression estimates with high variance, in the presence of multicollinearity or when the number of predictor variables exceeds the number of observations. This study compares the predictive performance, and the additional information gained, of the Ridge, Lasso and Elastic net regularised methods with the classical OLS method using data on breast cancer patients. The findings show that, using all the predictor variables, the OLS method failed because of multicollinearity, while the regularised Ridge, Lasso and Elastic net methods produced results in which the predictor variables were mostly significant. On the training data, the Elastic net and Lasso indicated more significant predictor variables than the Ridge method. The results also indicated that breast cancer patients aged 30-39, those who are married, and those in stage 1 of the disease have longer survival times, while patients in stage 2 and stage 3 have shorter survival times. The OLS regression produced results only when four of the predictor variables were excluded; even then, the regularised methods still outperformed OLS regression in terms of prediction accuracy.
APA, Harvard, Vancouver, ISO, and other styles
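The contrast the abstract draws between OLS and ridge-type shrinkage under multicollinearity can be sketched in a few lines of numpy. This is a minimal illustration on synthetic data (not the paper's breast-cancer dataset), with an assumed penalty k = 1 and the closed-form ridge solution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Two nearly collinear predictors: x2 is x1 plus tiny noise.
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 1.0 * x2 + rng.normal(scale=0.5, size=n)

# OLS: beta = (X'X)^{-1} X'y, unstable when X'X is near-singular.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: beta = (X'X + k*P)^{-1} X'y, with no penalty on the intercept.
k = 1.0
P = np.eye(3)
P[0, 0] = 0.0
beta_ridge = np.linalg.solve(X.T @ X + k * P, X.T @ y)

# The individual OLS coefficients on x1 and x2 are wildly unstable,
# but their sum (about 3, the only quantity the data identify) is not;
# ridge splits that sum into two stable, nearly equal coefficients.
```

The Lasso and Elastic net penalties have no closed form and are usually fitted by coordinate descent (e.g. scikit-learn's `Lasso` and `ElasticNet`).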
2

Kim, Jaejin, and Johnson Ching-Hong Li. "Which robust regression technique is appropriate under violated assumptions? A simulation study." Methodology 19, no. 4 (2023): 323–47. http://dx.doi.org/10.5964/meth.8285.

Full text
Abstract:
Ordinary least squares (OLS) regression is widely employed for statistical prediction and theoretical explanation in psychology studies. However, OLS regression has a critical drawback: it becomes less accurate in the presence of outliers and non-random error distribution. Several robust regression methods have been proposed as alternatives. However, each robust regression has its own strengths and limitations. Consequently, researchers are often at a loss as to which robust regression method to use for their studies. This study uses a Monte Carlo experiment to compare different types of robust regression methods with OLS regression based on relative efficiency (RE), bias, root mean squared error (RMSE), Type I error, power, coverage probability of the 95% confidence intervals (CIs), and the width of the CIs. The results show that, with sufficient samples per predictor (n = 100), the robust regression methods are as efficient as OLS regression. When errors follow non-normal distributions, i.e., mixed-normal, symmetric and heavy-tailed (SH), asymmetric and relatively light-tailed (AL), asymmetric and heavy-tailed (AH), and heteroscedastic, the robust method (GM-estimation) seems to consistently outperform OLS regression.
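The robust-regression idea the abstract describes can be sketched with one classic member of this family, Huber M-estimation fitted by iteratively reweighted least squares (IRLS). This is an illustrative numpy sketch on synthetic data, not the paper's GM-estimator or simulation design:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
y[:5] += 15.0                      # inject a few gross outliers
X = np.column_stack([np.ones(n), x])

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Huber M-estimation via IRLS: residuals within c robust-scale units
# keep weight 1; larger residuals are downweighted by c/|u|.
beta = beta_ols.copy()
c = 1.345
for _ in range(50):
    r = y - X @ beta
    s = np.median(np.abs(r - np.median(r))) / 0.6745   # MAD scale estimate
    u = r / s
    w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
    WX = X * w[:, None]
    beta = np.linalg.solve(WX.T @ X, WX.T @ y)          # weighted LS step
```

With the outliers present, the OLS intercept is pulled away from its true value of 1, while the Huber fit stays close to the data-generating coefficients.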
3

Van Schaeybroeck, B., and S. Vannitsem. "Post-processing through linear regression." Nonlinear Processes in Geophysics 18, no. 2 (2011): 147–60. http://dx.doi.org/10.5194/npg-18-147-2011.

Full text
Abstract:
Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast, and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz 63 system, whose model version is affected by both initial-condition and model errors. For short forecast lead times, the number and choice of predictors play an important role. Contrary to the other techniques, GM degrades as the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best-member OLS with noise). At long lead times, the regression schemes that yield the correct variability and the largest correlation between ensemble error and spread (EVMOS, TDTR) should be preferred.
4

Yazici, Murat. "A new approach called Weighted Least Squares Ratio (WLSR) Method to M-estimators." Journal of Information Sciences and Computing Technologies 5, no. 1 (2015): 399–414. https://doi.org/10.5281/zenodo.3968500.

Full text
Abstract:
Regression Analysis (RA) is an important statistical tool that is applied in most sciences. Ordinary Least Squares (OLS) is a traditional method in RA, and many regression techniques are based on OLS. The Weighted Least Squares (WLS) method is used iteratively in M-estimators. The Least Squares Ratio (LSR) method in RA gives better results than OLS, especially in the presence of outliers. This paper presents a new approach to M-estimators, called Weighted Least Squares Ratio (WLSR), and compares WLS and WLSR according to the mean absolute errors of the estimates of the regression parameters (mae β) and of the dependent variable (mae y).
5

Khotimah, Khusnul, Kusman Sadik, and Akbar Rizki. "KAJIAN REGRESI KEKAR MENGGUNAKAN METODE PENDUGA-MM DAN KUADRAT MEDIAN TERKECIL." Indonesian Journal of Statistics and Its Applications 4, no. 1 (2020): 97–115. http://dx.doi.org/10.29244/ijsa.v4i1.502.

Full text
Abstract:
Regression is a statistical method used to obtain the pattern of relations between two or more variables, expressed in a regression line equation. This line equation is usually estimated by ordinary least squares (OLS). However, OLS has the limitation of being highly sensitive to outliers. One solution to the outlier problem in regression analysis is to use a robust regression method. This study used the least median of squares (LMS) and multi-stage (MM) robust regression methods to analyse data containing outliers. The analysis was carried out on simulated data and on actual data. The simulation results across various scenarios show that the LMS and MM methods perform better than OLS on data containing outliers. The MM method has the lowest average bias of the parameter estimates, followed by LMS and then OLS. LMS has the smallest average root mean square error (RMSE) and the highest average R2, followed by MM and then OLS. Comparing the three methods on Indonesian rice production data from 2017, which contain 10% outliers, shows that LMS is the best method: it produces the smallest RMSE, 4.44, and the highest R2, 98%. The MM method is in second place, with an RMSE of 6.78 and an R2 of 96%. The OLS method produces the largest RMSE and the lowest R2, 23.15 and 58% respectively.
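The LMS idea of minimising the median rather than the sum of squared residuals has no closed-form solution; a standard approximation fits exact lines through many random elemental subsets and keeps the best one. A minimal sketch on synthetic data with 20% outliers (illustrative only, not the paper's rice-production data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60
x = rng.normal(size=n)
y = 3.0 + 1.5 * x + rng.normal(scale=0.3, size=n)
y[:12] -= 10.0                     # 20% gross outliers
X = np.column_stack([np.ones(n), x])

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Approximate LMS: fit exact lines through many random pairs of points
# and keep the fit with the smallest median squared residual.
best, best_med = None, np.inf
for _ in range(500):
    i, j = rng.choice(n, size=2, replace=False)
    if x[i] == x[j]:
        continue
    b1_cand = (y[j] - y[i]) / (x[j] - x[i])
    b0_cand = y[i] - b1_cand * x[i]
    med = np.median((y - b0_cand - b1_cand * x) ** 2)
    if med < best_med:
        best_med, best = med, (b0_cand, b1_cand)
b0, b1 = best
```

Because the median ignores the worst half of the residuals, the LMS line tracks the clean majority, while the OLS intercept is dragged roughly two units toward the outliers.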
6

Jana, Padrul, Dedi Rosadi, and Epha Diana Supandi. "COMPARISON OF ROBUST ESTIMATION ON MULTIPLE REGRESSION MODEL." BAREKENG: Jurnal Ilmu Matematika dan Terapan 17, no. 2 (2023): 0979–88. http://dx.doi.org/10.30598/barekengvol17iss2pp0979-0988.

Full text
Abstract:
This study aimed to compare the robustness of the OLS method with robust regression models on data containing outliers. The methods used for the robust regression models were M-estimation, MM-estimation, and S-estimation. The first step was to check the data for outliers. The data were then modelled, with and without outliers, using the OLS method and the M-, MM-, and S-estimations. For the OLS method, the results differed greatly between the models with and without outliers, as reflected in the intercepts and standard errors generated by the models. Meanwhile, the regression models with the M-, MM-, and S-estimations were quite stable and able to withstand the presence of outliers. Of the three estimations that were robust against the outliers, the MM-estimation was the best candidate because, in addition to a stable intercept estimate, it also had the smallest standard error, 61.9, in the resulting model.
7

Türkyılmaz, Serpil, and Kadriye Nurdanay Öztürk. "Analysis of Factors Affecting CO2 Emissions in Türkiye Using Quantile Regression." Sustainability 16, no. 22 (2024): 9634. http://dx.doi.org/10.3390/su16229634.

Full text
Abstract:
This study aims to show how the impact of factors on carbon dioxide (CO2) emissions differs at the quantile level and to demonstrate the superiority of the quantile regression method over the OLS method by using quantile regression and ordinary least squares (OLS) methods in order to examine the factors affecting CO2 emissions in Türkiye in depth. Covering the period 1990–2021, this study evaluates the relationship between CO2 emissions and GDP per capita growth, population growth, and renewable energy consumption. One of the important findings of the study is that the increase in the population ratio, which is insignificant according to the OLS method, positively affects CO2 emissions at the 0.25 quantile point. According to both OLS and quantile regression methods, GDP growth does not affect CO2 emissions, while renewable energy consumption has a significant and negative effect according to both models. These results demonstrate that economic growth has no discernible impact on CO2 emissions in Türkiye, while investments in renewable energy can significantly lower emissions and open the door for quantile regression to be used more widely in related research. Unlike traditional methods that focus only on the conditional mean, the quantile regression method provides a comprehensive framework for Türkiye’s sustainable development policies by exploring factor effects at different emission levels.
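The pinball (check) loss that underlies quantile regression can be illustrated in the intercept-only case, where minimising it recovers a sample quantile rather than the mean. A minimal numpy sketch on synthetic data (not the Türkiye emissions series):

```python
import numpy as np

rng = np.random.default_rng(3)
z = rng.exponential(scale=2.0, size=2001)

def pinball(q, z, tau):
    """Mean check (pinball) loss of candidate quantile q at level tau."""
    r = z - q
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

# Minimising pinball loss over candidates recovers the sample quantile,
# just as minimising squared loss recovers the mean (the OLS analogue).
# Quantile regression applies the same loss to a linear predictor.
tau = 0.25
cands = np.sort(z)
losses = [pinball(q, z, tau) for q in cands]
q_hat = cands[int(np.argmin(losses))]
```

For a full quantile regression (pinball loss on X @ beta), a linear-programming or IRLS solver is needed, e.g. statsmodels' `QuantReg`.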
8

Imrhan, Sheik N. "A Method of Developing more Realistic Predictive Models." Proceedings of the Human Factors Society Annual Meeting 30, no. 9 (1986): 945–49. http://dx.doi.org/10.1177/154193128603000922.

Full text
Abstract:
This study demonstrates a method of regression analysis that is better than the Ordinary Least Squares (OLS) method under certain conditions. Ridge regression, as it is called, is useful in situations where there are strong intercorrelations among the regressor variables, a condition called multicollinearity. When OLS regression is used to model the relationship between the response variable and the regressor variables, the model may exhibit some undesirable properties. The use of the ridge technique is demonstrated on ergonomics data exhibiting multicollinearity. An OLS model for the same data is also presented and compared with the ridge models. The ridge model was superior to the OLS model: it exhibited more realistic properties and predicted more accurately. It is therefore proposed as a valuable tool for the Human Factors/Ergonomics researcher in the development of regression models with highly intercorrelated regressor variables.
9

Ajewole, Kehinde Peter, and Adekunle David Adefolarin. "APPLICATION OF THE MAXIMUM LIKELIHOOD APPROACH TO ESTIMATION OF POLYNOMIAL REGRESSION MODEL." INTERNATIONAL JOURNAL OF MATHEMATICS AND COMPUTER RESEARCH 10, no. 05 (2022): 2693–700. https://doi.org/10.5281/zenodo.6576297.

Full text
Abstract:
The ordinary least squares (OLS) method has been extensively applied to the estimation of different classes of regression model under specific assumptions. However, the OLS estimation procedure does not perform well with outliers and small sample sizes. As a result, this work considered the application of the maximum likelihood method to the polynomial regression model using small sample sizes, as against the large-sample assumption in OLS. The efficiency of the maximum likelihood (ML) estimation technique was put to the test by comparing its model fit to that of OLS using real-world data sets. Analysing these data sets with both methods showed that ML outperformed OLS: it gave better estimates, with lower mean square error (MSE) values in all four data sets considered and higher coefficient of determination (R2) values. Although both methods resulted in an overall good fit, ML is more efficient than OLS because it gives lower MSE for small sample sizes.
10

Shariff, N. S. M., and H. M. B. Duzan. "A Comparison of OLS and Ridge Regression Methods in the Presence of Multicollinearity Problem in the Data." International Journal of Engineering & Technology 7, no. 4.30 (2018): 36. http://dx.doi.org/10.14419/ijet.v7i4.30.21999.

Full text
Abstract:
The presence of multicollinearity significantly leads to inconsistent parameter estimates in regression modeling. The common procedure in regression analysis, ordinary least squares (OLS), is not robust to the multicollinearity problem and will result in an inaccurate model. To solve this problem, a number of methods have been developed in the literature, and the most common is ridge regression. Although many studies propose a variety of methods to overcome the multicollinearity problem in regression analysis, this study proposes the simplest ridge regression model, based on linear combinations of the least squares regression coefficients of the independent variables, to determine the value of k (the ridge estimator in the ridge regression model). The performance of the proposed method is investigated and compared to OLS and some recent existing methods through Monte Carlo simulation studies. The proposed method produces findings similar to those of existing methods and outperforms OLS in the presence of multicollinearity in the regression modeling.
11

Ogunbona, Babafemi D., Folorunsho O. Balogun, and Kayode S. Famuagun. "Solving Multicolinearity Problem in a Linear Regression: A Comparative Study of Ordinary Least Squares and Partial Least Squares Regression." Journal of Institutional Research, Big Data Analytics and Innovation 1, no. 1 (2024): 66–75. https://doi.org/10.5281/zenodo.15556948.

Full text
Abstract:
The Ordinary Least Squares (OLS) estimator usually yields inefficient estimates when multicollinearity is present in a linear regression model. The inefficiency of OLS can be mitigated by Partial Least Squares Regression (PLSR). However, this method requires selecting latent variables in order to yield efficient estimates of the regression parameters. This paper proposes using weighted standard errors and ranked standard errors of the regression coefficients for latent-variable extraction, alongside model selection methods such as the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), Adjusted R Squared (ARS), and Standard Error of Regression Coefficient (SER). Monte Carlo experiments on a linear regression model with six explanatory variables were conducted one thousand (1000) times across eleven levels of multicollinearity [low (0.0, 0.2 and 0.4), moderate (0.6 and 0.8), high (0.95, 0.96, 0.97 and 0.98) and very high (0.99 and 0.999)] and five levels of sample size [small (20), medium (30 and 50) and large (100 and 250)]. Results indicated that while OLS estimates are preferred overall, Partial Least Squares with BIC becomes preferable at high multicollinearity levels. Additionally, the efficiency of OLS improves with larger sample sizes despite multicollinearity.
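A minimal sketch of PLSR in its simplest single-response form (PLS1 via the NIPALS iteration) on synthetic data. With all components retained, the PLS coefficients reproduce OLS exactly; truncating the number of components is what confers stability under multicollinearity:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 40, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=n)

# Centre the data (PLS is normally run on centred variables).
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# PLS1 via NIPALS: extract orthogonal score vectors t that maximise
# covariance with y, deflating X and y after each component.
E, f = Xc.copy(), yc.copy()
W, P, Q = [], [], []
for _ in range(p):                 # all p components -> OLS-equivalent fit
    w = E.T @ f
    w /= np.linalg.norm(w)         # weight vector
    t = E @ w                      # score vector
    pl = E.T @ t / (t @ t)         # X loading
    q = f @ t / (t @ t)            # y loading
    E -= np.outer(t, pl)
    f -= q * t
    W.append(w); P.append(pl); Q.append(q)
W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)

# Regression coefficients implied by the extracted components.
beta_pls = W @ np.linalg.solve(P.T @ W, Q)
beta_ols = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)
```

Stopping the loop after fewer than p components gives the biased but lower-variance fits that the paper's selection criteria (AIC, BIC, etc.) choose among.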
12

Kesavulu, Kesavulu, V. Pavankumari, J. Anil Kumar, Akkyam Vani, Asif Alisha S., and A. Srinivasulu. "Parameter Estimation in Multiple Linear Regression: A Neutrosophic Perspective with the Simple Averaging Method (SAM)." International Journal of Neutrosophic Science 26, no. 2 (2025): 215–28. https://doi.org/10.54216/ijns.260216.

Full text
Abstract:
Regression modeling is a significant statistical tool for quantifying and understanding the nature of the relations between predictor and response variables. Routine parameter estimation procedures, like OLS and ML, rely heavily on the assumption of normality in the data, which does not hold for many real-world data scenarios. This paper presents a Neutrosophic approach to the estimation of parameters in multiple linear regression models, making use of Neutrosophic principles to treat the uncertainties, indeterminacies, and inconsistencies in actual data; the proposed method is called the Simple Averaging Method (SAM). It is a robust alternative to traditional methods and provides reliable results even when the normality assumptions do not hold. SAM's performance is tested on real crime data from the USA and demonstrates its ability to deal with complex datasets. A comparative analysis between the OLS model and SAM is carried out via the RMSE and MAD metrics. The results show that SAM significantly outperforms OLS, with an RMSE of 34.37598 against 58.05248 for OLS. Graphical analysis further confirms SAM's advantage over OLS. Incorporating neutrosophic logic addresses critical challenges of regression modeling, especially when standard assumptions are violated.
13

Orenti, Annalisa, and Ettore Marubini. "Performance of Robust Regression Methods in Real-Time Polymerase Chain Reaction Calibration." International Journal of Biological Markers 29, no. 4 (2014): 317–27. http://dx.doi.org/10.5301/jbm.5000067.

Full text
Abstract:
The ordinary least squares (OLS) method is routinely used to estimate the unknown concentration of nucleic acids in a given solution by means of calibration. However, when outliers are present it could appear sensible to resort to robust regression methods. We analyzed data from an External Quality Control program concerning quantitative real-time PCR and we found that 24 laboratories out of 40 presented outliers, which occurred most frequently at the lowest concentrations. In this article we investigated and compared the performance of the OLS method, the least absolute deviation (LAD) method, and the biweight MM-estimator in real-time PCR calibration via a Monte Carlo simulation. Outliers were introduced by replacement contamination. When contamination was absent the coverages of OLS and MM-estimator intervals were acceptable and their widths small, whereas LAD intervals had acceptable coverages at the expense of higher widths. In the presence of contamination we observed a trade-off between width and coverage: the OLS performance got worse, the MM-estimator intervals widths remained short (but this was associated with a reduction in coverages), while LAD intervals widths were constantly larger with acceptable coverages at the nominal level.
14

Mikusheva, Anna, and Mikkel Sølvsten. "Linear regression with weak exogeneity." Quantitative Economics 16, no. 2 (2025): 367–403. https://doi.org/10.3982/qe2622.

Full text
Abstract:
This paper studies linear time‐series regressions with many regressors. Weak exogeneity is the most used identifying assumption in time series. Weak exogeneity requires the structural error to have zero conditional expectation given present and past regressor values, allowing errors to correlate with future regressor realizations. We show that weak exogeneity in time‐series regressions with many controls may produce substantial biases and render the least squares (OLS) estimator inconsistent. The bias arises in settings with many regressors because the normalized OLS design matrix remains asymptotically random and correlates with the regression error when only weak (but not strict) exogeneity holds. This bias' magnitude increases with the number of regressors and their average autocorrelation. We propose an innovative approach to bias correction that yields a new estimator with improved properties relative to OLS. We establish consistency and conditional asymptotic Gaussianity of this new estimator and provide a method for inference.
15

Tang, Yi, Arshad Ali, and Li-Huan Feng. "Bayesian model predicts the aboveground biomass of Caragana microphylla in sandy lands better than OLS regression models." Journal of Plant Ecology 13, no. 6 (2020): 732–37. http://dx.doi.org/10.1093/jpe/rtaa065.

Full text
Abstract:
Aims In forest ecosystems, different types of regression models have been frequently used for the estimation of aboveground biomass, with Ordinary Least Squares (OLS) regression models the most common prediction models. Yet the relative performance of Bayesian and OLS models in predicting the aboveground biomass of shrubs, especially multi-stemmed shrubs, has been less studied in forests. Methods In this study, we developed biomass prediction models for Caragana microphylla Lam., a widely distributed multi-stemmed shrub that contributes to the reduction of wind erosion and the fixation of sand dunes in the Horqin Sand Land, one of the largest sand lands in China. We developed six types of formulations under the framework of the regression models and then selected the best model based on specific criteria. Consequently, we estimated the parameters of the best model with the OLS and Bayesian methods on training and test data under different sample sizes with the bootstrap method. Lastly, we compared the performance of the OLS and Bayesian models in predicting the aboveground biomass of C. microphylla. Important Findings The performance of the allometric equation (power = 1) was best among the six types of equations, even though all of the models were significant. The results showed that the mean squared error on the test data was lower with both the non-informative and the informative prior Bayesian methods than with the OLS method. Among the tested predictors (i.e. plant height and basal diameter), basal diameter was not a significant predictor in either the OLS or the Bayesian methods, indicating that suitable predictors and well-fitted models should be carefully considered. This study highlights that Bayesian methods, the bootstrap method and the type of allometric equation could help to improve model accuracy in predicting shrub biomass in sandy lands.
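A conjugate-prior sketch of the Bayesian-versus-OLS comparison on a log-scale allometric model. All numbers here (the prior precision `lam`, the noise sd `sigma`, the synthetic diameters) are illustrative assumptions, not the paper's C. microphylla data:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 15                                  # small sample, where priors help most
d = rng.uniform(0.5, 4.0, size=n)       # basal diameter (hypothetical units)
# Allometric model on the log scale: log(mass) = a + b*log(d) + noise.
y = 0.4 + 1.8 * np.log(d) + rng.normal(scale=0.3, size=n)
X = np.column_stack([np.ones(n), np.log(d)])

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Conjugate Bayesian posterior mean with a Gaussian prior
# beta ~ N(m0, (1/lam) I) and known noise sd sigma:
sigma, lam = 0.3, 1.0
m0 = np.zeros(2)
A = X.T @ X / sigma**2 + lam * np.eye(2)
beta_bayes = np.linalg.solve(A, X.T @ y / sigma**2 + lam * m0)
```

With m0 = 0 this posterior mean is algebraically a ridge estimate: it shrinks the OLS solution toward the prior mean, which is where the reduced test-set error on small samples comes from.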
16

Anila, M., and G. Pradeepini. "Least Square Regression for Prediction Problems in Machine Learning using R." International Journal of Engineering & Technology 7, no. 3.12 (2018): 960. http://dx.doi.org/10.14419/ijet.v7i3.12.17612.

Full text
Abstract:
The most commonly used prediction technique is Ordinary Least Squares Regression (OLS regression). It has been applied in many fields, such as statistics, finance, medicine, psychology and economics. Many people, especially data scientists, use this technique without enough training in when and why it can or cannot be applied. It is not an easy task to explain why least squares regression [1] has faced so much criticism when applied in practice. In this paper, we first cover the fundamentals of linear regression and OLS regression, along with the popularity of the LS method; we then present our analysis of the difficulties and pitfalls that arise when the OLS method is applied; finally, we present some techniques for overcoming these problems.
17

Daniel, Farida. "MENGATASI PENCILAN PADA PEMODELAN REGRESI LINEAR BERGANDA DENGAN METODE REGRESI ROBUST PENAKSIR LMS." BAREKENG: Jurnal Ilmu Matematika dan Terapan 13, no. 3 (2019): 145–56. http://dx.doi.org/10.30598/barekengvol13iss3pp145-156ar884.

Full text
Abstract:
Ordinary Least Squares (OLS) is a frequently used method for estimating parameters. The OLS estimator is not robust to the presence of outliers, so the estimates become inappropriate. Least Median of Squares (LMS) is a robust estimator in the presence of outliers and has a high breakdown value. LMS estimates the parameters by minimizing the median of the squared residuals. The purpose of this study is to obtain a regression equation that is better, for data containing outliers, than the equation obtained using OLS. The first step is to check the data for outliers, and then to find the regression equation with the LMS method. This study used the stackloss data, and in the parameter estimation for these data the LMS estimator showed better results than the OLS estimator, because the regression equation from the LMS method has a smaller Mean Absolute Percentage Error (MAPE).
18

Pati, Kafi Dano. "Using Robust Ridge Regression Diagnostic Method to Handle Multicollinearity Caused High Leverage Points." Academic Journal of Nawroz University 10, no. 1 (2021): 326. http://dx.doi.org/10.25007/ajnu.v10n1a578.

Full text
Abstract:
Statistics practitioners have depended on the ordinary least squares (OLS) method in the linear regression model for generations because of its optimal properties and simplicity of calculation. However, the OLS estimators can be strongly affected by multicollinearity, a near-linear dependency between two or more independent variables in the regression model. Even though the OLS estimates remain unbiased in the presence of multicollinearity, they give inaccurate predictions of the dependent variable, with inflated standard errors of the estimated parameter coefficients of the regression model. It is now evident that high leverage points, which are outliers in the x-direction, are the prime source of collinearity-influential observations. In this paper, we propose an alternative regression method for estimating the regression coefficients in the presence of multiple high leverage points that cause the multicollinearity problem. This procedure uses the ordinary least squares estimates of the parameters as the initial estimates, followed by a ridge regression estimate. We incorporate the Least Trimmed Squares (LTS) robust regression estimate to downweight the effects of multiple high leverage points, which leads to a reduction of the effects of multicollinearity. The results seem to suggest that RLTS gives a substantial improvement over ridge regression.
19

Khan, Sajid Ali, Sayyad Khurshid, Shabnam Arshad, and Owais Mushtaq. "Bias Estimation of Linear Regression Model with Autoregressive Scheme using Simulation Study." Journal of Mathematical Analysis and Modeling 2, no. 1 (2021): 26–39. http://dx.doi.org/10.48185/jmam.v2i1.131.

Full text
Abstract:
In regression modeling, first-order autocorrelated errors are often a problem, and Ordinary Least Squares (OLS) is then no longer the best estimator; Generalized Least Squares (GLS) estimation provides a better alternative. A Monte Carlo simulation illustrates that regression estimation using data transformed according to the GLS method provides estimates of the regression coefficients that are superior to the OLS estimates. For GLS, we observe that at sample size 200 and σ = 3 with correlation level 0.90 the bias of the GLS β0 is −0.1737, the smallest of all the bias estimates, while at sample size 200 and σ = 1 with correlation level 0.90 the bias of the GLS β0 is 8.6802, the largest at all levels. Similarly, the minimum and maximum bias values of the OLS and GLS estimates of β1 are −0.0816 and −7.6101, and 0.1371 and 0.1383, respectively. The average values of the parameters under OLS and GLS estimation are computed for different sample sizes and correlation levels. It is found that for large samples both methods give similar results, but for small sample sizes GLS fits better than OLS.
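The GLS transformation for known AR(1) errors can be sketched as Prais-Winsten quasi-differencing of the data. Here rho is treated as known for simplicity; in practice it must be estimated, e.g. from the OLS residuals:

```python
import numpy as np

rng = np.random.default_rng(6)
n, rho = 200, 0.9
x = rng.normal(size=n)

# AR(1) errors: e_t = rho * e_{t-1} + v_t.
e = np.zeros(n)
v = rng.normal(size=n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + v[t]
y = 2.0 + 1.0 * x + e
X = np.column_stack([np.ones(n), x])

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# GLS via Prais-Winsten: quasi-difference each row so that the
# transformed errors are white noise, rescaling the first observation.
Xs = X[1:] - rho * X[:-1]
ys = y[1:] - rho * y[:-1]
s = np.sqrt(1 - rho**2)
Xs = np.vstack([s * X[0], Xs])
ys = np.concatenate([[s * y[0]], ys])
beta_gls = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)
```

Both estimators are unbiased here; the gain from GLS is in efficiency, i.e. the transformed regression has much smaller sampling variance for the coefficients.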
20

Fauzi, Asep Andri, Agus M. Soleh, and Anik Djuraidah. "KAJIAN SIMULASI PERBANDINGAN METODE REGRESI KUADRAT TERKECIL PARSIAL, SUPPORT VECTOR MACHINE, DAN RANDOM FOREST." Indonesian Journal of Statistics and Its Applications 4, no. 1 (2020): 203–15. http://dx.doi.org/10.29244/ijsa.v4i1.610.

Full text
Abstract:
Highly correlated predictors and nonlinear relationships between the response and predictors can degrade the performance of predictive modeling, especially when using the ordinary least squares (OLS) method. A simple way to address this problem is to use another method, such as Partial Least Squares Regression (PLSR), Support Vector Regression with the Radial Basis Function kernel (SVR-RBF), or Random Forest Regression (RFR). The purpose of this study is to compare OLS, PLSR, SVR-RBF, and RFR using simulated data. The methods were evaluated by the root mean square error of prediction (RMSEP). The results showed that in the linear model SVR-RBF and RFR have large RMSEP; OLS and PLSR are better than SVR-RBF and RFR, and PLSR provides much more stable predictions than OLS in the case of highly correlated predictors and small sample size. On nonlinear data, RFR produced the smallest RMSEP when the data contain highly correlated predictors.
21

CAHYANI, NI WAYAN YUNI, I. GUSTI AYU MADE SRINADI, and MADE SUSILAWATI. "PERBANDINGAN TRANSFORMASI BOX-COX DAN REGRESI KUANTIL MEDIAN DALAM MENGATASI HETEROSKEDASTISITAS." E-Jurnal Matematika 4, no. 1 (2015): 8. http://dx.doi.org/10.24843/mtk.2015.v04.i01.p081.

Full text
Abstract:
Ordinary least squares (OLS) is a method that can be used to estimate the parameters in linear regression analysis. Some assumptions should be satisfied for OLS; one of these is homoscedasticity, that is, that the variance of the error is constant. If the variance of the error is not constant, this is called heteroscedasticity. The presence of heteroscedasticity can make estimation with OLS inefficient, so heteroscedasticity must be dealt with. Among the methods that can be used to overcome heteroscedasticity are the Box-Cox power transformation and median quantile regression. This research compared the Box-Cox power transformation and median quantile regression for overcoming heteroscedasticity. Applying the Box-Cox power transformation to OLS gave a greater R2, a smaller RMSE, and narrower confidence intervals; it can therefore be concluded that applying the Box-Cox power transformation to OLS is better than median quantile regression for overcoming heteroscedasticity.
APA, Harvard, Vancouver, ISO, and other styles
22

Lin, Jingying, and Caio Almeida. "American option pricing with machine learning: An extension of the Longstaff-Schwartz method." Brazilian Review of Finance 19, no. 3 (2021): 85–109. http://dx.doi.org/10.12660/rbfin.v19n3.2021.83815.

Full text
Abstract:
Pricing American options accurately is of great theoretical and practical importance. We propose using machine learning methods, including support vector regression and classification and regression trees. These more advanced techniques extend the traditional Longstaff-Schwartz approach, replacing the OLS regression step in the Monte Carlo simulation. We apply our approach to both simulated data and market data from the S&P 500 Index option market in 2019. Our results suggest that support vector regression can be an alternative to the existing OLS-based pricing method, requiring fewer simulations and reducing the vulnerability to misspecification of basis functions.
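The Longstaff-Schwartz backward-induction scheme the authors extend can be sketched in a few lines; the polynomial OLS fit below is exactly the regression step that, per the abstract, could be replaced by support vector regression or regression trees (all contract parameters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal Longstaff-Schwartz sketch for a Bermudan put under geometric
# Brownian motion. The np.polyfit call is the OLS regression of the
# discounted continuation value on basis functions of the stock price.
s0, strike, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_steps, n_paths = 50, 20_000
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate GBM paths (columns are times dt, 2*dt, ..., T).
z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
s = s0 * np.exp(log_paths)

payoff = np.maximum(strike - s[:, -1], 0.0)  # cash flow if held to maturity
for t in range(n_steps - 2, -1, -1):
    payoff *= disc
    itm = strike - s[:, t] > 0.0  # regress only on in-the-money paths
    if itm.sum() > 10:
        coefs = np.polyfit(s[itm, t], payoff[itm], deg=2)
        continuation = np.polyval(coefs, s[itm, t])
        exercise = strike - s[itm, t]
        payoff[itm] = np.where(exercise > continuation, exercise, payoff[itm])

price = disc * payoff.mean()
```

Swapping `np.polyfit`/`np.polyval` for a fitted `SVR` or tree predictor is the one-line change the paper studies; the simulated price should sit a little above the corresponding European (Black-Scholes) put value of about 5.57.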
APA, Harvard, Vancouver, ISO, and other styles
23

Schulte-Hostedde, Albrecht I., Bertram Zinner, John S. Millar, and Graham J. Hickling. "RESTITUTION OF MASS–SIZE RESIDUALS: VALIDATING BODY CONDITION INDICES." Ecology 86, no. 1 (2005): 155–63. https://doi.org/10.5281/zenodo.13510421.

Full text
Abstract:
(Uploaded by Plazi for the Bat Literature Project) Body condition can have important fitness consequences, but measuring body condition of live animals from wild populations has been the subject of much recent debate. Using the residuals from a regression of body mass on a linear measure of body size is one of the most common methods of measuring condition and has been used in many vertebrate taxa. Recently, the use of this method has been criticized because assumptions are likely violated. We tested several assumptions regarding the use of this method with body composition and morphometric data from five species of small mammals and with statistical simulations. We tested the assumptions that the relationship between body mass and body size is linear, and that the proportion of mass associated with energy reserves is independent of body size. In addition, we tested whether the residuals from reduced major axis (RMA) regression or major axis (MA) regression performed better than the residuals from ordinary least squares (OLS) regression as indices of body condition. We found no evidence of nonlinear relationships between body mass and body size. Relative energy reserves (fat and lean dry mass) were generally independent or weakly dependent on body size. Residuals from MA and RMA regression consistently explained less variation in body composition than OLS regression. Using statistical simulations, we compared the effects of violations of the assumption that true condition and residual indices are independent of body size on the OLS, MA, and RMA procedures and found that OLS performed better than the RMA and MA procedures. Despite recent criticisms of residuals from mass–size OLS regressions, these indices of body condition appear to satisfy critical assumptions. 
Although some caution is warranted when using residuals, especially when both interindividual variation in body size and measurement error are high, we found no reason to reject OLS residuals as legitimate indices of body condition.
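The residual condition index the authors validate is simple to construct: regress body mass on a linear size measure by OLS and take the residuals, which are orthogonal to size by construction (the simulated masses and sizes below are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(16)

# Residual body-condition index: a positive residual means an animal is
# heavier than expected for its size.
n = 150
size = rng.uniform(50, 120, n)              # e.g. body length in mm
condition = rng.normal(scale=3.0, size=n)   # "true" condition, size-independent
mass = 5.0 + 0.4 * size + condition

A = np.column_stack([np.ones(n), size])
coef, *_ = np.linalg.lstsq(A, mass, rcond=None)
index = mass - A @ coef                     # OLS residuals

# The index should track true condition and be uncorrelated with size.
r_condition = float(np.corrcoef(index, condition)[0, 1])
r_size = float(np.corrcoef(index, size)[0, 1])
```

This mirrors the paper's key assumption check: the index recovers size-independent variation in condition while remaining uncorrelated with the size measure.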
APA, Harvard, Vancouver, ISO, and other styles
25

Mahaboob, B., B. Venkateswarlu, C. Narayana, J. Ravi sankar, and P. Balasiddamuni. "A Monograph on Nonlinear Regression Models." International Journal of Engineering & Technology 7, no. 4.10 (2018): 543. http://dx.doi.org/10.14419/ijet.v7i4.10.21277.

Full text
Abstract:
This research article uses matrix calculus techniques to study the least squares estimation of nonlinear regression models, the sampling distributions of the nonlinear least squares estimators of the regression parameter vector and error variance, and the testing of general nonlinear hypotheses on the parameters of a nonlinear regression model. Arthipova Irina et al. [1] discussed some examples of different nonlinear models and the application of OLS (Ordinary Least Squares). M. A. Tabati et al. [2] proposed a robust alternative to the OLS nonlinear regression method that provides accurate parameter estimates when outliers and/or influential observations are present. Xu Zheng et al. [3] presented new parametric tests for heteroscedasticity in nonlinear and nonparametric models.
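Nonlinear least squares, the core technique the article studies, can be sketched with SciPy's `curve_fit`, which minimizes the residual sum of squares over the model parameters and returns their estimated covariance (the exponential-decay model and its true parameter values are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

# A simple nonlinear regression model: y = a * exp(-b * x) + error.
def model(x, a, b):
    return a * np.exp(-b * x)

x = np.linspace(0, 5, 80)
y = model(x, 2.5, 1.3) + rng.normal(scale=0.05, size=x.size)

# curve_fit performs nonlinear least squares starting from p0 and also
# returns the estimated covariance matrix of the estimators.
(a_hat, b_hat), cov = curve_fit(model, x, y, p0=(1.0, 1.0))
```

The estimated covariance matrix `cov` is what sampling-distribution results of the kind discussed in the article are built on.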
APA, Harvard, Vancouver, ISO, and other styles
26

Isgiarahmah, Afryda, Rito Goejantoro, and Yuki Novia Nasution. "Estimasi Parameter Model Regresi Linier Berganda dengan Pendekatan Bayes Menggunakan Prior Pseudo." EKSPONENSIAL 12, no. 1 (2021): 1. http://dx.doi.org/10.30872/eksponensial.v12i1.753.

Full text
Abstract:
The parameters of a regression model can be estimated using the Ordinary Least Squares (OLS) method, which must fulfill the BLUE assumptions. Besides OLS, another method that can be used to estimate the regression parameters is the Bayes method. Parameter estimates using the OLS method and the Bayes method have been widely applied in development studies, among them economic development, namely the Human Development Index (HDI). The purpose of this study is to obtain a multiple linear regression model, and its interpretation, stating the relationship between per capita expenditure, mean years of schooling, life expectancy, and expected years of schooling and the Human Development Index (HDI), using the Bayes approach with pseudo priors.
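The abstract's pseudo-prior construction is not spelled out, but the general mechanics of Bayesian linear regression can be sketched with a generic conjugate normal prior and known error variance, a stand-in for the paper's specific prior (all values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(15)

# Conjugate Bayesian linear regression with a normal prior on the
# coefficients; the posterior mean blends the OLS estimate with the prior.
n, p = 80, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -1.0])
sigma2 = 1.0
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

prior_mean = np.zeros(p)
prior_prec = np.eye(p) * 0.1   # weak prior precision

# Posterior: N(m, V) with V = (X'X/s2 + P0)^-1 and m = V (X'y/s2 + P0 m0).
V = np.linalg.inv(X.T @ X / sigma2 + prior_prec)
m = V @ (X.T @ y / sigma2 + prior_prec @ prior_mean)
```

With a weak prior the posterior mean `m` is close to the OLS estimate; a more informative (e.g. pseudo) prior pulls it toward `prior_mean`, which is the trade-off such studies examine.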
APA, Harvard, Vancouver, ISO, and other styles
27

Srinadi, I. Gusti Ayu Made. "Model Partial Least Square Regression (PLSR) Pengaruh Bidang Pendidikan dan Ekonomi Terhadap Tingkat Kemiskinan di Indonesia." Jurnal Matematika 7, no. 1 (2017): 67. http://dx.doi.org/10.24843/jmat.2017.v07.i01.p83.

Full text
Abstract:
Partial Least Squares Regression (PLSR) is one of the methods applied to estimate multiple linear regression models when the ordinary least squares (OLS) method cannot be used. OLS generates invalid model estimates when multicollinearity occurs or when the number of independent variables is greater than the number of observations. Under conditions where OLS can be applied, we want to know the performance of the PLSR method. This study aims to determine the PLSR model of the influence of the literacy rate, average school duration, school enrollment rate, income per capita, and open unemployment rate on the level of poverty, measured by the percentage of poor people in Indonesia in 2015. In the model estimated with OLS, only the literacy rate variable is included, with a coefficient of determination R2 = 32.52%. The PLSR model estimated with leave-one-out cross-validation and one selected component has an R2 of 33.23%. Both models show a negative relationship between poverty and the literacy rate. A higher literacy rate will reduce the poverty level, indicating that the success of the Indonesian government in educational development will support its success in reducing poverty.
APA, Harvard, Vancouver, ISO, and other styles
28

Haupt, Harry, Friedrich Lösel, and Mark Stemmler. "Quantile Regression Analysis and Other Alternatives to Ordinary Least Squares Regression." Methodology 10, no. 3 (2014): 81–91. http://dx.doi.org/10.1027/1614-2241/a000077.

Full text
Abstract:
Data analyses by classical ordinary least squares (OLS) regression techniques often employ unrealistic assumptions, fail to recognize the source and nature of heterogeneity, and are vulnerable to extreme observations. Therefore, this article compares robust and non-robust M-estimator regressions in a statistical demonstration study. Data from the Erlangen-Nuremberg Development and Prevention Project are used to model risk factors for physical punishment by fathers of 485 elementary school children. The Corporal Punishment Scale of the Alabama Parenting Questionnaire was the dependent variable. Fathers’ aggressiveness, dysfunctional parent-child relations, various other parenting characteristics, and socio-demographic variables served as predictors. Robustness diagnostics suggested the use of trimming procedures and outlier diagnostics suggested the use of robust estimators as an alternative to OLS. However, a quantile regression analysis provided more detailed insights beyond the measures of central tendency and detected sources of considerable heterogeneity in the risk structure of father’s corporal punishment. Advantages of this method are discussed with regard to methodological and content issues.
APA, Harvard, Vancouver, ISO, and other styles
29

Mahaboob, B., B. Venkateswarlu, C. Narayana, J. Ravi sankar, and P. Balasiddamuni. "A Treatise on Ordinary Least Squares Estimation of Parameters of Linear Model." International Journal of Engineering & Technology 7, no. 4.10 (2018): 518. http://dx.doi.org/10.14419/ijet.v7i4.10.21216.

Full text
Abstract:
This research article primarily focuses on the estimation of the parameters of a linear regression model by the method of ordinary least squares and presents the Gauss-Markov theorem for linear estimation, which is useful for finding the BLUE of a linear parametric function in the classical linear regression model. A proof of the generalized Gauss-Markov theorem for linear estimation is presented in this memoir. Ordinary Least Squares (OLS) regression is one of the major techniques applied to analyse data and forms the basis of many other techniques, e.g. ANOVA and generalized linear models [1]. The use of this method can be extended with dummy variable coding to include grouped explanatory variables [2] and data transformation models [3]. OLS regression is particularly powerful as it is relatively easy to check the model assumptions such as linearity, constant variance, and the effect of outliers using simple graphical methods [4]. J. T. Kilmer et al. [5] applied the OLS method to evolutionary studies and studies of allometry.
APA, Harvard, Vancouver, ISO, and other styles
30

R, Aditya Setyawan, Mustika Hadijati, and Ni Wayan Switrayni. "Analisis Masalah Heteroskedastisitas Menggunakan Generalized Least Square dalam Analisis Regresi." EIGEN MATHEMATICS JOURNAL 1, no. 2 (2019): 61. http://dx.doi.org/10.29303/emj.v1i2.43.

Full text
Abstract:
Regression analysis is a statistical method that allows users to analyze the influence of one or more independent variables (X) on a dependent variable (Y). The method most commonly used for estimating linear regression parameters is Ordinary Least Squares (OLS). In practice, however, there is often a problem of heteroscedasticity, namely that the variance of the errors is not constant across all values of the independent variable X. This makes the OLS method less effective. To overcome this, a parameter estimation method that adds a weight to each observation can be used, namely the Generalized Least Squares (GLS) method. This study aims to examine the use of the GLS method in overcoming heteroscedasticity in regression analysis and to compare the estimation results of the OLS method with those of the GLS method in the presence of heteroscedasticity. The results show that the GLS method maintains the unbiasedness and consistency of the estimator and is able to overcome the problem of heteroscedasticity, so that the GLS method is more effective than the OLS method.
APA, Harvard, Vancouver, ISO, and other styles
31

Sadek, Amjed Mohammed, and Lekaa Ali Mohammed. "Evaluation of the Performance of Kernel Non-parametric Regression and Ordinary Least Squares Regression." JOIV : International Journal on Informatics Visualization 8, no. 3 (2024): 1352. http://dx.doi.org/10.62527/joiv.8.3.2430.

Full text
Abstract:
Researchers need to understand the differences between parametric and nonparametric regression models and how they work with the available information about the relationship between response and explanatory variables and the distribution of the random errors. This paper proposes a new kernel function for nonparametric regression and employs it with the Nadaraya-Watson kernel estimator method alongside the Gaussian kernel function. The proposed kernel function (AMS) is then compared with the Gaussian kernel and with the traditional parametric method, ordinary least squares (OLS). The objective of this study is to examine the effectiveness of nonparametric regression and identify the best-performing model when employing the Nadaraya-Watson kernel estimator with the proposed kernel function (AMS), the Gaussian kernel, and the ordinary least squares (OLS) method. Additionally, it determines which method yields the most accurate results when analyzing nonparametric regression models and provides valuable insights for practitioners looking to apply these techniques in real-world scenarios. Criteria such as generalized cross-validation (GCV), mean square error (MSE), and the coefficient of determination are used to select the most efficient estimated model. Simulated data were used to evaluate the performance and efficiency of the estimators at different sample sizes. The simulation results illustrate that the Nadaraya-Watson kernel estimator using the proposed kernel function (AMS) exhibited superior performance compared to the other methods. The coefficients of determination indicate that the highest values attained were 98%, 99%, and 99%. The proposed function (AMS) yielded the lowest MSE and GCV values across all samples. This suggests that the model can generate precise predictions and enhance performance on the focused data.
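The Nadaraya-Watson estimator with a Gaussian kernel, the baseline the paper compares against, is a locally weighted average and fits in a few lines of numpy (the sine-shaped test function and bandwidth are illustrative assumptions; the paper's AMS kernel is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(6)

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson estimator with a Gaussian kernel:
    m(x) = sum_i K((x - x_i)/h) * y_i / sum_i K((x - x_i)/h)."""
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    k = np.exp(-0.5 * u**2)  # Gaussian kernel (constants cancel in the ratio)
    return (k * y_train).sum(axis=1) / k.sum(axis=1)

# Nonlinear signal that a straight-line OLS fit cannot capture.
n = 300
x = rng.uniform(0, 2 * np.pi, n)
y = np.sin(x) + rng.normal(scale=0.2, size=n)

grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)
fit_nw = nadaraya_watson(x, y, grid, bandwidth=0.3)

# OLS line for comparison.
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)
fit_ols = coef[0] + coef[1] * grid

mse_nw = float(np.mean((fit_nw - np.sin(grid)) ** 2))
mse_ols = float(np.mean((fit_ols - np.sin(grid)) ** 2))
```

On nonlinear data the kernel estimator's MSE is far below that of the OLS line, which is the kind of comparison the paper's GCV/MSE criteria formalize.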
APA, Harvard, Vancouver, ISO, and other styles
32

Ogunmola, Adeniyi Oyewole, and Benjamin Ekene Okoye. "Application of Quantile Regression and Ordinary Least Squares Regression in Modeling Body Mass Index in Federal Medical Centre Jalingo, Nigeria." Journal of Multidisciplinary Science: MIKAILALSYS 3, no. 2 (2025): 552–58. https://doi.org/10.58578/mikailalsys.v3i2.5322.

Full text
Abstract:
Body mass index (BMI) is a measure of the nutritional status of an individual. Malnutrition is a leading public health problem in developing countries like Nigeria and a major cause of morbidity and mortality. In this study, body mass index is modeled using the ordinary least squares method and the quantile regression method. Data were collected from the Antiretroviral Therapy Clinic in Federal Medical Centre, Jalingo. The variables in the data are the body mass index, age, weight, height, sex, and occupation of the patients. Results showed that the ordinary least squares regression and the quantile regressions at the 25th, 50th (median), 75th, and 95th percentiles fit the data. Weight, age, sex, and height of patients are significant in determining the BMI of the patients when the OLS method is applied, while only weight, sex, and height are significant under the quantile regression method. It is also found that the OLS method fits the data better than the quantile regression method based on AIC and MSE.
APA, Harvard, Vancouver, ISO, and other styles
33

Chi, Wuchun, Huichi Huang, and Hong Xie. "A quantile regression analysis on corporate governance and the cost of bank loans: a research note." Review of Accounting and Finance 14, no. 1 (2015): 2–19. http://dx.doi.org/10.1108/raf-12-2012-0126.

Full text
Abstract:
Purpose – This paper aims to investigate whether there is heterogeneity in the relationship between the bank loan interest rate and its determinants using the quantile regression method and to reconcile some conflicting findings in prior literature. Design/methodology/approach – First, the effects of 18 determinants were examined on the bank loan interest rate using the ordinary least squares method (OLS). Second, it was investigated whether the relationship between the loan rate and its determinants is heterogeneous across quantiles of loan rates using the quantile regression method. Findings – Considerable heterogeneity was found in the relationship between the loan rate and its determinants. Specifically, a determinant that is beneficial for the bank loan rate, on average, as revealed by the OLS method may become unimportant or even detrimental for firms located at extremely high or low loan rate quantiles. By revealing extreme heterogeneity in the relationship between the loan rate and some of its determinants, the authors potentially explain two conflicting findings in prior literature. Originality/value – The conventional OLS method masks the heterogeneity in the relationship between the bank loan interest rate and its determinants. Quantile regression can be used to supplement the OLS estimates to gain a more detailed and complete picture of the relationship between the dependent variable and explanatory variables.
APA, Harvard, Vancouver, ISO, and other styles
34

Mehany, Taha, José M. González-Sáiz, and Consuelo Pizarro. "The Quality Prediction of Olive and Sunflower Oils Using NIR Spectroscopy and Chemometrics: A Sustainable Approach." Foods 14, no. 13 (2025): 2152. https://doi.org/10.3390/foods14132152.

Full text
Abstract:
This study presents a novel approach combining near-infrared (NIR) spectroscopy with multivariate calibration to develop simplified yet robust regression models for evaluating the quality of various edible oils. Using a reduced number of NIR wavelengths selected via the stepwise decorrelation method (SELECT) and ordinary least squares (OLS) regression, the models quantify pigments (carotenoids and chlorophyll), antioxidant activity, and key sensory attributes (rancid, fruity green, fruity ripe, bitter, and pungent) in nine extra virgin olive oil (EVOO) varieties. The dataset also includes low-quality olive oils (e.g., refined and pomace oils, supplemented or not with hydroxytyrosol) and sunflower oils, both before and after deep-frying. SELECT improves model performance by identifying key wavelengths—up to 30 out of 700—and achieves high correlation coefficients (R = 0.86–0.96) with low standard errors. The number of latent variables ranges from 26 to 30, demonstrating adaptability to different oil properties. The best models yield low leave-one-out (LOO) prediction errors, confirming their accuracy (e.g., 1.36 mg/kg for carotenoids and 0.88 for rancidity). These results demonstrate that SELECT–OLS regression combined with NIR spectroscopy provides a fast, cost-effective, and reliable method for assessing oil quality under diverse processing conditions, including deep-frying, making it highly suitable for quality control in the edible oils industry.
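The SELECT idea of keeping only a handful of informative wavelengths before an OLS fit can be illustrated with a generic greedy forward selection, a simplified stand-in for the stepwise decorrelation procedure the paper uses (the informative column indices and coefficients below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

# Greedy forward selection for OLS: at each step add the predictor that
# most reduces the residual sum of squares.
n, p, k_informative = 120, 40, 3
X = rng.normal(size=(n, p))
true_idx = [4, 17, 33]  # hypothetical informative columns
y = X[:, true_idx] @ np.array([3.0, -2.0, 1.5]) + rng.normal(scale=0.3, size=n)

def rss_of(cols):
    A = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((y - A @ coef) ** 2))

selected = []
for _ in range(k_informative):
    candidates = [j for j in range(p) if j not in selected]
    best = min(candidates, key=lambda j: rss_of(selected + [j]))
    selected.append(best)
```

In the spectroscopy setting, the columns would be NIR wavelengths and the stopping rule would be based on cross-validated error rather than a fixed count.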
APA, Harvard, Vancouver, ISO, and other styles
35

Bastiaan, Richy Marcelino, Deiby Tineke Salaki, and Djoni Hatidja. "Comparing the Performance of Prediction Model of Ridge and Elastic Net in Correlated Dataset." Operations Research: International Conference Series 3, no. 1 (2022): 8–13. http://dx.doi.org/10.47194/orics.v3i1.127.

Full text
Abstract:
Multicollinearity refers to a condition in which high correlation between the independent variables in a linear regression model occurs. In this case, using ordinary least squares (OLS) leads to an unstable model. Penalized regression approaches such as ridge and elastic-net regression can be applied to overcome the problem. Penalized regression estimates the model by adding a constraint on the size of the regression parameters. In this study, a simulated dataset is generated, comprising 100 observations and 95 highly correlated independent variables. This empirical study shows that the elastic-net method outperforms ridge regression and OLS. In the correlated dataset, OLS fails to produce a prediction model, based on the mean squared error (MSE).
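The comparison can be sketched with scikit-learn on data shaped like the abstract's simulation (100 observations, 95 highly correlated predictors); the coefficients, noise level, and penalty strengths here are assumptions of the illustration:

```python
import numpy as np
from sklearn.linear_model import Ridge, ElasticNet, LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(8)

# 100 observations, 95 near-duplicate predictor columns.
n, p = 100, 95
base = rng.normal(size=(n, 1))
X = base + 0.05 * rng.normal(size=(n, p))  # columns highly correlated
beta = np.zeros(p)
beta[:5] = 1.0
y = X @ beta + rng.normal(scale=0.5, size=n)

X_tr, X_te = X[:70], X[70:]
y_tr, y_te = y[:70], y[70:]

models = {
    "ols": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "enet": ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=50_000),
}
mse = {name: mean_squared_error(y_te, m.fit(X_tr, y_tr).predict(X_te))
       for name, m in models.items()}
```

With more predictors than training observations, the unpenalized fit interpolates noise and its test MSE suffers, while the penalized fits shrink the unstable directions, the behavior the study reports.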
APA, Harvard, Vancouver, ISO, and other styles
36

Herawati, Netti. "EFEKTIVITAS REGRESI KUANTIL DALAM MENGATASI PONTENSIAL PENCILAN." BAREKENG: Jurnal Ilmu Matematika dan Terapan 14, no. 2 (2020): 301–8. https://doi.org/10.30598/barekengvol14iss2pp301-308.

Full text
Abstract:
Quantile regression, as a robust regression method, can be used to overcome the impact of unusual cases on regression estimates, such as the presence of potential outliers in the data. The purpose of this study was to evaluate the effectiveness of quantile regression in dealing with potential outliers in multiple linear regression compared to ordinary least squares (OLS). This study used simulated data in a multiple regression model with three independent variables (p = 3) for different sample sizes (n = 20, 40, 60, 100, 200), repeated 1000 times. The effectiveness of the quantile regression method and OLS in estimating the β parameters was measured by the mean square error (MSE), and the best model was chosen based on the smallest Akaike Information Criterion (AIC) value. The results showed that, in contrast to OLS, quantile regression was able to deal with potential outliers and provided a better estimator with a smaller mean square error. Compared to OLS and the other quantiles, this study also shows that the 0.5 quantile provides the best parameter estimates and the best model based on the smallest MSE and AIC values.
APA, Harvard, Vancouver, ISO, and other styles
37

NURLAILA, ZAKIAH, MADE SUSILAWATI, and DESAK PUTU EKA NILAKUSMAWATI. "PENERAPAN METODE NEWEY WEST DALAM MENGOREKSI STANDARD ERROR KETIKA TERJADI HETEROSKEDASTISITAS DAN AUTOKORELASI PADA ANALISIS REGRESI." E-Jurnal Matematika 6, no. 1 (2017): 7. http://dx.doi.org/10.24843/mtk.2017.v06.i01.p142.

Full text
Abstract:
Ordinary Least Squares (OLS) is used to estimate the parameters in regression analysis. If one of the assumptions is not fulfilled, the OLS results no longer have the best, linear, and unbiased properties. The aim of this research was to apply the Newey West method to correct the standard errors when heteroscedasticity and autocorrelation occur, and to compare the results of OLS with the Newey West method on secondary and simulated data. OLS can still be used to estimate the regression parameters when heteroscedasticity and autocorrelation occur; however, the standard errors of the parameters will be biased. A method which can correct the standard errors of the parameters to be unbiased is the Newey West method. The secondary data on Passenger Car Mileage and the simulated data contain heteroscedasticity and autocorrelation. The analysis showed that the Newey West method is able to correct the standard errors when heteroscedasticity and autocorrelation occur in both datasets: it changes the biased standard errors of OLS into unbiased ones.
APA, Harvard, Vancouver, ISO, and other styles
38

Basalamah, Salsabila, and Edy Widodo. "Response Surface Model with Comparison of OLS Estimation and MM Estimation." Indonesian Journal of Statistics and Its Applications 5, no. 2 (2021): 273–83. http://dx.doi.org/10.29244/ijsa.v5i2p273-283.

Full text
Abstract:
The Response Surface Method (RSM) is a collection of statistical techniques, in the form of experimental design and regression, as well as mathematics, that is useful for developing, improving, and optimizing processes. In general, the model in RSM is estimated by linear regression with Ordinary Least Squares (OLS) estimation. However, OLS estimation is very weak in the presence of data identified as outliers, so in determining the RSM model a strong and resistant estimation is needed, namely robust regression. One estimation method in robust regression is Method of Moment (MM) estimation. This study aims to compare the OLS estimation and MM estimation methods to obtain the optimal response point in this case study. The estimation models are compared using MSE and adjusted R2. The results show that MM estimation gives better results for the optimal response in this case study.
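The RSM step of fitting a second-order model by OLS and solving for the stationary (optimal) point can be sketched in numpy; the true surface and its optimum at (1, -0.5) are assumptions of the illustration:

```python
import numpy as np

rng = np.random.default_rng(14)

# Second-order response surface in two factors, fitted by OLS.
n = 60
x1 = rng.uniform(-2, 2, n)
x2 = rng.uniform(-2, 2, n)
# True surface with a maximum at (1, -0.5), assumed for illustration.
y = 10 - (x1 - 1.0) ** 2 - 2.0 * (x2 + 0.5) ** 2 + rng.normal(scale=0.1, size=n)

# Design matrix: intercept, linear, quadratic, and interaction terms.
A = np.column_stack([np.ones(n), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point: set the gradient of the fitted quadratic to zero,
# i.e. solve the 2x2 linear system built from the second-order terms.
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
stationary = np.linalg.solve(H, -b[1:3])
```

Replacing the `lstsq` step with a robust (e.g. MM-estimator) fit is the substitution the study evaluates when outliers distort the surface.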
APA, Harvard, Vancouver, ISO, and other styles
39

Ajeel, Sherzad M., and Hussein A. Hashem. "Comparison Some Robust Regularization Methods in Linear Regression via Simulation Study." Academic Journal of Nawroz University 9, no. 2 (2020): 244. http://dx.doi.org/10.25007/ajnu.v9n2a818.

Full text
Abstract:
In this paper, we review some variable selection methods in the linear regression model. Conventional methodology such as the Ordinary Least Squares (OLS) technique is one of the most commonly used methods for estimating the parameters in linear regression. But OLS estimates perform poorly when the dataset suffers from outliers or when the assumption of normality is violated, such as in the case of heavy-tailed errors. To address this problem, robust regularized regression methods like the Huber Lasso (Rosset and Zhu, 2007) and quantile regression (Koenker and Bassett, 1978) were proposed. This paper focuses on comparing the performance of seven methods: the quantile regression estimates, the Huber Lasso estimates, the adaptive Huber Lasso estimates, the adaptive LAD Lasso, the Gamma-divergence estimates, the Maximum Tangent Likelihood (MTE) Lasso estimates, and the Semismooth Newton Coordinate Descent (SNCD) algorithm Huber loss estimates.
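The Huber loss that underlies several of the compared methods can be illustrated with scikit-learn's `HuberRegressor` on heavy-tailed errors, where OLS is pulled by extreme values (the t-distributed error model is an assumption of the illustration; the paper's Lasso-penalized variants are not reproduced here):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(10)

# Heavy-tailed errors: the Huber loss down-weights extreme residuals.
n = 400
x = rng.uniform(-3, 3, n)
y = 0.5 + 1.5 * x + rng.standard_t(df=1.5, size=n)  # very heavy tails

X = x.reshape(-1, 1)
huber = HuberRegressor().fit(X, y)
ols = LinearRegression().fit(X, y)

err_huber = abs(huber.coef_[0] - 1.5)
err_ols = abs(ols.coef_[0] - 1.5)
```

Adding an l1 penalty to this loss yields the Huber Lasso family that the paper benchmarks.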
APA, Harvard, Vancouver, ISO, and other styles
40

Destiyani, Eka, Rita Rahmawati, and Suparti Suparti. "PEMODELAN REGRESI RIDGE ROBUST-MM DALAM PENANGANAN MULTIKOLINIERITAS DAN PENCILAN (Studi Kasus : Faktor-Faktor yang Mempengaruhi AKB di Jawa Tengah Tahun 2017)." Jurnal Gaussian 8, no. 1 (2019): 24–34. http://dx.doi.org/10.14710/j.gauss.v8i1.26619.

Full text
Abstract:
Ordinary Least Squares (OLS) is one of the most commonly used methods to estimate linear regression parameters. If multicollinearity exists among the predictor variables, especially coupled with outliers, then regression analysis with OLS is no longer appropriate. One method that can be used to solve the multicollinearity and outlier problems is Ridge Robust-MM Regression, a modification of the ridge regression method based on the MM-estimator of robust regression. The case study in this research is the infant mortality rate (AKB) in Central Java in 2017, influenced by population density, the percentage of households with clean and healthy living behavior, the number of low-birth-weight babies, the number of babies who are exclusively breastfed, the number of babies receiving a first neonatal visit, and the number of babies who receive health services. The estimation using OLS shows violations due to multicollinearity as well as the presence of outliers. Applying Ridge Robust-MM regression to the case study proves that it can improve the parameter estimation. Based on the t test at the 5% significance level, most of the predictor variables have a significant effect on AKB. The predictor variables explain 47.68% of the variation in AKB, and the MSE value is 0.01538. Keywords: Ordinary Least Squares (OLS), Multicollinearity, Outliers, Ridge Regression, Robust Regression, AKB.
APA, Harvard, Vancouver, ISO, and other styles
41

Ahmad, Imran, Mithas Ahmad Dar, Assefa Fenta, et al. "Spatial configuration of groundwater potential zones using OLS regression method." Journal of African Earth Sciences 177 (May 2021): 104147. http://dx.doi.org/10.1016/j.jafrearsci.2021.104147.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Washington, Simon, and Jean Wolf. "Hierarchical Tree-Based Versus Ordinary Least Squares Linear Regression Models: Theory and Example Applied to Trip Generation." Transportation Research Record: Journal of the Transportation Research Board 1581, no. 1 (1997): 82–88. http://dx.doi.org/10.3141/1581-11.

Full text
Abstract:
Given the continual need for transportation professionals to forecast trends and the increasing availability of sophisticated and improved modeling methods in new and improved software packages, new methods should be explored to determine whether they can replace or supplement more classical statistical methods. One commonly used classical statistical technique for relating a continuous dependent variable with one or more independent variables (continuous or discrete) is ordinary least squares (OLS) regression. This method is routinely applied in transportation to forecast such things as energy use, trip attractions, trip productions, automobile emissions, and growth in vehicle miles traveled (VMT). Despite its widespread use and tremendous utility, however, OLS regression has limitations. It does not deal well with multicollinear independent variables, interactions between independent variables must be specified, the functional relationship between dependent and independent variables must be known (or approximated well), it cannot handle missing data well, and it does not treat satisfactorily discrete variables with more than two levels. Hierarchical tree-based regression (HTBR) may provide a better model for forecasting continuous response variables in transportation applications when the shortcomings of OLS regression are present. The theory of HTBR methods is presented. Then, an example using trip generation data is used to illustrate the types of models that result from OLS regression and HTBR methods. Finally, the limitations of HTBR are presented.
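The HTBR-versus-OLS contrast can be sketched with a scikit-learn regression tree on a threshold-shaped response of the kind trip-generation data can exhibit (the step function and its breakpoints are illustrative assumptions):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(12)

# A step-shaped response: a regression tree recovers the thresholds
# that a single OLS line cannot represent.
n = 500
x = rng.uniform(0, 10, size=(n, 1))
y = np.where(x[:, 0] < 3, 1.0, np.where(x[:, 0] < 7, 5.0, 2.0))
y = y + rng.normal(scale=0.1, size=n)

tree = DecisionTreeRegressor(max_depth=3).fit(x, y)
line = LinearRegression().fit(x, y)

grid = np.linspace(0.5, 9.5, 50).reshape(-1, 1)
truth = np.where(grid[:, 0] < 3, 1.0, np.where(grid[:, 0] < 7, 5.0, 2.0))
mse_tree = float(np.mean((tree.predict(grid) - truth) ** 2))
mse_line = float(np.mean((line.predict(grid) - truth) ** 2))
```

The tree also handles the multicollinearity, interactions, and multi-level discrete predictors the article lists as OLS limitations, at the cost of a less interpretable functional form.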
APA, Harvard, Vancouver, ISO, and other styles
43

ASTARI, NI MADE METTA, NI LUH PUTU SUCIPTAWATI, and I. KOMANG GDE SUKARSA. "PENERAPAN METODE BOOTSTRAP RESIDUAL DALAM MENGATASI BIAS PADA PENDUGA PARAMETER ANALISIS REGRESI." E-Jurnal Matematika 3, no. 4 (2014): 130. http://dx.doi.org/10.24843/mtk.2014.v03.i04.p075.

Full text
Abstract:
Statistical analysis that aims to analyze a linear relationship between independent variables and a dependent variable is known as regression analysis. The method commonly used to estimate the parameters in a regression analysis is Ordinary Least Squares (OLS). However, one OLS assumption that is often violated is the assumption of normality, due to outliers. As a result of the presence of outliers, the parameter estimators produced by OLS will be biased. The residual bootstrap is a bootstrap method in which the resampling process is applied to the residuals. The results showed that the residual bootstrap method is only able to overcome the bias when outliers make up 5% of the data, with 99% confidence intervals. The parameter estimators produced by the residual bootstrap approach the initial OLS estimates, which also shows that the bootstrap is an accurate prediction tool.
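The residual bootstrap procedure can be sketched in numpy: fit OLS once, resample the residuals with replacement, rebuild pseudo-responses, and re-estimate the coefficients to approximate their sampling distribution (the clean simulated data are an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(11)

# Residual bootstrap for OLS coefficients.
n = 100
x = rng.uniform(0, 10, n)
A = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(scale=1.0, size=n)

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
fitted = A @ coef
resid = y - fitted

B = 2000
boot_coefs = np.empty((B, 2))
for b in range(B):
    # Resample residuals with replacement and rebuild a pseudo-response.
    y_star = fitted + rng.choice(resid, size=n, replace=True)
    boot_coefs[b], *_ = np.linalg.lstsq(A, y_star, rcond=None)

# Percentile bootstrap confidence interval for the slope.
slope_ci = np.percentile(boot_coefs[:, 1], [2.5, 97.5])
```

The spread of `boot_coefs` is what the study uses to judge bias and interval coverage under different outlier fractions.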
44

Pangesti, Anggun Perdana Aji, Sugito Sugito, and Hasbi Yasin. "PEMODELAN REGRESI RIDGE ROBUST S,M, MM-ESTIMATOR DALAM PENANGANAN MULTIKOLINIERITAS DAN PENCILAN (Studi Kasus : Faktor-Faktor yang Mempengaruhi Kemiskinan di Jawa Tengah Tahun 2020)." Jurnal Gaussian 10, no. 3 (2021): 402–12. http://dx.doi.org/10.14710/j.gauss.v10i3.32799.

Full text
Abstract:
Ordinary Least Squares (OLS) is one of the most commonly used methods to estimate linear regression parameters. If assumptions are violated, such as multicollinearity, especially coupled with outliers, then OLS regression is no longer appropriate. One method that can be used to solve the multicollinearity and outlier problems is Ridge Robust Regression. Ridge Robust Regression is a modification of the ridge regression method, used to solve multicollinearity, that employs robust regression estimators to handle outliers, including the Maximum likelihood estimator (M-estimator), Scale estimator (S-estimator), and Method of moments estimator (MM-estimator). A suitable case study for this method is data with both multicollinearity and outliers; the case study in this research is poverty in Central Java in 2020, as influenced by life expectancy, the number of unemployed, GRDP rate, dependency ratio, human development index, the percentage of the population over 15 years of age whose highest education is primary school, and mean years of schooling. The result of estimation using OLS shows that there is multicollinearity and the presence of outliers. Applying ridge robust regression to the case study proves that it can improve parameter estimation. The best ridge robust regression model is Ridge Robust Regression with the S-Estimator. The predictor variables explain 73.08% of the variation in poverty, and the MSE value is 0.00791.
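One hedged sketch of the combined idea, using a ridge-penalised Huber M-estimator via iteratively reweighted least squares (the paper's S- and MM-estimators use different loss functions, and all data below are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60
x1 = rng.normal(0, 1, n)
x2 = x1 + rng.normal(0, 0.05, n)        # nearly collinear with x1
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(0, 0.5, n)
y[:5] += 15.0                            # a few gross outliers
X = np.column_stack([np.ones(n), x1, x2])

def ridge_huber(X, y, k=1.0, c=1.345, iters=50):
    """Ridge-penalised Huber M-estimation via iteratively reweighted least squares."""
    p = X.shape[1]
    beta = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)   # ridge start
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12              # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / np.maximum(u, 1e-12))    # Huber weights
        XtW = X.T * w
        beta = np.linalg.solve(XtW @ X + k * np.eye(p), XtW @ y)
    return beta

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
beta_rr = ridge_huber(X, y)
```

The ridge term `k * np.eye(p)` stabilises the near-singular X'X, while the Huber weights downweight the outlying observations that distort the plain OLS fit.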
45

Haraguș, Ramona-Ionela. "USE OF THE REGRESSION METHOD IN IDENTIFYING THE CAUSAL LINK AND THE INTERFERENCES BETWEEN ACCOUNTING-FISCAL-AUDIT." Journal of Financial Studies 9, Special (2024): 81–95. http://dx.doi.org/10.55654/jfs.2024.9.sp.06.

Full text
Abstract:
The fiscal pressure generated by the current economic environment, the countless legislative changes and their rapid evolution highlight the importance of one of the fundamental principles of accounting, the going concern hypothesis. An increasing number of companies, facing forecasting difficulties, are unable to say with certainty that current conditions can ensure the normal performance of their activity; under these conditions, the attention, professional judgment and responsibility of auditors are extremely important. To identify the links between accounting, taxation and audit, and to assess the impact of the determinants (fiscal pressure, degree of indebtedness, auditor type, key audit matters, turnover) on business continuity for the "top traded" entities on the BSE in the period 2018-2022, two econometric models were proposed: a multiple OLS linear regression and a Pooled OLS regression. The study found a significant link between the variables, with 19% of the change in the dependent variable explained by changes in the independent variables; all hypotheses are confirmed by the results obtained.
46

Lin, Xue, Yung-Chih Su, Jiali Shang, et al. "Geographically Weighted Regression Effects on Soil Zinc Content Hyperspectral Modeling by Applying the Fractional-Order Differential." Remote Sensing 11, no. 6 (2019): 636. http://dx.doi.org/10.3390/rs11060636.

Full text
Abstract:
With the development of remote sensing techniques and the increasing need for soil contamination monitoring, we estimated soil heavy metal zinc (Zn) content using hyperspectral imaging. Geographically weighted regression (GWR), an extension of the ordinary least squares (OLS) regression framework, was proposed. By estimating a set of parameters for any number of locations in a study area, GWR can probe the spatial heterogeneity in data relationships, whereas the regression parameters of an OLS model are global and do not vary over space. The objectives of this study were: (1) to find the possible relationships between hyperspectral data and soil Zn content, and (2) to investigate the existence of their spatial heterogeneity. In this study, 67 soil samples collected from Pingtan Island, Fujian Province, China, were used to conduct laboratory hyperspectral modeling for soil Zn content estimation. Four transformations (square root, logarithm, reciprocal of logarithm, and reciprocal), as well as fractional-order differential operations, were applied to increase the amount of reflectance data in which effective modeling variables might be found, and to enhance the spectral characteristics of soil Zn content. To find sensitive variables and to remove redundancy and multicollinearity in the spectra, a data sifting process was applied by selecting wavelengths with local maxima in the absolute values of the correlation coefficients with Zn content in each type of spectral data and by employing Variance Inflation Factors. Since a modeling sample size of 46 is insufficient to construct appropriate OLS and GWR models, four methods are proposed using all 67 samples to choose explanatory variables. A random process to select 57 samples for modeling and 10 samples for validation was applied to assess model performance, with the mean verification R2 (Rv2) used as an indicator. The results show that GWR stepwise regression is the most effective method for selecting better variables. As the mean Rv2 converges toward the OLS value when the bandwidth of the GWR model increases, the four variables selected by GWR stepwise regression were used to establish the representative OLS and GWR models. The representative OLS model has the best mean verification performance among all studied models, with a mean Rv2 value 44.6% higher than that of the OLS model constructed using OLS stepwise regression.
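The local-fitting idea behind GWR can be sketched in a few lines of numpy: at each query location, fit weighted least squares with Gaussian kernel weights that decay with distance. This is a toy sketch with invented data, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
coords = rng.uniform(0, 1, (n, 2))            # sample locations
x = rng.normal(0, 1, n)
slope = 1.0 + 2.0 * coords[:, 0]              # true slope drifts west to east
y = slope * x + rng.normal(0, 0.1, n)
X = np.column_stack([np.ones(n), x])

def gwr_at(point, coords, X, y, bandwidth=0.2):
    """Local weighted least squares at one location with Gaussian kernel weights."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)
    XtW = X.T * w
    return np.linalg.solve(XtW @ X, XtW @ y)

b_west = gwr_at(np.array([0.1, 0.5]), coords, X, y)
b_east = gwr_at(np.array([0.9, 0.5]), coords, X, y)
# A single OLS fit would return one global slope; GWR recovers the drift.
```

As the bandwidth grows, every observation receives nearly equal weight and the local estimates converge to the global OLS fit, which matches the bandwidth behaviour described in the abstract.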
47

Aladeitan, BENEDICTA, Adewale F. Lukman, Esther Davids, Ebele H. Oranye, and Golam B. M. Kibria. "Unbiased K-L estimator for the linear regression model." F1000Research 10 (August 19, 2021): 832. http://dx.doi.org/10.12688/f1000research.54990.1.

Full text
Abstract:
Background: In the linear regression model, the performance of the ordinary least squares (OLS) estimator drops when multicollinearity is present. According to the Gauss-Markov theorem, the estimator remains unbiased when there is multicollinearity, but the variance of its regression estimates becomes inflated. Estimators such as the ridge regression estimator and the K-L estimator were adopted as substitutes for the OLS estimator to overcome the problem of multicollinearity in the linear regression model. However, these estimators are biased, though they possess a smaller mean squared error than the OLS estimator. Methods: In this study, we developed a new unbiased estimator using the K-L estimator and compared its performance with some existing estimators theoretically, via simulation, and on real-life data. Results: Theoretically, the new estimator, even though unbiased, also possesses a minimum variance when compared with other estimators. Results from the simulation and real-life study showed that the new estimator produced a smaller mean square error (MSE) and had the smallest mean square prediction error (MSPE). This further strengthened the findings of the theoretical comparison using both the MSE and the MSPE as criteria. Conclusions: A simulation study and a real-life application modelling the high heating values from proximate analysis were conducted to support the theoretical findings. This new method of estimation is recommended for parameter estimation with and without multicollinearity in a linear regression model.
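The paper's K-L-based estimator is not reproduced here, but the trade-off it builds on, the biased ridge estimator (X'X + kI)⁻¹X'y versus unbiased OLS under multicollinearity, can be sketched with a small simulation (invented design and data):

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps, k = 30, 300, 1.0
beta_true = np.array([1.0, 1.0, 1.0])

# A fixed, severely multicollinear design: three near-identical columns.
z = rng.normal(0, 1, n)
X = np.column_stack([z + rng.normal(0, 0.05, n) for _ in range(3)])
I3 = np.eye(3)

mse_ols = 0.0
mse_ridge = 0.0
for _ in range(reps):
    y = X @ beta_true + rng.normal(0, 1, n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    b_ridge = np.linalg.solve(X.T @ X + k * I3, X.T @ y)   # biased, lower variance
    mse_ols += ((b_ols - beta_true) ** 2).sum()
    mse_ridge += ((b_ridge - beta_true) ** 2).sum()
mse_ols /= reps
mse_ridge /= reps
```

Under this collinear design the shrinkage bias of ridge is small relative to the huge OLS variance, so its mean squared error is far lower; the abstract's contribution is an estimator that removes the bias while keeping variance low.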
48

Tusilowati, Tusilowati, L. Handayani, and Rais Rais. "SIMULASI PENANGANAN PENCILAN PADA ANALISIS REGRESI MENGGUNAKAN METODE LEAST MEDIAN SQUARE (LMS)." JURNAL ILMIAH MATEMATIKA DAN TERAPAN 15, no. 2 (2018): 238–47. http://dx.doi.org/10.22487/2540766x.2018.v15.i2.11362.

Full text
Abstract:
The simulation of outlier handling in regression analysis used the Least Median Square (LMS) method to estimate the regression parameters, chosen for its simple calculation. Data containing outliers result in biased parameter estimates, so it is necessary to use robust regression to overcome the outliers. The data used were simulated, with 25 and 100 data pairs (X, Y) respectively. The simulated data were divided into 5 subsets, and the regression parameters of each subset were estimated with both the Ordinary Least Square (OLS) and Least Median Square (LMS) methods. The parameter estimates of each method on each subset were compared to discover which method is better. Based on the research findings, the Least Median Square (LMS) method was better than the Ordinary Least Square (OLS) method at estimating the regression parameters on data containing up to 3% outliers.
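A common way to approximate the LMS fit, which the abstract does not spell out, is a random-subsampling search: draw candidate lines through pairs of points and keep the one with the smallest median squared residual. A toy numpy sketch with invented data:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
x = rng.uniform(0, 10, n)
y = 3.0 + 1.5 * x + rng.normal(0, 0.3, n)
y[:10] += 20.0                     # 20% outliers

X = np.column_stack([np.ones(n), x])
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# LMS: among candidate lines through pairs of points, keep the one that
# minimises the *median* of the squared residuals.
best_beta, best_med = None, np.inf
for _ in range(500):
    i, j = rng.choice(n, 2, replace=False)
    if x[i] == x[j]:
        continue
    s = (y[j] - y[i]) / (x[j] - x[i])
    a = y[i] - s * x[i]
    med = np.median((y - (a + s * x)) ** 2)
    if med < best_med:
        best_med, best_beta = med, np.array([a, s])
```

Because the median ignores the largest half of the residuals, the chosen line tracks the clean bulk of the data even with 20% gross outliers, while the OLS fit is dragged toward them.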
49

Matdoan, Yahya. "Modeling of Quantile Regression to Know the Factors Affecting the High Spread Api Malaria in Indonesia." Jurnal Matematika, Statistika dan Komputasi 16, no. 3 (2020): 417. http://dx.doi.org/10.20956/jmsk.v16i3.8970.

Full text
Abstract:
Estimation by the OLS method is based on the normal distribution, so it is not appropriate for analyzing data that are not symmetric or that contain outliers. Therefore, quantile regression, which is not affected by outliers, was developed. This study compares quantile regression with OLS for the factors affecting malaria in Indonesia. The results show that the quantile regression model achieves a value of 0.832 with an MSE of 0.182, while the OLS model achieves a value of 0.681 with an MSE of 0.231. We therefore conclude that the quantile regression model is the better model. Further, the main factors driving the spread of malaria in Indonesia were found to be the livable-housing factor, the poor-population factor, and the physician factor.
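Quantile regression minimises the pinball (check) loss instead of squared error; for the median (0.5-quantile) this reduces to least absolute deviations, which the numpy sketch below approximates via iteratively reweighted least squares. Invented data, not the study's; libraries such as statsmodels provide full quantile regression:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 1.0 * x + rng.standard_t(2, n)   # heavy-tailed noise
y[:8] += 50.0                               # extreme outliers
X = np.column_stack([np.ones(n), x])

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Median regression: minimising sum |r_i| is a weighted least squares
# problem with weights 1/|r_i|, iterated to convergence.
beta = beta_ols.copy()
for _ in range(100):
    r = y - X @ beta
    w = 1.0 / np.maximum(np.abs(r), 1e-6)
    XtW = X.T * w
    beta = np.linalg.solve(XtW @ X, XtW @ y)
beta_lad = beta
```

The median fit stays on the bulk of the data while the OLS fit is pulled by the outliers, which is the contrast the abstract exploits for the asymmetric, outlier-prone malaria data.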
50

Račkauskas, Alfredas, and Danas Zuokas. "Properties of the coefficient estimators for the linear regression model with heteroskedastic error term." Lietuvos matematikos rinkinys 46 (September 21, 2023): 267–72. http://dx.doi.org/10.15388/lmr.2006.30725.

Full text
Abstract:
In this paper we present the estimated generalized least squares (EGLS) estimator for the coefficient vector β in the linear regression model y = Xβ + ε, where the disturbance term can be heteroskedastic. For heteroskedasticity of the changed-segment type, we use the Monte-Carlo method to investigate the empirical properties of the proposed estimator and of the ordinary least squares (OLS) estimator. The results show that the empirical covariance of the EGLS estimator is smaller than that of the OLS estimator.
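A minimal sketch of the two-step EGLS idea (invented data and a simple log-variance model, not the authors' changed-segment setup): fit OLS, estimate a variance function from the squared residuals, then reweight:

```python
import numpy as np

rng = np.random.default_rng(7)
n, reps = 200, 300
x = np.linspace(0.1, 5.0, n)
X = np.column_stack([np.ones(n), x])
sigma = 0.2 + 1.0 * x                # error sd grows with x: heteroskedastic
beta_true = np.array([1.0, 2.0])

err_ols = 0.0
err_egls = 0.0
for _ in range(reps):
    y = X @ beta_true + rng.normal(0, sigma)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    # EGLS step: model the log of the squared OLS residuals, then reweight
    # each observation by its estimated precision (a feasible GLS fit).
    log_r2 = np.log((y - X @ b_ols) ** 2 + 1e-12)
    g = np.linalg.solve(X.T @ X, X.T @ log_r2)
    w = 1.0 / np.exp(X @ g)
    XtW = X.T * w
    b_egls = np.linalg.solve(XtW @ X, XtW @ y)
    err_ols += ((b_ols - beta_true) ** 2).sum()
    err_egls += ((b_egls - beta_true) ** 2).sum()
err_ols /= reps
err_egls /= reps
```

Averaged over the replications, the reweighted estimator has smaller coefficient error than OLS, mirroring the covariance comparison the paper reports.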
