To view other types of publications on this topic, follow the link: Least Square Regression Method.

Journal articles on the topic "Least Square Regression Method"

Create a reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic "Least Square Regression Method".

Next to every work in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read its abstract online, if the corresponding details are available in the metadata.

Browse journal articles across many disciplines and compile your bibliography correctly.

1

Yeniay, Özgür, Öznur İşçi, Atilla Göktaş, and M. Niyazi Çankaya. "Time Scale in Least Square Method." Abstract and Applied Analysis 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/354237.

Full text source
Abstract:
The study of dynamic equations on time scales is a new area of mathematics. Time scale calculus builds a bridge between the real numbers and the integers. Two derivatives have been introduced on time scales, called the delta and the nabla derivative; the delta derivative is defined in the forward direction and the nabla derivative in the backward direction. Within the scope of this study, we consider how to obtain the parameters of a regression equation for integer-valued data through time scales. We therefore implemented the least squares method according to the time-scale definitions of the derivative and obtained the coefficients of the model. Two different sets of coefficients arise for the same model, originating from the forward and backward jump operators. When this occurs, the result equals the total of the vertical deviations between the observations and the regression equations of the forward and backward jump operators, divided by two. We also estimated the coefficients of the model using the ordinary least squares method. As a result, we provide an introduction to the least squares method on time scales. We think that time scale theory offers a new view of least squares, especially when the assumptions of linear regression are violated.
APA, Harvard, Vancouver, ISO, and other styles
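For reference alongside the entry above, the ordinary least squares estimator that the time-scale variants generalize has the standard closed form (the delta- and nabla-derivative versions studied in the paper are not reproduced here):

\hat{\beta}_{OLS} = \arg\min_{\beta} \sum_{i=1}^{n} \bigl(y_i - x_i^{\top}\beta\bigr)^2 = (X^{\top}X)^{-1} X^{\top} y,

where X is the n x p design matrix and y the vector of observed responses.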
2

Frank, Ildiko E. "Intermediate least squares regression method." Chemometrics and Intelligent Laboratory Systems 1, no. 3 (July 1987): 233–42. http://dx.doi.org/10.1016/0169-7439(87)80067-9.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
3

Abdi, Hamdan, Sajaratud Dur, Rina Widyasar, and Ismail Husein. "Analysis of Efficiency of Least Trimmed Square and Least Median Square Methods in The Estimation of Robust Regression Parameters." ZERO: Jurnal Sains, Matematika dan Terapan 4, no. 1 (August 16, 2020): 21. http://dx.doi.org/10.30829/zero.v4i1.7933.

Full text source
Abstract:
Robust regression is a regression method used when the distribution of the residuals is not well behaved, or when outliers in the observational data affect the model. One method for estimating regression parameters is the least squares method (MKT), but it is easily affected by the presence of outliers. An alternative method that is robust to outliers is therefore needed, namely robust regression. Methods for estimating robust regression parameters include Least Trimmed Squares (LTS) and Least Median of Squares (LMS). These methods are estimators with high breakdown points for data containing outliers and have more efficient algorithms than other estimation methods. This study aims to compare the regression models formed by the LTS and LMS methods, determine the efficiency of the resulting models, and determine the factors that influence smallholder oil-palm production in Langkat District in 2018. The results showed that the estimated regression parameters of the two methods were the same; comparing the efficiency of the estimators and the squared-error values, it was concluded that the LTS method was more efficient. Land area and productivity are the factors that influence smallholder oil-palm production in Langkat District in 2018.
APA, Harvard, Vancouver, ISO, and other styles
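The least trimmed squares idea compared in the entry above can be illustrated with a minimal numerical sketch: repeatedly refit OLS on the h observations with the smallest squared residuals (a concentration step). The data, trimming fraction, and variable names below are illustrative assumptions, not taken from the cited study.

import numpy as np

def lts_fit(X, y, h=None, n_iter=20, seed=0):
    # Crude least trimmed squares via concentration steps over the h best-fitting points.
    n, _ = X.shape
    h = h or int(0.75 * n)                  # keep 75% of the observations (illustrative choice)
    rng = np.random.default_rng(seed)
    subset = rng.choice(n, size=h, replace=False)
    for _ in range(n_iter):
        beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
        resid2 = (y - X @ beta) ** 2        # squared residuals on all points
        subset = np.argsort(resid2)[:h]     # keep the h smallest residuals and refit
    return beta

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = 2.0 + 3.0 * X[:, 1] + rng.normal(scale=0.5, size=100)
y[:5] += 50                                 # contaminate five observations with gross outliers
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print("OLS:", beta_ols, "LTS:", lts_fit(X, y))

The LTS fit should stay close to the true coefficients (2, 3), while the OLS intercept is pulled upward by the contaminated points.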
4

Yu, Yan Hua, Li Xia Song, and Kun Lun Zhang. "Fuzzy C-Regression Models." Applied Mechanics and Materials 278-280 (January 2013): 1323–26. http://dx.doi.org/10.4028/www.scientific.net/amm.278-280.1323.

Full text source
Abstract:
Fuzzy linear regression has been studied extensively since its inception, symbolized by the work of Tanaka et al. in 1982. As one of the main estimation methods, the fuzzy least squares approach is appealing because it corresponds, to some extent, to well-known statistical regression analysis. In this article, a restricted least squares method is proposed to fit fuzzy linear models with crisp inputs and symmetric fuzzy output. The paper puts forward a fuzzy linear regression model based on the structured element; the model has precise input data and fuzzy output data. It gives a method for determining the regression coefficients and the fuzziness-degree function using the least squares method, and studies the goodness of fit between the observed and predicted values.
APA, Harvard, Vancouver, ISO, and other styles
5

Zhu, Xue Jun, and Zhi Wen Zhu. "Modeling of Piezoelectric Ceramics Based on Partial Least-Square Regression Method." Advanced Materials Research 139-141 (October 2010): 13–16. http://dx.doi.org/10.4028/www.scientific.net/amr.139-141.13.

Full text source
Abstract:
In this paper, a piezoelectric ceramics model based on hysteretic nonlinear theory is developed. A Van der Pol nonlinear difference term is introduced to interpret the hysteresis in the voltage-strain curve of piezoelectric ceramics. The coupling relationship between voltage and stress is obtained with the partial least-squares regression method to describe the drift of the voltage-strain curve under different stresses. On this basis, the final relationship among strain, stress, and voltage is set up. The results of a significance test show that the new model describes the hysteresis characteristics of piezoelectric ceramics under different stresses well. The new piezoelectric ceramics model takes the effect of stress into account and is easy to analyze theoretically, which is helpful for vibration control.
APA, Harvard, Vancouver, ISO, and other styles
6

Shankar, S. Vishnu, G. Padmalakshmi, and M. Radha. "Estimation and Comparison of Support Vector Regression with Least Square Method." International Journal of Current Microbiology and Applied Sciences 8, no. 02 (February 10, 2019): 1186–91. http://dx.doi.org/10.20546/ijcmas.2019.802.137.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
7

Dimovski, Marko. "REGULARIZED LEAST-SQUARE OPTIMIZATION METHOD FOR VARIABLE SELECTION IN REGRESSION MODELS." Математички билтен/BULLETIN MATHÉMATIQUE DE LA SOCIÉTÉ DES MATHÉMATICIENS DE LA RÉPUBLIQUE MACÉDOINE, no. 1 (2017): 80–100. http://dx.doi.org/10.37560/matbil11700080d.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
8

Su, Moting, Zongyi Zhang, Ye Zhu, and Donglan Zha. "Data-Driven Natural Gas Spot Price Forecasting with Least Squares Regression Boosting Algorithm." Energies 12, no. 6 (March 21, 2019): 1094. http://dx.doi.org/10.3390/en12061094.

Full text source
Abstract:
Natural gas is often described as the cleanest fossil fuel, and its consumption is increasing rapidly. Accurate prediction of natural gas spot prices would significantly benefit energy management, economic development, and environmental conservation. In this study, the least squares regression boosting (LSBoost) algorithm was used to forecast natural gas spot prices. LSBoost can fit regression ensembles well by minimizing the mean squared error. Henry Hub natural gas spot prices were investigated, and a wide range of time series from January 2001 to December 2017 was selected. The LSBoost method was applied to the data at daily, weekly, and monthly frequencies. An empirical study verified that the proposed prediction model has a high degree of fit. Compared with existing approaches such as linear regression, linear support vector machine (SVM), quadratic SVM, and cubic SVM, the proposed LSBoost-based model showed better performance, with a higher R-squared and lower mean absolute error, mean squared error, and root-mean-squared error.
APA, Harvard, Vancouver, ISO, and other styles
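A least squares boosting ensemble of the general kind used in the study can be approximated with scikit-learn's gradient boosting regressor fitted with the squared-error loss. The synthetic features, split, and hyperparameters below are placeholders, not those of the cited paper.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                       # stand-in predictors (e.g. lagged prices)
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.3, size=500)
X_train, X_test, y_train, y_test = X[:400], X[400:], y[:400], y[400:]

# least squares boosting: each new tree is fitted to the residuals of the current ensemble
model = GradientBoostingRegressor(loss="squared_error", n_estimators=300, learning_rate=0.05)
model.fit(X_train, y_train)
rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
print("test RMSE:", rmse)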
9

Nurbaroqah, Ana, Budi Pratikno, and Supriyanto Supriyanto. "PENDEKATAN REGRESI ROBUST DENGAN FUNGSI PEMBOBOT BISQUARE TUKEY PADA ESTIMASI-M DAN ESTIMASI-S." Jurnal Ilmiah Matematika dan Pendidikan Matematika 14, no. 1 (June 30, 2022): 19. http://dx.doi.org/10.20884/1.jmp.2022.14.1.5669.

Full text source
Abstract:
The least squares method is one of the methods for estimating the parameters of a regression model. The resulting model is not valid if the classical assumptions are violated, for example when outliers are present. To resolve this problem, the robust regression method is usually used, because it can handle outliers and gives stable results. In this research, the data used are the human development index of districts in Central Java from 2019 to 2020. The robust regression estimators chosen are the M-estimator and the S-estimator with the Tukey bisquare weight function. The criteria for choosing the best model are the adjusted R-squared value and the mean squared error (MSE). The results show that the robust regression model with M-estimation is the better model, since its adjusted R-squared value is closer to one and its MSE is the smallest.
APA, Harvard, Vancouver, ISO, and other styles
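The Tukey bisquare weight function mentioned in the entry above has the standard form, with u a scaled residual and c a tuning constant (commonly about c = 4.685 for M-estimation):

w(u) = \begin{cases} \left[1 - (u/c)^{2}\right]^{2}, & |u| \le c \\ 0, & |u| > c \end{cases}

so observations with large residuals receive weight zero and do not influence the fit.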
10

Choi, Seung Hoe, and Jin Hee Yoon. "General fuzzy regression using least squares method." International Journal of Systems Science 41, no. 5 (May 2010): 477–85. http://dx.doi.org/10.1080/00207720902774813.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
11

SUNDARI, NI WAYAN ARI, I. GUSTI AYU MADE SRINADI, and MADE SUSILAWATI. "PENERAPAN METODE PARTIAL LEAST SQUARE REGRESSION (PLSR) PADA KASUS SKIZOFRENIA." E-Jurnal Matematika 10, no. 2 (May 30, 2021): 137. http://dx.doi.org/10.24843/mtk.2021.v10.i02.p333.

Full text source
Abstract:
Partial Least Squares Regression (PLSR) is a method that combines principal component analysis and multiple linear regression and aims to predict or analyze a dependent variable from more than one independent variable. The purpose of this study is to determine the equation model for the recurrence of schizophrenia patients using the PLSR method. The best number of components for the PLSR model in this study is one component, with a minimum RMSEP value of 0.6094 and an adjusted R2 value of 80.09 percent.
APA, Harvard, Vancouver, ISO, and other styles
12

Tusilowati, Tusilowati, L. Handayani, and Rais Rais. "SIMULASI PENANGANAN PENCILAN PADA ANALISIS REGRESI MENGGUNAKAN METODE LEAST MEDIAN SQUARE (LMS)." JURNAL ILMIAH MATEMATIKA DAN TERAPAN 15, no. 2 (December 6, 2018): 238–47. http://dx.doi.org/10.22487/2540766x.2018.v15.i2.11362.

Full text source
Abstract:
This simulation of the handling of outliers in regression analysis used a method commonly applied to estimate regression parameters, the Least Median of Squares (LMS), chosen for its simple calculation. Data containing outliers produce biased parameter estimates, so a robust regression is needed to overcome the outliers. The data used were simulated data with 25 and 100 pairs (X, Y), respectively. The simulation results were divided into five subsets of data clusters, with regression parameters estimated by the Ordinary Least Squares (OLS) and Least Median of Squares (LMS) methods. The parameter estimates of each method on each subset were tested to discover which method is better. Based on the findings, the Least Median of Squares (LMS) method proved better than the Ordinary Least Squares (OLS) method for estimating the regression parameters on data with up to 3% outliers.
APA, Harvard, Vancouver, ISO, and other styles
13

Tomioka, Satoshi, Shusuke Nisiyama, and Takeaki Enoto. "Nonlinear Least Square Regression by Adaptive Domain Method With Multiple Genetic Algorithms." IEEE Transactions on Evolutionary Computation 11, no. 1 (February 2007): 1–16. http://dx.doi.org/10.1109/tevc.2006.876363.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
14

Zheng, Wenming, Minghai Xin, Xiaolan Wang, and Bei Wang. "A Novel Speech Emotion Recognition Method via Incomplete Sparse Least Square Regression." IEEE Signal Processing Letters 21, no. 5 (May 2014): 569–72. http://dx.doi.org/10.1109/lsp.2014.2308954.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
15

M, Anila, and G. Pradeepini. "Least Square Regression for Prediction Problems in Machine Learning using R." International Journal of Engineering & Technology 7, no. 3.12 (July 20, 2018): 960. http://dx.doi.org/10.14419/ijet.v7i3.12.17612.

Full text source
Abstract:
The most commonly used prediction technique is Ordinary Least Squares Regression (OLS regression). It has been applied in many fields, such as statistics, finance, medicine, psychology, and economics. Many people, especially data scientists, use this technique without sufficient training in when and why it can or cannot be applied. It is not an easy task to explain why least squares regression [1] faces so much criticism when it is applied in practice. In this paper, we first review the fundamentals of linear regression and OLS regression along with the popularity of the LS method; we then present our analysis of the difficulties and pitfalls that arise when the OLS method is applied, and finally some techniques for overcoming these problems.
APA, Harvard, Vancouver, ISO, and other styles
16

McArdle, B. H. "The structural relationship: regression in biology." Canadian Journal of Zoology 66, no. 11 (November 1, 1988): 2329–39. http://dx.doi.org/10.1139/z88-348.

Full text source
Abstract:
Most biologists are now aware that ordinary least square regression is not appropriate when the X and Y variables are both subject to random error. When there is no information about their error variances, there is no correct unbiased solution. Although the major axis and reduced major axis (geometric mean) methods are widely recommended for this situation, they make different, equally restrictive assumptions about the error variances. By using simulated data sets that violate these assumptions, the reduced major axis method is shown to be generally more efficient and less biased than the major axis method. It is concluded that if the error rate of the X variable is thought to be more than a third of that on the Y variable, then the reduced major axis method is preferable; otherwise the least squares technique is acceptable. An analogous technique, the standard minor axis method, is described for use in place of least squares multiple regression when all of the variables are subject to error.
APA, Harvard, Vancouver, ISO, and other styles
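For reference, the reduced major axis (geometric mean) slope recommended in the entry above is obtained from the sample standard deviations and the sign of the correlation, in contrast with the OLS slope:

b_{RMA} = \operatorname{sign}(r)\,\frac{s_y}{s_x}, \qquad b_{OLS} = r\,\frac{s_y}{s_x},

so the two coincide only when |r| = 1, and the RMA slope is otherwise always steeper in magnitude.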
17

Challa, Ratna Kumari, Siva Prasad Chintha, B. Reddaiah, and Kanusu Srinivasa Rao. "A Novel Fast Searching Algorithm Based on Least Square Regression." Revue d'Intelligence Artificielle 35, no. 1 (February 28, 2021): 93–98. http://dx.doi.org/10.18280/ria.350111.

Full text source
Abstract:
Linear methodologies such as regression, principal component analysis, and canonical correlation analysis are well understood in the machine learning community and commonly used for predictive modelling and feature generation. All of these approaches are typically intended to capture interesting subspaces of the original high-dimensional space. They all have closed-form solutions because of their simple linear structure, which makes estimation and theoretical analysis very straightforward for small datasets. However, in modern machine learning problems it is very common for a data set to have millions or trillions of samples and features. We deal with the problem of fast estimation of ordinary least squares from large volumes of data. The search operation is very important and is useful in many applications. When the data set is large, linear search takes time proportional to the size of the data set, whereas binary search and interpolation search find an element in O(log n) and O(log(log n)) time, respectively, in the worst case. In this paper, an effort is made to develop a novel fast searching algorithm based on the least squares regression curve-fitting method. The algorithm is implemented, and its execution results are analyzed and compared with the performance of binary search and interpolation search. The proposed fast searching algorithm exhibits better performance than the traditional models.
APA, Harvard, Vancouver, ISO, and other styles
18

Othman, Nariman Yahya, Zahra Abd Saleh, and Zainab Ali Omran. "Development of Stage – Distance – Discharge Relationship and Rating Curve using Least Square Method." Civil Engineering Journal 5, no. 9 (September 22, 2019): 1959–69. http://dx.doi.org/10.28991/cej-2019-03091385.

Full text source
Abstract:
For any river, besides the stage-discharge relationship (rating curve), a stage-discharge-distance relationship is of even greater significance. Accurate estimation of both relationships along a river reach is a key point for various applications of water resources engineering, such as the operation and management of water resources projects, the design of hydraulic structures, and sediment analysis. In this paper, both relationships were established for the Shatt Al-Hillah river reach by applying multiple linear regression and simple linear regression, using the least squares method to determine the regression equations. Twelve gauging stations, including three primary and nine secondary stations, were considered. To evaluate the performance of both regressions, statistical measures such as the coefficient of determination, root mean square error, mean square error, and Theil's factor were used. The study results generally indicate a superior performance of both modeling techniques. The MLR model was able to predict and mimic the stage-discharge-distance relationship with a correlation coefficient of about 0.932, while the SLR model predicted three rating curves for the three primary stations with correlation coefficients of about 0.960, 0.943, and 0.924, respectively.
APA, Harvard, Vancouver, ISO, and other styles
19

Samosir, Ravika Dewi, Deiby Tineke Salaki, and Yohanes Langi. "Comparison of Partial Least Squares Regression and Principal Component Regression for Overcoming Multicollinearity in Human Development Index Model." Operations Research: International Conference Series 3, no. 1 (March 5, 2022): 1–7. http://dx.doi.org/10.47194/orics.v3i1.126.

Full text source
Abstract:
One of the assumptions of ordinary least squares (OLS) estimation of regression parameters is the absence of multicollinearity. If multicollinearity exists, Partial Least Squares (PLS) regression and Principal Component Regression (PCR) can be used as alternative approaches to solve the problem. This research compares those methods in modeling the factors that influence the Human Development Index (HDI) of North Sumatra Province in 2019, obtained from the Central Bureau of Statistics. The results indicate that PLS outperforms PCR in terms of the coefficient of determination and the squared error.
APA, Harvard, Vancouver, ISO, and other styles
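The PLS-versus-PCR comparison described in the entry above can be reproduced in outline with scikit-learn; the synthetic collinear predictors and the single-component choice are illustrative assumptions, not the HDI data.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
z = rng.normal(size=(200, 1))
X = np.hstack([z + 0.01 * rng.normal(size=(200, 1)) for _ in range(5)])  # five highly collinear predictors
y = X @ np.ones(5) + rng.normal(scale=0.5, size=200)

pls = PLSRegression(n_components=1).fit(X, y)
pcr = make_pipeline(PCA(n_components=1), LinearRegression()).fit(X, y)
print("PLS R2:", r2_score(y, pls.predict(X).ravel()))
print("PCR R2:", r2_score(y, pcr.predict(X)))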
20

R, Aditya Setyawan, Mustika Hadijati, and Ni Wayan Switrayni. "Analisis Masalah Heteroskedastisitas Menggunakan Generalized Least Square dalam Analisis Regresi." EIGEN MATHEMATICS JOURNAL 1, no. 2 (December 31, 2019): 61. http://dx.doi.org/10.29303/emj.v1i2.43.

Full text source
Abstract:
Regression analysis is a statistical method that allows users to analyze the influence of one or more independent variables (X) on a dependent variable (Y). The most commonly used method for estimating linear regression parameters is Ordinary Least Squares (OLS). In practice, however, there is often a problem of heteroscedasticity, namely that the error variance is not constant across all values of the independent variable X, which makes the OLS method less effective. To overcome this, a parameter estimation method that adds a weight to each observation can be used, namely the Generalized Least Squares (GLS) method. This study examines the use of the GLS method for overcoming heteroscedasticity in regression analysis and compares the estimation results of the OLS and GLS methods in the presence of heteroscedasticity. The results show that the GLS method preserves the unbiasedness and consistency of the estimator and overcomes the problem of heteroscedasticity, so that the GLS method is more effective than the OLS method.
APA, Harvard, Vancouver, ISO, and other styles
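A minimal sketch of the weighting idea described above (WLS as a special case of GLS), using statsmodels and assuming the error standard deviation grows proportionally with the regressor x; the 1/x^2 weights are an illustrative choice, not the study's.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=200)
y = 2 + 3 * x + rng.normal(scale=0.5 * x)       # heteroscedastic errors: spread grows with x
X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()
wls = sm.WLS(y, X, weights=1.0 / x**2).fit()    # weights proportional to 1/variance
print("OLS params:", ols.params, "se:", ols.bse)
print("WLS params:", wls.params, "se:", wls.bse)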
21

SUN, HONGWEI, and PING LIU. "REGULARIZED LEAST SQUARE ALGORITHM WITH TWO KERNELS." International Journal of Wavelets, Multiresolution and Information Processing 10, no. 05 (September 2012): 1250043. http://dx.doi.org/10.1142/s0219691312500439.

Full text source
Abstract:
A new multi-kernel regression learning algorithm is studied in this paper. In our setting, the hypothesis space is generated by two Mercer kernels, thus it has stronger approximation ability than the single kernel case. We provide the mathematical foundation for this regularized learning algorithm. We obtain satisfying capacity-dependent error bounds and learning rates by the covering number method.
APA, Harvard, Vancouver, ISO, and other styles
22

Asar, Erdogan, and Erdem Karabulut. "Quasi-least squares regression method with dentistry data." Nigerian Journal of Clinical Practice 24, no. 6 (2021): 789. http://dx.doi.org/10.4103/njcp.njcp_346_20.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
23

Tominaga, Yukio, and Iwao Fujiwara. "Prediction-weighted partial least-squares regression method (PWPLS)." Chemometrics and Intelligent Laboratory Systems 38, no. 2 (October 1997): 139–44. http://dx.doi.org/10.1016/s0169-7439(97)00043-9.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
24

LIU, CHEN, HENRY SCHELLHORN, and QIDI PENG. "AMERICAN OPTION PRICING WITH REGRESSION: CONVERGENCE ANALYSIS." International Journal of Theoretical and Applied Finance 22, no. 08 (December 2019): 1950044. http://dx.doi.org/10.1142/s0219024919500444.

Full text source
Abstract:
The Longstaff–Schwartz (LS) algorithm is a popular least squares Monte Carlo method for American option pricing. We prove that the mean squared sample error of the LS algorithm with quasi-regression is equal to [Formula: see text] asymptotically, where [Formula: see text] is a constant and [Formula: see text] is the number of simulated paths. We suggest that the quasi-regression-based LS algorithm should be preferred whenever applicable. Juneja & Kalra (2009) and Bolia & Juneja (2005) added control variates to the LS algorithm. We prove that the mean squared sample error of their algorithm with quasi-regression is equal to [Formula: see text] asymptotically, where [Formula: see text] is a constant, and show that [Formula: see text] under mild conditions. We revisit the method of proof contained in Clément et al. [E. Clément, D. Lamberton & P. Protter (2002) An analysis of a least squares regression method for American option pricing, Finance and Stochastics 6, 449–471], but had to complete it because of a small gap in their proof, which we also document in this paper.
APA, Harvard, Vancouver, ISO, and other styles
25

Khalaf, Walaa, Calogero Pace, and Manlio Gaudioso. "Least Square Regression Method for Estimating Gas Concentration in an Electronic Nose System." Sensors 9, no. 3 (March 10, 2009): 1678–91. http://dx.doi.org/10.3390/s90301678.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
26

Zheng, Jun, Xinyu Shao, Liang Gao, Ping Jiang, and Haobo Qiu. "Difference mapping method using least square support vector regression for variable-fidelity metamodelling." Engineering Optimization 47, no. 6 (May 22, 2014): 719–36. http://dx.doi.org/10.1080/0305215x.2014.918114.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
27

Li, Xingfeng, Damien Coyle, Liam Maguire, and Thomas Martin McGinnity. "A Least Trimmed Square Regression Method for Second Level fMRI Effective Connectivity Analysis." Neuroinformatics 11, no. 1 (October 24, 2012): 105–18. http://dx.doi.org/10.1007/s12021-012-9168-8.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
28

Kieu, Hai Dang, Hongchuan Yu, Zhuorong Li, and Jian Jun Zhang. "Locally weighted PCA regression to recover missing markers in human motion data." PLOS ONE 17, no. 8 (August 8, 2022): e0272407. http://dx.doi.org/10.1371/journal.pone.0272407.

Full text source
Abstract:
The "missing markers problem", that is, markers going missing during a motion capture session, has been an issue in the motion capture field for many years. We propose the locally weighted principal component analysis (PCA) regression method to deal with this challenge. Its main merit is to introduce sparsity of the observation datasets, through the multivariate tapering approach, into traditional least squares methods, developing them into a new kind of least squares method with sparsity constraints. To the best of our knowledge, it is the first least squares method with sparsity constraints. Our experiments show that the proposed regression method reaches high estimation accuracy and has good numerical stability.
APA, Harvard, Vancouver, ISO, and other styles
29

MAZIYYA, PUTU AYU, I. KOMANG GDE SUKARSA, and NI MADE ASIH. "MENGATASI HETEROSKEDASTISITAS PADA REGRESI DENGAN MENGGUNAKAN WEIGHTED LEAST SQUARE." E-Jurnal Matematika 4, no. 1 (January 30, 2015): 20. http://dx.doi.org/10.24843/mtk.2015.v04.i01.p083.

Full text source
Abstract:
In regression analysis we need an estimation method whose estimators satisfy the BLUE property. One of the assumptions that must be fulfilled is homoscedasticity, the condition that the error variance is constant; violation of this assumption is called heteroscedasticity. A consequence of heteroscedasticity is that the OLS estimators remain unbiased, but their variances are no longer efficient. A method is therefore needed to solve this problem, such as Weighted Least Squares (WLS). The purpose of this study is to show how to overcome heteroscedasticity in regression with WLS. The steps of this research were to carry out the OLS analysis, test whether a heteroscedasticity problem exists using the BPG method, repair the initial model by weighting the data with an appropriate multiplier factor, re-apply the OLS procedure to the weighted data, and finally repeat the heteroscedasticity test with the BPG method, yielding a model that fulfils the homoscedasticity assumption. The estimates indicate that the WLS method can resolve the heteroscedasticity, with the appropriate weighting factors chosen from the observed distribution pattern of the data.
APA, Harvard, Vancouver, ISO, and other styles
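The test-weight-retest procedure described above can be sketched with statsmodels' Breusch-Pagan implementation (the BPG test). The simulated data and the 1/x^2 weights are placeholders, and rescaling the WLS residuals by the square root of the weights before re-running the test is one common convention, stated here as an assumption rather than the study's exact procedure.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(2)
x = rng.uniform(1, 10, size=300)
y = 1 + 2 * x + rng.normal(scale=x)             # strongly heteroscedastic errors
X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()
print("BP p-value before weighting:", het_breuschpagan(ols.resid, X)[1])   # small p-value -> heteroscedasticity

w = 1.0 / x**2
wls = sm.WLS(y, X, weights=w).fit()
resid_w = wls.resid * np.sqrt(w)                # weighted (rescaled) residuals
print("BP p-value after weighting:", het_breuschpagan(resid_w, X)[1])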
30

XU, YONG-LI, and DI-RONG CHEN. "LEARNING RATES OF REGULARIZED REGRESSION FOR FUNCTIONAL DATA." International Journal of Wavelets, Multiresolution and Information Processing 07, no. 06 (November 2009): 839–50. http://dx.doi.org/10.1142/s0219691309003288.

Full text source
Abstract:
The study of regularized learning algorithms is a very important issue, and functional data analysis extends classical methods. We establish the learning rates of the least squares regularized regression algorithm in a reproducing kernel Hilbert space for functional data. With the iteration method, we obtain a fast learning rate for functional data. Our result is a natural extension of the least squares regularized regression algorithm when the dimension of the input data is finite.
APA, Harvard, Vancouver, ISO, and other styles
31

Shu, Tuo, and Zhi-Xia Yang. "Least Square Support Tensor Regression Machine Based on Submatrix of the Tensor." Mathematical Problems in Engineering 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/3818949.

Full text source
Abstract:
For the tensor regression problem, a novel method, called the least square support tensor regression machine based on a submatrix of the tensor (LS-STRM-SMT), is proposed. LS-STRM-SMT is a method that can deal with the tensor regression problem more efficiently. First, we develop the least square support matrix regression machine (LS-SMRM) and propose a fixed point algorithm to solve it, and then LS-STRM-SMT for tensor data is proposed. Inspired by the relation between photochrome and gray pictures, we reformulate the tensor sample training set and form the new LS-STRM-SMT model for the tensor regression problem. With the introduction of projection matrices and another fixed point algorithm, we turn the LS-STRM-SMT model into several related LS-SMRM models, which are solved by the algorithm for LS-SMRM. Since the fixed point algorithm is used twice while solving the LS-STRM-SMT problem, we call the algorithm the dual fixed point algorithm (DFPA). Our method (LS-STRM-SMT) has been compared with several typical support tensor regression machines (STRMs). From a theoretical point of view, our algorithm has fewer parameters, and its computational complexity should be lower, especially when the rank of the submatrix K is small. The numerical experiments indicate that our algorithm has better performance.
APA, Harvard, Vancouver, ISO, and other styles
32

Lenggogeni, Sari. "Why Is The Partial Least Square Important For Tourism Studies." International Journal of Tourism, Heritage and Recreation Sport 1, no. 2 (December 30, 2019): 7–15. http://dx.doi.org/10.24036/ijthrs.v1i2.27.

Full text source
Abstract:
Although the multiple regression method has been applied in exploratory research in most tourism studies, there is a lack of understanding of studies that present a well-justified rationale for choosing a robust statistical tool for data analysis. This research note reviews why tourism researchers are encouraged to use the Partial Least Squares Structural Equation Modelling (PLS-SEM) method to address this research problem. The article provides the rationale, comparisons among techniques for multiple-regression-based papers, and suggestions for tourism researchers on why PLS-SEM is important for exploratory studies.
APA, Harvard, Vancouver, ISO, and other styles
33

Yanuar, Ferra. "The Simulation Study to Test the Performance of Quantile Regression Method With Heteroscedastic Error Variance." CAUCHY 5, no. 1 (November 30, 2017): 36. http://dx.doi.org/10.18860/ca.v5i1.4209.

Full text source
Abstract:
The purpose of this article is to describe the ability of the quantile regression method to cope with violations of the classical assumptions. The classical assumption violated in this study is non-homogeneous error variance, or heteroscedasticity. To achieve this goal, simulated data were generated under a particular distributional design. The study compares the models obtained by applying ordinary least squares and quantile regression to the same simulated data; the consistency of both methods was also compared through simulation studies. This study shows that the quantile regression method has smaller standard errors, confidence interval widths, and mean squared error (MSE) values than the ordinary least squares method. It can therefore be concluded that the quantile regression method is able to solve the problem of heteroscedasticity and produces a better model than ordinary least squares, whereas ordinary least squares is not able to solve the problem of heteroscedasticity.
APA, Harvard, Vancouver, ISO, and other styles
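Median (0.5-quantile) regression of the kind compared above can be run with statsmodels; the heteroscedastic simulated data below are an assumption for illustration only.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=500)
y = 1 + 2 * x + rng.normal(scale=0.2 + 0.3 * x)   # error spread increases with x
X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()
qreg = sm.QuantReg(y, X).fit(q=0.5)               # median (least absolute deviations) fit
print("OLS params:", ols.params, "se:", ols.bse)
print("QuantReg(0.5) params:", qreg.params, "se:", qreg.bse)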
34

Sadiq, Maryam, Alanazi Talal Abdulrahman, Randa Alharbi, Dalia Kamal Fathi Alnagar, and Syed Masroor Anwar. "Modeling the Ranked Antenatal Care Visits Using Optimized Partial Least Square Regression." Computational and Mathematical Methods in Medicine 2022 (March 14, 2022): 1–8. http://dx.doi.org/10.1155/2022/2868885.

Full text source
Abstract:
The frequency and timing of antenatal care visits are observed to be significant factors in infant and maternal morbidity and mortality. The present research determines the risk factors for reduced antenatal care visits using an optimized partial least squares regression model. A data set collected during 2017-2018 by the Pakistan Demographic and Health Surveys is used for modeling. Partial least squares regression models coupled with rank correlation measures are introduced for improved performance on a ranked response. The proposed models include PLS-ρs, PLS-τA, PLS-τB, PLS-τC, PLS-D, PLS-τGK, PLS-G, and PLS-U. Three filter-based factor selection methods are executed, and leave-one-out cross-validation by linear discriminant analysis is measured on the predicted scores of all models. Finally, a Monte Carlo simulation with 10 iterations of repeated sampling is applied to optimize validation performance and select the optimum model. The standard and proposed models are run on simulated and real data sets for efficiency comparison. PLS-ρs is found to be the most appropriate proposed method for modeling the observed ranked data set of antenatal care visits, based on validation performance. The optimal model selected 29 influential factors of inadequate use of antenatal care. The important factors in reduced antenatal care visits include women's educational status, wealth index, total children ever born, husband's education level, domestic violence, and history of cesarean section. The findings suggest that partial least squares regression algorithms coupled with rank correlation coefficients provide more efficient estimates for ranked data in the presence of multicollinearity.
APA, Harvard, Vancouver, ISO, and other styles
35

Prasetya, Rizka Pradita. "Unpacking Outlier with Weight Least Square (Implemented on Pepper Plantations Data)." Parameter: Journal of Statistics 2, no. 3 (January 4, 2023): 24–31. http://dx.doi.org/10.22487/27765660.2022.v2.i3.16138.

Full text source
Abstract:
Outliers in regression analysis can cause large residuals and greater variability, making the data heterogeneous. If an outlier is caused by an error in recording observations or in preparing equipment, it can be ignored or discarded before the data analysis is carried out. However, if outliers exist not because of researcher error but carry information that other data cannot provide, they cannot be ignored and must be included in the analysis. There are several methods for dealing with outliers. The Weighted Least Squares (WLS) method produces good results and is quite resistant to outliers. The WLS method is used to handle regression models with non-constant error variance, because WLS can neutralize the consequences of violating the normality assumption caused by outliers and can address the resulting loss of unbiasedness and consistency in the OLS estimate. To compare the accuracy of the estimators between regression models, the mean absolute percentage error (MAPE) is used. Based on the results of this study, it was concluded that the WLS method produced a smaller mean absolute percentage error, so that this method is more appropriate because it is not susceptible to the effect of outliers.
APA, Harvard, Vancouver, ISO, and other styles
36

Çankaya, Soner, Samet Eker, and Samet Hasan Abacı. "Comparison of Least Squares, Ridge Regression and Principal Component Approaches in the Presence of Multicollinearity in Regression Analysis." Turkish Journal of Agriculture - Food Science and Technology 7, no. 8 (August 9, 2019): 1166. http://dx.doi.org/10.24925/turjaf.v7i8.1166-1172.2515.

Full text source
Abstract:
The aim of this study was to compare three estimation methods for the parameters of a multiple regression model when the underlying assumptions of least squares estimation are untenable because of multicollinearity: the least squares method (LS), ridge regression (RR), and principal component regression (PCR). For this purpose, the effect of some body measurements (height at withers and rump, body length, chest width, chest girth and chest depth, and front, middle, and hind rump width) on the body weights of 85 Karayaka lambs at weaning, raised at the Research Farm of Ondokuz Mayis University, was examined. Mean square error, the R2 value, and the significance of the parameters were used to evaluate estimator performance. The multicollinearity between front and middle rump width, which were used to estimate live weight, was eliminated by using RR and PCR. Although the research findings showed that the RR method had the smallest MSE and the highest R2 value, the PCR estimates were found to be more consistent when the significance tests of the parameters were taken into account. The results suggest that the principal component regression approach should be used to estimate the live weight of Karayaka lambs at the weaning period.
APA, Harvard, Vancouver, ISO, and other styles
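The three estimators compared in the entry above (LS, RR, PCR) can be set up as follows with scikit-learn on a deliberately collinear toy design; the penalty strength and component count are placeholder choices, not those of the lamb data.

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
base = rng.normal(size=(85, 2))
X = np.hstack([base, base[:, [0]] + 0.01 * rng.normal(size=(85, 1))])  # third column nearly duplicates the first
y = X @ np.array([1.0, 0.5, 1.0]) + rng.normal(scale=0.3, size=85)

ls = LinearRegression().fit(X, y)
rr = Ridge(alpha=1.0).fit(X, y)                                        # shrinkage stabilizes the collinear coefficients
pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, y)
print("LS coef:", ls.coef_)
print("Ridge coef:", rr.coef_)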
37

Fransiska, Welly, Sigit Nugroho, and Ramya Rachmawati. "A Comparison of Weighted Least Square and Quantile Regression for Solving Heteroscedasticity in Simple Linear Regression." Journal of Statistics and Data Science 1, no. 1 (March 15, 2022): 19–29. http://dx.doi.org/10.33369/jsds.v1i1.21011.

Full text source
Abstract:
Regression analysis is the study of the relationship between a dependent variable and one or more independent variables. One of the important assumptions that must be fulfilled for the regression coefficient estimator to be the Best Linear Unbiased Estimator (BLUE) is homoscedasticity. If the homoscedasticity assumption is violated, the situation is called heteroscedasticity. The consequence of heteroscedasticity is that the estimators remain linear and unbiased, but they no longer have minimum variance, so the estimator is no longer BLUE. The purpose of this study is to analyze and resolve violations of the homoscedasticity assumption with Weighted Least Squares (WLS) and quantile regression. Based on the comparison between WLS and quantile regression, the most precise method for overcoming heteroscedasticity in this research is the WLS method, because it produces a greater coefficient of determination (98%).
APA, Harvard, Vancouver, ISO, and other styles
38

Yan, Cheng, Xiuli Shen, and Fushui Guo. "An improved support vector regression using least squares method." Structural and Multidisciplinary Optimization 57, no. 6 (December 13, 2017): 2431–45. http://dx.doi.org/10.1007/s00158-017-1871-5.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
39

González, Javier, Daniel Peña, and Rosario Romera. "A robust partial least squares regression method with applications." Journal of Chemometrics 23, no. 2 (February 2009): 78–90. http://dx.doi.org/10.1002/cem.1195.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
40

Nisa, Khoirin, and Netti Herawati. "Robust Estimation of Generalized Estimating Equation when Data Contain Outliers." INSIST 2, no. 1 (March 22, 2017): 1. http://dx.doi.org/10.23960/ins.v2i1.23.

Full text source
Abstract:
In this paper, a robust procedure is proposed for estimating the parameters of a regression model when generalized estimating equations (GEE) are applied to longitudinal data that contain outliers. The method is called iteratively reweighted least trimmed squares (IRLTS), a combination of the iteratively reweighted least squares (IRLS) and least trimmed squares (LTS) methods. To assess the proposed method, a simulation study was conducted, and the results show that the method is robust against outliers. Keywords: GEE, IRLS, LTS, longitudinal data, regression model.
APA, Harvard, Vancouver, ISO, and other styles
41

Akhgari, Omid, and Mousa Golalizadeh. "The Improve of Two Stage Least Square Method in Regression Model with Endogenous Variables." Journal of Statistical Sciences 10, no. 2 (March 1, 2017): 203–20. http://dx.doi.org/10.18869/acadpub.jss.10.2.203.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
42

Zhou, Huijuan, Yong Qin, and Yinghong Li. "A Partial Least Square Based Support Vector Regression Rail Transit Passenger Flow Prediction Method." International Journal of u- and e-Service, Science and Technology 7, no. 2 (April 30, 2014): 101–12. http://dx.doi.org/10.14257/ijunesst.2014.7.2.10.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
43

Wang, Kuaini, Jingjing Zhang, Yanyan Chen, and Ping Zhong. "Least Absolute Deviation Support Vector Regression." Mathematical Problems in Engineering 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/169575.

Full text source
Abstract:
Least squares support vector machine (LS-SVM) is a powerful tool for pattern classification and regression estimation. However, LS-SVM is sensitive to large noises and outliers since it employs the squared loss function. To solve the problem, in this paper, we propose an absolute deviation loss function to reduce the effects of outliers and derive a robust regression model termed as least absolute deviation support vector regression (LAD-SVR). The proposed loss function is not differentiable. We approximate it by constructing a smooth function and develop a Newton algorithm to solve the robust model. Numerical experiments on both artificial datasets and benchmark datasets demonstrate the robustness and effectiveness of the proposed method.
APA, Harvard, Vancouver, ISO, and other styles
44

Marvuglia, Antonino, Maurizio Cellura, and Marcello Pucci. "A Generalization of the Orthogonal Regression Technique for Life Cycle Inventory." International Journal of Agricultural and Environmental Information Systems 3, no. 1 (January 2012): 51–71. http://dx.doi.org/10.4018/jaeis.2012010105.

Full text source
Abstract:
Life cycle assessment (LCA) is a method used to quantify the environmental impacts of a product, process, or service across its whole life cycle. One of the problems occurring when the system at hand involves processes delivering more than one valuable output is the apportionment of resource consumption and environmental burdens in the correct proportion amongst the products. The mathematical formulation of the problem is represented by the solution of an over-determined system of linear equations. The paper describes the application of an iterative algorithm for the implementation of least square regression to solve this over-determined system directly in its rectangular form. The applied algorithm dynamically passes from an Ordinary Least Squares (OLS) problem to the regression problems known as Total Least Squares (TLS) and Data Least Squares (DLS). The obtained results suggest further investigations. In particular, the so called constrained least squares method is identified as an interesting development of the methodology.
APA, Harvard, Vancouver, ISO, and other styles
45

Yassen, Mansour F., Fuad S. Al-Duais, and Mohammed M. A. Almazah. "Ridge Regression Method and Bayesian Estimators under Composite LINEX Loss Function to Estimate the Shape Parameter in Lomax Distribution." Computational Intelligence and Neuroscience 2022 (August 29, 2022): 1–10. http://dx.doi.org/10.1155/2022/1200611.

Full text source
Abstract:
In this paper, the ridge regression method is employed to estimate the shape parameter of the Lomax distribution (LD). In addition, both classical and Bayesian approaches are considered with several loss functions: the squared error loss function (SELF), the linear exponential loss function (LLF), and the composite linear exponential loss function (CLLF). For the Bayesian estimators, informative and noninformative priors are used to estimate the shape parameter. To examine the performance of the ridge regression method, we compared it with classical estimators, including maximum likelihood, ordinary least squares, the uniformly minimum variance unbiased estimator, and the median method, as well as with the Bayesian estimators. A Monte Carlo simulation compares these estimators with respect to the mean square error (MSE) criterion. The simulation results indicate that the ridge regression method is promising and can be used in real applications, where it showed better performance than the ordinary least squares method for estimating the shape parameter.
APA, Harvard, Vancouver, ISO, and other styles
46

Silalahi, Derisman, and Edison Hulu. "INDIKATOR KOLEKTIBILITAS KREDIT JOINT FINANCING MENGGUNAKAN OLS & LOGIT." Jurnal Ilmu Keuangan dan Perbankan (JIKA) 11, no. 1 (December 30, 2021): 106–23. http://dx.doi.org/10.34010/jika.v11i1.5918.

Full text source
Abstract:
This study aims to determine the indicators that have a significant effect on debtor collectability. The sample consists of 102 joint financing debtors whose loans were realized in 2019, in two groups of 51 debtors with current and non-current collectability. The analytical methods used are the ordinary least squares method and the logit regression method; the combination of these two analytical methods in a single study was not found in previous research. The analysis using ordinary least squares shows three (3) significant indicators that affect collectability, namely term of the loan, value of collateral, and monthly liabilities, at a significance level of 0.05. The binary logit analysis yields four (4) significant indicators, namely term of the loan, value of collateral, and monthly liabilities at a significance level of 0.05, and level of job risk at a significance level of 0.10. The R-squared value in the ordinary least squares model is 41%, meaning that the ten indicators in this study simultaneously explain 41% of the variation in collectability, while 59% is influenced by other variables not included in this study. The logit regression analysis shows an R-squared value of 39%, i.e. the ten indicators together explain debtor collectability by 39%.
APA, Harvard, Vancouver, ISO, and other styles
47

Dano Pati, Kafi. "Estimate The Parameters in Presence of Multicollinearity And Outliers Using Bisquare Weighted Ridge Least Median Squares Regression (wrlms)." Journal Of Duhok University 23, no. 2 (December 9, 2020): 9–24. http://dx.doi.org/10.26682/sjuod.2020.23.2.2.

Full text source
Abstract:
The presence of multicollinearity and outliers are classical problems for data within the linear regression framework. We present a new method that is a potential candidate for robust ridge regression as well as for robust detection of multicollinearity. The proposal arises as a logical combination of the principles used in ridge regression and the bisquare weight function, and the Least Median of Squares (LMS) technique is used to overcome the resulting regression problems. This paper investigates the lack of resistance of Ordinary Least Squares (OLS) to multicollinearity and outliers and proposes the use of robust regression, for instance Least Median of Squares (LMS), to detect non-normality of residuals; the use of robust methods yields more reliable trend estimation and outlier detection. LMS is introduced as a robust regression technique, and its effect on regression is discussed through a medical application. A numerical example and a simulation study show that the Weighted Ridge Least Median Squares (WRLMS) estimator outperforms the other estimators in terms of efficiency, as measured by the standard error (SE) for the numerical example and the root mean squared error criterion for the simulation study, over many combinations of error distributions and degrees of multicollinearity.
APA, Harvard, Vancouver, ISO, and other styles
48

Yanke, Aldino, Nofrida Elly Zendrato, and Agus M. Soleh. "Handling Multicollinearity Problems in Indonesia's Economic Growth Regression Modeling Based on Endogenous Economic Growth Theory." Indonesian Journal of Statistics and Its Applications 6, no. 2 (August 31, 2022): 228–44. http://dx.doi.org/10.29244/ijsa.v6i2p214-230.

Full text source
Abstract:
One application of multiple linear regression in economics is modeling Indonesia's economic growth based on the theory of endogenous economic growth. Endogenous growth theory is a development of classical theory, which cannot explain how the economy grows in the long run. The regression model based on endogenous growth theory uses many independent variables, which causes multicollinearity problems. In this study, a multiple linear regression model using the least squares estimation method and several methods for handling the multicollinearity problem were implemented: variable selection methods (backward, forward, and stepwise), principal component regression (PCR), partial least squares (PLS), and regularization methods (ridge, lasso, and elastic net). The variable selection methods (backward, forward, and stepwise) were not able to overcome the multicollinearity problem, whereas principal component regression, PLS regression, and the regularization methods did. We used leave-one-out cross-validation (LOOCV) to determine the best method for handling multicollinearity, based on the smallest mean squared error (MSE). On that basis, the best method for overcoming multicollinearity in the economic growth model based on endogenous growth theory was the lasso regression method.
APA, Harvard, Vancouver, ISO, and other styles
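The LOOCV-based selection among regularized models described above can be sketched with scikit-learn; the synthetic predictors and candidate penalties are stand-ins for the economic-growth variables, not the study's data.

import numpy as np
from sklearn.linear_model import LassoCV, ElasticNetCV
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 10))
X[:, 5] = X[:, 0] + 0.01 * rng.normal(size=40)     # induce multicollinearity between two predictors
y = 2 * X[:, 0] - X[:, 2] + rng.normal(scale=0.5, size=40)

loo = LeaveOneOut()
lasso = LassoCV(cv=loo).fit(X, y)                  # penalty chosen by leave-one-out cross-validation
enet = ElasticNetCV(cv=loo, l1_ratio=[0.2, 0.5, 0.8]).fit(X, y)
print("Lasso alpha:", lasso.alpha_, "coefficients:", lasso.coef_)
print("Elastic net alpha:", enet.alpha_, "l1_ratio:", enet.l1_ratio_)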
49

UTAMI, NI KETUT TRI, and I. KOMANG GDE SUKARSA. "PENERAPAN METODE GENERALIZED RIDGE REGRESSION DALAM MENGATASI MASALAH MULTIKOLINEARITAS." E-Jurnal Matematika 2, no. 1 (January 30, 2013): 54. http://dx.doi.org/10.24843/mtk.2013.v02.i01.p029.

Full text source
Abstract:
Ordinary least squares is a parameter estimation method for linear regression analysis that minimizes the residual sum of squares. In the presence of multicollinearity, estimators that are unbiased and have minimum variance cannot be obtained. Multicollinearity refers to a situation in which the regressor variables are highly correlated. Generalized ridge regression is an alternative method for dealing with the multicollinearity problem: different biasing parameters for each regressor variable are added to the least squares equations after transforming the data to the space of orthogonal regressors. The analysis showed that generalized ridge regression was satisfactory for overcoming multicollinearity.
APA, Harvard, Vancouver, ISO, and other styles
50

IRAWAN, I. PUTU EKA, I. KOMANG GDE SUKARSA, and NI MADE ASIH. "PENERAPAN METODE LEAST MEDIAN SQUARE-MINIMUM COVARIANCE DETERMINANT (LMS-MCD) DALAM REGRESI KOMPONEN UTAMA." E-Jurnal Matematika 2, no. 4 (November 29, 2013): 6. http://dx.doi.org/10.24843/mtk.2013.v02.i04.p051.

Full text source
Abstract:
Principal component regression is a technique for overcoming multicollinearity by combining principal component analysis with regression analysis. The calculation of classical principal component analysis is based on the ordinary covariance matrix, which is optimal if the data come from a multivariate normal distribution but is very sensitive to the presence of outliers. The Least Median of Squares-Minimum Covariance Determinant (LMS-MCD) method is used as an alternative to overcome this problem. The purpose of this research is to compare Principal Component Regression (RKU) and the Least Median of Squares-Minimum Covariance Determinant (LMS-MCD) method in dealing with outliers. In this study, the LMS-MCD method produced parameter estimates with smaller bias and mean square error (MSE) than RKU. Based on the difference of the parameter estimators, however, a test still shows a larger difference in parameter estimators for the LMS-MCD method than for the RKU method.
APA, Harvard, Vancouver, ISO, and other styles