Academic literature on the topic 'Trimmed Least Square'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Trimmed Least Square.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Trimmed Least Square"

1

Cao, Hui Rong, and Fu Chang Wang. "Integer-Coded Genetic Algorithm for Trimmed Estimator of Multivariate Linear Errors in Variables Model." Advanced Materials Research 457-458 (January 2012): 1223–29. http://dx.doi.org/10.4028/www.scientific.net/amr.457-458.1223.

Abstract:
The multivariate linear errors-in-variables (EIV) model is frequently used in computer vision for model fitting tasks. As is well known, when the sample data are contaminated by a large number of awkwardly placed outliers, the least squares estimator is not robust. To obtain robust estimators of the multivariate linear EIV model, orthogonal least trimmed squares and orthogonal least trimmed absolute deviation estimators based on a subset of h cases (out of n) are proposed. However, these robust estimators, which possess the exact fit property, are NP-hard to compute. To tackle this problem, an integer-coded genetic algorithm that is applicable to trimmed estimators is presented. Trimmed estimates of the multivariate linear EIV model on real data are provided, and the results show that the integer-coded genetic algorithm is correct and effective.
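For orientation, the trimmed criterion behind all of the estimators above minimizes the sum of the h smallest squared residuals rather than all n of them. The following NumPy sketch illustrates that generic idea for ordinary linear regression only; it is not the orthogonal trimmed estimators or the integer-coded genetic algorithm of this paper, and the subsampling scheme, the function name lts_fit, and the default h = ceil(0.75 n) are assumptions made for the example.

```python
import numpy as np

def lts_fit(X, y, h=None, n_starts=200, n_csteps=10, seed=0):
    """Approximate least trimmed squares: minimize the sum of the h
    smallest squared residuals via random starts plus concentration steps."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    if h is None:
        h = int(np.ceil(0.75 * n))                        # trimming constant (assumed)
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        subset = rng.choice(n, size=p, replace=False)     # random elemental start
        beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
        for _ in range(n_csteps):                         # concentration steps
            keep = np.argsort((y - X @ beta) ** 2)[:h]    # h best-fitting cases
            beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta, best_obj

# toy comparison on a line contaminated by a cluster of outliers
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, 100)
y[:15] += 25.0                                            # gross outliers
X = np.column_stack([np.ones_like(x), x])
print("LTS :", lts_fit(X, y)[0])
print("OLS :", np.linalg.lstsq(X, y, rcond=None)[0])
```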
2

Nisa, Khoirin, and Netti Herawati. "Robust Estimation of Generalized Estimating Equation when Data Contain Outliers." INSIST 2, no. 1 (2017): 1. http://dx.doi.org/10.23960/ins.v2i1.23.

Abstract:
In this paper, a robust procedure is proposed for estimating the parameters of a regression model when a generalized estimating equation (GEE) is applied to longitudinal data that contain outliers. The method, called 'iteratively reweighted least trimmed square' (IRLTS), is a combination of the iteratively reweighted least squares (IRLS) and least trimmed squares (LTS) methods. To assess the proposed method, a simulation study was conducted, and the results show that the method is robust against outliers. Keywords: GEE, IRLS, LTS, longitudinal data, regression model.
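As a rough, hedged illustration of how reweighting and trimming can be combined (a generic sketch, not the authors' GEE-based IRLTS estimator), the snippet below alternates a weighted least squares fit, Huber-type reweighting, and zeroing of the weights of the largest residuals; the weight function, the trimming fraction, and the name irls_trimmed are assumptions.

```python
import numpy as np

def irls_trimmed(X, y, trim=0.2, n_iter=20):
    """Illustrative sketch: weighted least squares with Huber-type
    reweighting plus trimming of the worst-fitting observations."""
    n = X.shape[0]
    h = int(np.floor((1 - trim) * n))                  # observations kept (assumed)
    w = np.ones(n)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
        r = y - X @ beta                               # residuals on all observations
        s = np.median(np.abs(r)) / 0.6745 + 1e-12      # robust scale (MAD)
        w = np.minimum(1.0, 1.345 * s / (np.abs(r) + 1e-12))   # Huber weights
        w[np.argsort(np.abs(r))[h:]] = 0.0             # trim the n - h largest residuals
    return beta
```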
3

Abdi, Hamdan, Sajaratud Dur, Rina Widyasar, and Ismail Husein. "Analysis of Efficiency of Least Trimmed Square and Least Median Square Methods in The Estimation of Robust Regression Parameters." ZERO: Jurnal Sains, Matematika dan Terapan 4, no. 1 (2020): 21. http://dx.doi.org/10.30829/zero.v4i1.7933.

Abstract:
Robust regression is a regression method used when the distribution of the residuals is not reasonable or there are outliers in the observational data that affect the model. One method for estimating regression parameters is the Least Squares Method (MKT), which is easily affected by the presence of outliers. Therefore, an alternative method that is robust to outliers is needed, namely robust regression. Methods for estimating robust regression parameters include Least Trimmed Squares (LTS) and Least Median Squares (LMS). These methods are estimators with high breakdown points for outlying observations and have more efficient algorithms than other estimation methods. This study aims to compare the regression models formed by the LTS and LMS methods, determine the efficiency of the models formed, and determine the factors that influence smallholder oil palm production in Langkat District in 2018. The results showed that the estimated regression models gave the same results in testing. Based on the comparison of the efficiency of the estimators and the squared error values, it was concluded that the LTS method is more efficient, and that land area and productivity are the factors that influence smallholder palm oil production in Langkat District in 2018.
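The two criteria compared in this study differ only in how squared residuals are aggregated: LMS minimizes their median, while LTS minimizes the sum of the h smallest. The sketch below spells out both objectives together with a crude elemental-subset search; the search strategy and function names are assumptions for illustration, not the computations performed by the authors.

```python
import numpy as np

def lms_objective(beta, X, y):
    """Least median of squares criterion: the median squared residual."""
    return np.median((y - X @ beta) ** 2)

def lts_objective(beta, X, y, h):
    """Least trimmed squares criterion: sum of the h smallest squared residuals."""
    return np.sort((y - X @ beta) ** 2)[:h].sum()

def elemental_search(objective, X, y, n_trials=2000, seed=0):
    """Crude search: fit OLS on random p-point subsets and keep the
    candidate with the smallest robust objective on the full data."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best, best_val = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(n, size=p, replace=False)
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        val = objective(beta, X, y)
        if val < best_val:
            best, best_val = beta, val
    return best, best_val

# usage sketch, with h the number of retained observations (e.g. 75% of n):
# beta_lms, _ = elemental_search(lms_objective, X, y)
# beta_lts, _ = elemental_search(lambda b, X, y: lts_objective(b, X, y, h), X, y)
```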
4

Shodiqin, Ali, Aurora Nur Aini, and Maya Rini Rubowo. "Perbanding Dua Metode Regresi Robust yakni Metode Least Trimmed Squares (LTS) dengan metode Estimator-MM (Estmasi-MM) (Studi Kasus Data Ujian Tulis Masuk Terhadap Hasil IPK Mahasiswa UPGRIS)." Jurnal Ilmiah Teknosains 4, no. 1 (2018): 35. http://dx.doi.org/10.26877/jitek.v4i1.2403.

Abstract:
Multiple linear regression is a statistical method used to model and investigate the relationship between one dependent variable and two or more independent variables. Ordinary Least Squares (OLS) is the method most often used to estimate the parameters of a regression model, but it has a weakness when outliers are present in the data: the OLS estimator is not robust to outliers, so its estimates become unsuitable even when only a single outlier is present. Robust regression is an important tool for analyzing data in which outliers have been detected. The aims of this study are to show how outliers disturb the linear regression equation, to obtain robust regression estimates with the LTS (Least Trimmed Squares) estimator, to obtain robust regression estimates with the MM estimator (MM-Estimation), and to compare the two robust regression estimators by examining the values and residuals of each method. The data used in this study are the entrance examination scores of new students of the Mathematics Education Study Program at Universitas PGRI Semarang; they are discrete data comprising three variables, namely the test score (X1) and the psychological test score (X2) as independent variables and GPA (Y) as the dependent variable. Before the robust regression analysis, outlier detection was carried out to identify whether outliers were present, using several methods including the boxplot, Cook's Distance, and DfFIT (Difference in Fit, Standardized). The first method, robust Least Trimmed Squares (LTS) regression, produced a regression model and a value of 0.127, while the MM-Estimation method produced an equation and a value of 0.89304. Robust regression is an appropriate method for parameter estimation, and the Least Trimmed Squares (LTS) estimator is more efficient than the MM-estimation method based on the value and residual criteria; this is due to the trimming of data with large residuals.
5

Dornaika, Fadi, and Angel Sappa. "Instantaneous 3D motion from image derivatives using the Least Trimmed Square regression." Pattern Recognition Letters 30, no. 5 (2009): 535–43. http://dx.doi.org/10.1016/j.patrec.2008.12.006.

6

Li, Xingfeng, Damien Coyle, Liam Maguire, and Thomas Martin McGinnity. "A Least Trimmed Square Regression Method for Second Level fMRI Effective Connectivity Analysis." Neuroinformatics 11, no. 1 (2012): 105–18. http://dx.doi.org/10.1007/s12021-012-9168-8.

7

Dakhli, Abdesselem, Wajdi Bellil, and Chokri Ben Amar. "Wavelet Neural Networks for DNA Sequence Classification Using the Genetic Algorithms and the Least Trimmed Square." Procedia Computer Science 96 (2016): 418–27. http://dx.doi.org/10.1016/j.procs.2016.08.088.

8

Kalina, Jan, and Jan Tichavský. "On Robust Estimation of Error Variance in (Highly) Robust Regression." Measurement Science Review 20, no. 1 (2020): 6–14. http://dx.doi.org/10.2478/msr-2020-0002.

Abstract:
The linear regression model requires robust estimation of parameters if the measured data are contaminated by outlying measurements (outliers). While a number of robust estimators (i.e. resistant to outliers) have been proposed, this paper is focused on estimating the variance of the random regression errors. We particularly focus on the least weighted squares estimator, for which we review its properties and propose new weighting schemes together with corresponding estimates for the variance of disturbances. An illustrative example revealing the idea of the estimator to down-weight individual measurements is presented. Further, two numerical simulations presented here allow a comparison of various estimators. They verify that the theoretical results for the least weighted squares are meaningful. MM-estimators turn out to yield the best results in the simulations in terms of both accuracy and precision. The least weighted squares (with suitable weights) remain only slightly behind in terms of the mean square error and are able to outperform the much more popular least trimmed squares estimator, especially for smaller sample sizes.
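The least weighted squares estimator discussed above assigns each observation a weight determined by the rank of its squared residual, with the best-fitting observations weighted most. A minimal iterative sketch of that idea, assuming a linearly decreasing weight function (the paper's actual weighting schemes and the associated variance estimates are not reproduced here):

```python
import numpy as np

def lws_fit(X, y, n_iter=15):
    """Illustrative least weighted squares: weights are assigned by the rank
    of each squared residual (smallest residual -> largest weight) and the
    weighted fit is iterated from an OLS start."""
    n = X.shape[0]
    rank_weights = np.linspace(1.0, 0.0, n)          # linearly decreasing (assumed)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS starting point
    for _ in range(n_iter):
        order = np.argsort((y - X @ beta) ** 2)      # observations by squared residual
        w = np.empty(n)
        w[order] = rank_weights                      # weight by rank
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
    return beta
```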
9

Khamis, Azme, Nur Azreen Abdul Razak, and Mohd Asrul Affendi Abdullah. "A robust vector autoregressive model for forecasting economic growth in Malaysia." Malaysian Journal of Fundamental and Applied Sciences 14, no. 3 (2018): 382–85. http://dx.doi.org/10.11113/mjfas.v14n3.1021.

Abstract:
Economic indicators measure how solid or strong a country's economy is; economic growth can be measured using these indicators, as they give an account of the quality or shortcomings of an economy. The vector autoregressive (VAR) method is commonly used for forecasting economic growth involving a large number of economic indicators. However, problems arise when its parameters are estimated using the least squares method, which is very sensitive to the existence of outliers. Thus, the aim of this study is to propose the best method for dealing with outlying data so that the forecasting result is not biased. The data used in this study are monthly economic indicators from January 1998 to January 2016. Two approaches are considered: a filtering technique via least median squares (LMS), least trimmed squares (LTS), and least quartile difference (LQD), and an imputation technique via the mean and median. Using the mean absolute percentage error (MAPE) as the forecasting performance measure, this study concludes that robust VAR with LQD filtering is a more appropriate model compared to the other models.
10

Matdoan, Muhammad Yahya. "PEMODELAN REGRESI ROBUST LEAST TRIMMED SQUARE (LTS) (Studi Kasus : Faktor -Faktor yang Mempengaruhi Penyebaran Penyakit Malaria di Indonesia)." Euclid 7, no. 2 (2020): 77. http://dx.doi.org/10.33603/e.v7i2.2926.


Dissertations / Theses on the topic "Trimmed Least Square"

1

Oliveira, Pedro Rodrigues de. "Um estudo dos determinantes da confiança interpessoal e seu impacto no crescimento econômico." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/96/96131/tde-28042008-172141/.

Abstract:
In the 1990s a large number of works came out investigating the effects of interpersonal trust on the economic growth of countries. Theoretically, trust affects economic growth by affecting all decisions that involve uncertainty about the future actions of other agents, such as investment, the hiring of employees, and innovation, among others. This study uses the methodology current in this literature, assessing the importance of trust for economic growth in a cross-section of countries over three periods, using information mainly from the Penn World Tables, the World Values Survey, and educational data from UNESCO. Applying the least trimmed squares technique, the robustness of the trust variable is evaluated when influential observations are excluded. A remarkable estimated effect of trust on economic growth is found, even when outliers are removed. Some exercises are also carried out to correct for possible endogeneity problems of the trust variable. Moreover, the work analyses the determinants of individual trust, using a probit model whose regressors include income, schooling, age, country, and religion, among others. This analysis is also applied to the Brazilian case. It is found that trust depends more on the society or group than on individual characteristics and, for the Brazilian case, it was observed that, regardless of gender, schooling, or income level, people do not trust each other.
2

Can, Mutan Oya. "Comparison Of Regression Techniques Via Monte Carlo Simulation." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12605175/index.pdf.

Abstract:
The ordinary least squares (OLS) method is one of the most widely used methods for modelling the functional relationship between variables. However, this estimation procedure relies on several assumptions, and the violation of these assumptions may lead to nonrobust estimates. In this study, the simple linear regression model is investigated for conditions in which the distribution of the error terms is Generalised Logistic. Some robust and nonparametric methods such as modified maximum likelihood (MML), least absolute deviations (LAD), Winsorized least squares, least trimmed squares (LTS), Theil and weighted Theil are compared via computer simulation. In order to evaluate estimator performance, the mean, variance, bias, mean square error (MSE) and relative mean square error (RMSE) are computed.
3

Hu, Guan-Yi (胡冠儀). "Least Trimmed Square Support Vector Machine Regression and Its Applications." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/559zua.

Abstract:
Master's thesis, National Formosa University, Graduate Institute of Electro-Optical and Materials Science, ROC academic year 99.
Many machine learning algorithms have been developed since the idea of artificial intelligence was proposed, and in recent years the support vector machine (SVM) has been among the most commonly used. Hence, much literature on support vector machine regression (SVMR) and least squares support vector machine regression (LS-SVMR) can be found in well-known journals. In this thesis, to address the robustness problem of LS-SVMR, we propose the least trimmed squares support vector machine regression (LTS-SVMR), a hybrid of the least trimmed squares (LTS) method and LS-SVMR, which is itself an improvement of SVMR. The literature has pointed out that when the training sample contains outliers, the LTS method can effectively remove the outlying points; that is, the robustness of LS-SVMR is enhanced by combining LS-SVMR with LTS. However, the LTS method has one major drawback: choosing a suitable initial function makes the computation very large. For this problem we propose three methods of choosing a suitable initial function. In the first method, an initial function is obtained by performing one LS-SVMR estimation before the trimming process. In the second method, an optimal initial function is picked from randomly generated training subsamples before the trimming process. In the third method, an optimal initial function is obtained by the simulated annealing (SA) algorithm before the trimming process. In addition, in order to reduce the complex computation, we propose the locally linear embedding least trimmed squares support vector machine regression (LLE-LTS-SVMR), which combines our first LTS-SVMR method with locally linear embedding (LLE), a dimensionality reduction algorithm. Finally, experimental results show that the three LTS-SVMR methods can improve the low robustness of LS-SVMR, that the SA-based LTS-SVMR is more reliable than the other LTS-SVMR methods, and that LLE-LTS-SVMR can reduce the large computation of the first LTS-SVMR method while keeping good modeling ability.
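To make the trimming idea concrete, here is a hedged sketch of LS-SVM regression in its standard dual form (RBF kernel) combined with repeated refitting on the samples whose squared residuals are smallest. It is a generic illustration only, not the thesis's initial-function selection, simulated annealing, or LLE variants, and the hyperparameter values and function names are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def ls_svm_fit(X, y, gamma, sigma):
    """Standard LS-SVM regression in dual form: solve the linear system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]                           # alpha, b

def ls_svm_predict(Xtrain, alpha, b, Xnew, sigma):
    return rbf_kernel(Xnew, Xtrain, sigma) @ alpha + b

def trimmed_ls_svm(X, y, keep_frac=0.8, n_iter=5, gamma=10.0, sigma=1.0):
    """Illustrative trimmed LS-SVM regression: repeatedly refit on the
    h samples with the smallest squared residuals (a concentration-type loop)."""
    n = X.shape[0]
    h = int(keep_frac * n)                           # samples kept (assumed fraction)
    keep = np.arange(n)
    for _ in range(n_iter):
        alpha, b = ls_svm_fit(X[keep], y[keep], gamma, sigma)
        resid2 = (ls_svm_predict(X[keep], alpha, b, X, sigma) - y) ** 2
        keep = np.sort(np.argsort(resid2)[:h])
    alpha, b = ls_svm_fit(X[keep], y[keep], gamma, sigma)   # final fit on kept set
    return keep, alpha, b
```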
4

Lee, Heng-Wei (李恆瑋). "Improved Least Trimmed Square Support Vector Machine Regression for Biological Systems Modeling and Its Application on Smart Phone." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/cq3f36.

Abstract:
Master's thesis, National Formosa University, Institute of Computer Science and Information Engineering, ROC academic year 102.
Machine learning approaches have developed rapidly in recent years, and the Least Squares Support Vector Machine (LS-SVM) is among the more commonly used machine learning algorithms. In this thesis, we first improve the least trimmed squares (LTS) method by avoiding sorting so as to reduce computation time. Because LTS can exclude outlying sample data, it can enhance the robustness of LS-SVMR; hence, an improved LTS-SVMR based on sort-free LTS and statistical analysis is proposed in this thesis. However, the selection of the initial function affects the predicted results, so different initial-function methods for the improved LTS-SVMR are also proposed for modeling with noisy data. The second part of this thesis proposes an asymmetric LTS-SVMR (ALTS-SVMR) approach to remove asymmetric noise and to make predictions under asymmetric noise; that is, we apply the Box-Cox transformation to the improved LTS-SVMR approaches so that they can deal with asymmetric noise. Finally, we apply the proposed methods to the modeling of biological systems and implement them on Android smartphone systems. Keywords: noise and outliers, robust, least trimmed squares, systems biology, asymmetric, smart phone, Box-Cox transformation.
5

Cheng, Wen-Chin (鄭文欽). "Study on Least Trimmed Squares Artificial Neural Networks." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/uku536.

Abstract:
Master's thesis, National Sun Yat-sen University, Department of Electrical Engineering, ROC academic year 96.
In this thesis, we study least trimmed squares artificial neural networks (LTS-ANNs), which are a generalization of the least trimmed squares (LTS) estimators frequently used in robust linear parametric regression problems to nonparametric artificial neural networks (ANNs) used for nonlinear regression problems. Two training algorithms are proposed in this thesis. The first algorithm is an incremental gradient descent algorithm. In order to speed up convergence, the second training algorithm is based on recursive least squares (RLS). Three illustrative examples are provided to test the robustness against outliers of the classical ANNs and the LTS-ANNs. Simulation results show that upon proper selection of the trimming constant of the learning machines, LTS-ANNs are quite robust against outliers compared with the classical ANNs.
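A minimal sketch of the trimmed-loss idea behind LTS-ANNs, assuming a one-hidden-layer tanh network trained by plain batch gradient descent (not the incremental gradient descent or RLS-based algorithms proposed in the thesis); the architecture, learning rate, and trimming constant are assumptions.

```python
import numpy as np

def train_lts_ann(x, y, n_hidden=10, h_frac=0.8, lr=0.01, epochs=3000, seed=0):
    """Trimmed-squares training of a tiny one-hidden-layer tanh network:
    each epoch, only the h samples with the smallest squared errors
    contribute to the gradient, so gross outliers are ignored."""
    rng = np.random.default_rng(seed)
    X = x.reshape(-1, 1); Y = y.reshape(-1, 1)
    n = X.shape[0]
    h = int(h_frac * n)                               # trimming constant (assumed)
    W1 = rng.normal(0.0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        Z = np.tanh(X @ W1 + b1)                      # hidden layer
        r = Z @ W2 + b2 - Y                           # residuals
        keep = np.argsort((r ** 2).ravel())[:h]       # h best-fitting samples
        Zk, rk, Xk = Z[keep], r[keep], X[keep]
        gW2 = Zk.T @ rk; gb2 = rk.sum(axis=0)         # gradients of 0.5*sum(rk**2)
        dA = (rk @ W2.T) * (1.0 - Zk ** 2)            # back through tanh
        gW1 = Xk.T @ dA; gb1 = dA.sum(axis=0)
        W1 -= lr * gW1 / h; b1 -= lr * gb1 / h
        W2 -= lr * gW2 / h; b2 -= lr * gb2 / h
    return lambda t: (np.tanh(t.reshape(-1, 1) @ W1 + b1) @ W2 + b2).ravel()
```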
6

Li, Chun-yuan (利俊垣). "Trimmed Least Squares Estimation in the Errors-in-Variables Models." Thesis, 1993. http://ndltd.ncl.edu.tw/handle/04111685859856805454.

7

Wu, I.-lin (吳易霖). "A Study on Particle Swarm Optimization based Resistant Fractal Image Compression using Least Trimmed Squares." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/91232641875182701078.

Abstract:
Master's thesis, I-Shou University, Department of Information Engineering, ROC academic year 97.
Fractal image compression (FIC) is a lossy coding scheme. It possesses the advantages of high retrieved-image quality, zoom invariance, and a high compression ratio, and has recently been used in many applications in the fields of image reconstruction, watermarking, medical imaging, feature recognition, and so on. However, if a corrupted image is encoded by FIC, the quality of the retrieved image will be poor. Self-similarity and the partitioned iterated function system are the underlying ideas of FIC: in the encoding process we find the similarity between range blocks and domain blocks. In conventional FIC, least squares is used to calculate the contrast and brightness between a range block and a domain block. The least squares estimator is the best linear unbiased estimator under the assumptions of zero-mean, constant-variance, and uncorrelated errors, but, as is well known in regression theory, the linear regressor is sensitive to outliers. That is the reason why the quality of the retrieved image becomes poor. Robust regression is usually used against noise in regression theory. In this thesis, the resistant least trimmed squares (LTS) method from robust regression is proposed and embedded into the encoding procedure of FIC. Recursive weighted least squares, with its simple architecture and fast convergence, is used in this thesis. To effectively improve the encoding speed, particle swarm optimization (PSO) is utilized to reduce the search space.
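In fractal coding, a range block r is approximated by s*d + o, where d is a decimated domain block and s and o are the contrast and brightness parameters normally obtained by least squares. A hedged sketch of a trimmed version of that per-block fit is given below; the PSO search and the recursive weighted least squares described in the abstract are not reproduced, and the function name and kept-pixel fraction are assumptions.

```python
import numpy as np

def trimmed_contrast_brightness(domain, range_block, keep_frac=0.8, n_iter=10):
    """Trimmed least squares fit of the contrast s and brightness o mapping a
    (decimated) domain block onto a range block: iteratively refit on the
    pixels with the smallest squared errors of s*d + o - r."""
    d = np.asarray(domain, dtype=float).ravel()
    r = np.asarray(range_block, dtype=float).ravel()
    n = d.size
    h = int(keep_frac * n)                            # pixels kept (assumed fraction)
    keep = np.arange(n)                               # start from all pixels
    for _ in range(n_iter):
        A = np.column_stack([d[keep], np.ones(keep.size)])
        (s, o), *_ = np.linalg.lstsq(A, r[keep], rcond=None)
        keep = np.argsort((s * d + o - r) ** 2)[:h]   # concentrate on best pixels
    A = np.column_stack([d[keep], np.ones(h)])
    (s, o), *_ = np.linalg.lstsq(A, r[keep], rcond=None)
    return s, o
```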

Books on the topic "Trimmed Least Square"

1

Powell, James. Symmetrically trimmed least squares estimation for Tobit models. Dept. of Economics, Massachusetts Institute of Technology, 1985.

2

Baillo, Amparo, Antonio Cuevas, and Ricardo Fraiman. Classification methods for functional data. Edited by Frédéric Ferraty and Yves Romain. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199568444.013.10.

Abstract:
This article reviews the literature concerning supervised and unsupervised classification of functional data. It first explains the meaning of unsupervised classification vs. supervised classification before discussing the supervised classification problem in the infinite-dimensional case, showing that its formal statement generally coincides with that of discriminant analysis in the classical multivariate case. It then considers the optimal classifier and plug-in rules, empirical risk and empirical minimization rules, linear discrimination rules, the k nearest neighbor (k-NN) method, and kernel rules. It also describes classification based on partial least squares, classification based on reproducing kernels, and depth-based classification. Finally, it examines unsupervised classification methods, focusing on K-means for functional data, K-means for data in a Hilbert space, and impartial trimmed K-means for functional data. Some practical issues, in particular real-data examples and simulations, are reviewed and some selected proofs are given.
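Among the unsupervised methods surveyed here, (impartial) trimmed K-means discards a fixed fraction of the points farthest from their nearest centers before recomputing the centers. A minimal multivariate sketch under assumed parameter names (the chapter's functional-data version operates on curves in a Hilbert space rather than on vectors):

```python
import numpy as np

def trimmed_kmeans(X, k, trim=0.1, n_iter=50, seed=0):
    """Illustrative (impartial) trimmed k-means: each iteration keeps only
    the (1 - trim) fraction of points closest to their nearest center and
    recomputes the centers from those retained points."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    h = int(np.ceil((1 - trim) * n))                  # points retained
    centers = X[rng.choice(n, size=k, replace=False)].astype(float)
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # (n, k) distances
        nearest = d2.argmin(axis=1)
        keep = np.argsort(d2[np.arange(n), nearest])[:h]            # trim the farthest
        for j in range(k):
            members = keep[nearest[keep] == j]
            if members.size:                                        # avoid empty clusters
                centers[j] = X[members].mean(axis=0)
    return centers, keep
```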

Book chapters on the topic "Trimmed Least Square"

1

Dong, Yuhua, and Jifeng Ding. "The Evolution Strategy Implementation of the Least Trimmed Square Algorithm." In Advances in Intelligent and Soft Computing. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-30223-7_34.

2

Dornaika, Fadi, and Angel D. Sappa. "3D Motion from Image Derivatives Using the Least Trimmed Square Regression." In Advances in Machine Vision, Image Processing, and Pattern Analysis. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11821045_8.

3

JunShan, Li, Han XianFeng, Li Long, Li Kun, and Li JianJun. "A Edge Feature Matching Algorithm Based on Evolutionary Strategies and Least Trimmed Square Hausdorff Distance." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11881070_65.

4

Čížek, Pavel, and Jan Ámos Víšek. "Least Trimmed Squares." In XploRe® - Application Guide. Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/978-3-642-57292-0_2.

5

Beliakov, Gleb, Marek Gagolewski, and Simon James. "Least Median of Squares (LMS) and Least Trimmed Squares (LTS) Fitting for the Weighted Arithmetic Mean." In Communications in Computer and Information Science. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91476-3_31.

6

Pan, Lili, Mei Xie, Tao Zheng, and Jianli Ren. "A Robust Iris Localization Model Based on Phase Congruency and Least Trimmed Squares Estimation." In Image Analysis and Processing – ICIAP 2009. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04146-4_73.

7

"Least-Trimmed Square." In Encyclopedia of Genetics, Genomics, Proteomics and Informatics. Springer Netherlands, 2008. http://dx.doi.org/10.1007/978-1-4020-6754-9_9281.


Conference papers on the topic "Trimmed Least Square"

1

Shi, Haiyan, Zhongliang Jing, and H. Leung. "A constrained least square and trimmed least square method for multisensor data fusion." In Proceedings of 2003 International Conference on Neural Networks and Signal Processing. IEEE, 2003. http://dx.doi.org/10.1109/icnnsp.2003.1279413.

2

Chen, Guanghua, Qinghua Dai, Xiao Tang, and Zihao Xu. "An Improved Least Trimmed Square Hausdorff Distance Finger Vein Recognition." In 2018 5th International Conference on Systems and Informatics (ICSAI). IEEE, 2018. http://dx.doi.org/10.1109/icsai.2018.8599439.

3

Dakhli, Abdesselem, Maher Jbeli, and Chokri Ben Amar. "Functions Approximation using Multi Library Wavelets and Least Trimmed Square (LTS) Method." In 22nd International Conference on Enterprise Information Systems. SCITEPRESS - Science and Technology Publications, 2020. http://dx.doi.org/10.5220/0009802604680477.

4

Bai, Kun, and Yuehuan Wang. "A least trimmed square method for clutter removal in infrared small target detection." In Eighth International Symposium on Multispectral Image Processing and Pattern Recognition, edited by Tianxu Zhang and Nong Sang. SPIE, 2013. http://dx.doi.org/10.1117/12.2031405.

5

Chang, Jyh-Yeong, Shih-Hui Liao, and Chin-Teng Lin. "Adaptive least trimmed squares fuzzy neural network." In 2012 International Conference on Fuzzy Theory and it's Applications (iFUZZY). IEEE, 2012. http://dx.doi.org/10.1109/ifuzzy.2012.6409741.

6

Jeng, Jin-Tsong, Chi-Ta Chuang, and Chen-Chia Chuang. "Least trimmed squares based CPBUM neural networks." In 2011 International Conference on System Science and Engineering (ICSSE). IEEE, 2011. http://dx.doi.org/10.1109/icsse.2011.5961897.

7

Ji, Hong, Xiaohan Yang, and Badong Chen. "Trimmed diffusion least mean squares for distributed estimation." In 2015 IEEE International Conference on Digital Signal Processing (DSP). IEEE, 2015. http://dx.doi.org/10.1109/icdsp.2015.7251953.

8

Wu, Hsu-Kun, Jer-Guang Hsieh, and Ker-Wei Yu. "Study on least trimmed squares fuzzy neural networks." In 2010 IEEE International Conference on Intelligent Systems and Knowledge Engineering (ISKE). IEEE, 2010. http://dx.doi.org/10.1109/iske.2010.5680809.

9

Cheng, Zunping, and Neil Hurley. "Robust Collaborative Recommendation by Least Trimmed Squares Matrix Factorization." In 2010 22nd International Conference on Tools with Artificial Intelligence (ICTAI). IEEE, 2010. http://dx.doi.org/10.1109/ictai.2010.90.

10

Bao, Xin, and Liankui Dai. "Nonlinear robust modeling base on least trimmed squares regression." In 2008 7th World Congress on Intelligent Control and Automation. IEEE, 2008. http://dx.doi.org/10.1109/wcica.2008.4594542.


Reports on the topic "Trimmed Least Square"

1

Xu, Qianqian, Ming Yan, and Yuan Yao. Fast Adaptive Least Trimmed Squares for Robust Evaluation of Quality of Experience. Defense Technical Information Center, 2014. http://dx.doi.org/10.21236/ada610266.
