Academic literature on the topic 'Nearest Mean Classifier'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic 'Nearest Mean Classifier.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Nearest Mean Classifier"

1. Veenman, C. J., and M. J. T. Reinders. "The nearest subclass classifier: a compromise between the nearest mean and nearest neighbor classifier." IEEE Transactions on Pattern Analysis and Machine Intelligence 27, no. 9 (2005): 1417–29. http://dx.doi.org/10.1109/tpami.2005.187.

2. Chai, Jing, Hongwei Liu, Bo Chen, and Zheng Bao. "Large margin nearest local mean classifier." Signal Processing 90, no. 1 (2010): 236–48. http://dx.doi.org/10.1016/j.sigpro.2009.06.015.

3. Widyadhana, Arya, Cornelius Bagus Purnama Putra, Rarasmaya Indraswari, and Agus Zainal Arifin. "A Bonferroni Mean Based Fuzzy K Nearest Centroid Neighbor Classifier." Jurnal Ilmu Komputer dan Informasi 14, no. 1 (2021): 65–71. http://dx.doi.org/10.21609/jiki.v14i1.959.

Abstract: K-nearest neighbor (KNN) is an effective nonparametric classifier that determines the neighbors of a point based only on distance proximity. The classification performance of KNN is disadvantaged by the presence of outliers in small sample size datasets, and it deteriorates on datasets with class imbalance. We propose a local Bonferroni Mean based Fuzzy K-Nearest Centroid Neighbor (BM-FKNCN) classifier that assigns the class label of a query sample based on the nearest local centroid mean vector, to better represent the underlying statistics of the dataset. The proposed classifier is …
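
For orientation, the baseline that these centroid-style methods build on is the plain nearest mean (nearest centroid) classifier: estimate one mean per class from the training data and assign a query to the class of the closest mean. (BM-FKNCN above replaces the arithmetic local mean with the Bonferroni mean, BM^{p,q}(x_1, …, x_n) = ((1/(n(n−1))) Σ_{i≠j} x_i^p x_j^q)^{1/(p+q)}, and adds fuzzy memberships.) Below is a minimal NumPy sketch of that baseline rule; the function name and toy data are illustrative, not taken from any cited paper.

```python
import numpy as np

def nearest_mean_predict(X_train, y_train, X_test):
    """Assign each test point the label of the closest class mean."""
    classes = np.unique(y_train)
    # One mean (centroid) per class, estimated from the training data.
    means = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
    # Euclidean distance from every test point to every class mean.
    dists = np.linalg.norm(X_test[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]

X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
y = np.array([0, 0, 1, 1])
print(nearest_mean_predict(X, y, np.array([[0.5, 0.5], [5.5, 5.0]])))  # -> [0 1]
```
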
4. Shanmuganathan, M., et al. "ROBUST K-NEAREST NEIGHBOR CLASSIFIER AND NEAREST MEAN CLASSIFIER BASED INFORMATIVE KNOWLEDGE DISTILLATION FACE RECOGNITION." INFORMATION TECHNOLOGY IN INDUSTRY 9, no. 2 (2021): 418–24. http://dx.doi.org/10.17762/itii.v9i2.365.

Abstract: Human face identification procedures have made tremendous progress in the last decade. Nevertheless, identifying partially occluded faces is still challenging for present-day face identifiers, yet it is much needed in real-world applications such as surveillance and protection. Though great research effort has been devoted to developing face de-occlusion techniques, most of them work well only under constrained conditions. This manuscript proposes a Robust K-NNC (K-Nearest Neighbor Classifier) and NMC (Nearest Mean Classifier) (RKN…
5. Mehta, Sumet, Xiangjun Shen, Jianping Gou, and Dejiao Niu. "A New Nearest Centroid Neighbor Classifier Based on K Local Means Using Harmonic Mean Distance." Information 9, no. 9 (2018): 234. http://dx.doi.org/10.3390/info9090234.

Abstract: The K-nearest neighbour classifier is a very effective and simple non-parametric technique in pattern classification; however, it only considers the distance closeness, not the geometrical placement of the k neighbors. Also, its classification performance is highly influenced by the neighborhood size k and existing outliers. In this paper, we propose a new local mean based k-harmonic nearest centroid neighbor (LMKHNCN) classifier in order to consider both distance-based proximity and the spatial distribution of the k neighbors. In our method, firstly the k nearest centroid neighbors in each cl…
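
The local-mean family that this and several later entries extend admits a compact statement: for a query, take its k nearest training points within each class, average them into a per-class local mean, and assign the class of the closest local mean. The sketch below is this plain Euclidean variant (essentially the LMKNN rule of Mitani and Hamamoto), not the harmonic-mean centroid-neighbor scheme of the paper itself; names are illustrative.

```python
import numpy as np

def lmknn_predict(X_train, y_train, x_query, k=3):
    """Local mean k-NN: classify by the nearest per-class local mean."""
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # Distances from the query to every training point of class c.
        d = np.linalg.norm(Xc - x_query, axis=1)
        # Local mean of the k nearest class-c points (fewer if the class is small).
        local_mean = Xc[np.argsort(d)[:min(k, len(Xc))]].mean(axis=0)
        dist = np.linalg.norm(local_mean - x_query)
        if dist < best_dist:
            best_class, best_dist = c, dist
    return best_class
```
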
6. Sergioli, Giuseppe, Enrica Santucci, Luca Didaci, Jarosław A. Miszczak, and Roberto Giuntini. "A quantum-inspired version of the nearest mean classifier." Soft Computing 22, no. 3 (2017): 691–705. http://dx.doi.org/10.1007/s00500-016-2478-2.

7. Gou, Jianping, Wenmo Qiu, Zhang Yi, Yong Xu, Qirong Mao, and Yongzhao Zhan. "A Local Mean Representation-based K-Nearest Neighbor Classifier." ACM Transactions on Intelligent Systems and Technology 10, no. 3 (2019): 1–25. http://dx.doi.org/10.1145/3319532.

8. Gou, J., Z. Yi, L. Du, and T. Xiong. "A Local Mean-Based k-Nearest Centroid Neighbor Classifier." Computer Journal 55, no. 9 (2012): 1058–71. http://dx.doi.org/10.1093/comjnl/bxr131.

9. Gou, Jianping, Hongxing Ma, Weihua Ou, Shaoning Zeng, Yunbo Rao, and Hebiao Yang. "A generalized mean distance-based k-nearest neighbor classifier." Expert Systems with Applications 115 (January 2019): 356–72. http://dx.doi.org/10.1016/j.eswa.2018.08.021.

10. Masparudin, Masparudin, Abdullah Abdullah, and Usman Usman. "SISTEM PREDIKSI KUALITAS SANTAN KELAPA MENGGUNAKAN NEAREST MEAN CLASSIFIER (NMC)" [Coconut Milk Quality Prediction System Using the Nearest Mean Classifier (NMC)]. SISTEMASI 9, no. 3 (2020): 646. http://dx.doi.org/10.32520/stmsi.v9i3.1015.


Books on the topic "Nearest Mean Classifier"

1. Baillo, Amparo, Antonio Cuevas, and Ricardo Fraiman. Classification methods for functional data. Edited by Frédéric Ferraty and Yves Romain. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199568444.013.10.

Abstract: This article reviews the literature concerning supervised and unsupervised classification of functional data. It first explains the meaning of unsupervised classification vs. supervised classification before discussing the supervised classification problem in the infinite-dimensional case, showing that its formal statement generally coincides with that of discriminant analysis in the classical multivariate case. It then considers the optimal classifier and plug-in rules, empirical risk and empirical minimization rules, linear discrimination rules, the k nearest neighbor (k-NN) method, and kern…

Book chapters on the topic "Nearest Mean Classifier"

1. Skurichina, Marina, Liudmila I. Kuncheva, and Robert P. W. Duin. "Bagging and Boosting for the Nearest Mean Classifier: Effects of Sample Size on Diversity and Accuracy." In Multiple Classifier Systems. Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-45428-4_6.

2. Loog, Marco. "Constrained Parameter Estimation for Semi-supervised Learning: The Case of the Nearest Mean Classifier." In Machine Learning and Knowledge Discovery in Databases. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15883-4_19.

3. Alhammami, Muhammad, Chee Pun Ooi, and Wooi-Haw Tan. "Violence Recognition Using Harmonic Mean of Distances and Relational Velocity with K-Nearest Neighbour Classifier." In Advances in Visual Informatics. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25939-0_12.

4. Górecki, Tomasz, Maciej Łuczak, and Paweł Piasecki. "Similarity Forest for Time Series Classification." In Studies in Classification, Data Analysis, and Knowledge Organization. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-09034-9_19.

Abstract: The idea of similarity forest comes from Sathe and Aggarwal (Similarity forests, pp. 395–403, 2017, [1]) and is derived from random forest. Random forests, over their already 20 years of existence, have proved to be among the most excellent methods, showing top performance across a vast array of domains while preserving simplicity and time efficiency, and still being interpretable. However, their usage is limited to multidimensional data. Similarity forest does not require such a representation: it only needs the similarities between observations to be computable. Thus, it may be applied to data for which a multidimensional representation is not available. In this paper, we propose an implementation of similarity forest for time series classification. We investigate two distance measures, Euclidean and dynamic time warping (DTW), as the underlying measure for the algorithm. We compare the performance of similarity forest with 1-nearest neighbor and random forest on the UCR (University of California, Riverside) benchmark database. We show that similarity forest with DTW, taking into account mean ranks, outperforms the other classifiers. The comparison is enriched with statistical analysis.
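
The 1-nearest-neighbor baseline this abstract compares against, paired with DTW, is itself a standard time-series classifier. Below is a sketch assuming textbook DTW with an unconstrained warping window; it is an illustration, not the authors' implementation (full UCR runs would also want windowed or lower-bounded speed-ups).

```python
import numpy as np

def dtw(a, b):
    """Dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible alignments.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def one_nn_dtw(train_series, train_labels, query):
    """Classify a series by the label of its DTW-nearest training series."""
    dists = [dtw(s, query) for s in train_series]
    return train_labels[int(np.argmin(dists))]
```
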
5. Yi, Jingrui, Qiang Rao, Qi Zhang, Junbao Zhang, Liya Song, and Hao Wu. "A Preliminary Study on the Effectiveness of K-Nearest Neighbor Algorithm Based on Local Mean for Classification of Bioinformatics Data." In Studies in Health Technology and Informatics. IOS Press, 2023. http://dx.doi.org/10.3233/shti230867.

Abstract: The preliminary classification of biological class data is of great importance for bioinformatics. One can quickly classify object data by comparing their existing features with known traits. The k-nearest neighbor algorithm is easy to apply in this field, but its drawbacks mean that simply changing the distance model does little to improve the algorithm's efficiency, so this study uses a local mean-based k-nearest neighbor classifier and compares the accuracy of the predicted classification across six different distance models. The prediction accuracies in the experimental results were all greater than 70%, and for all distance models the highest accuracy on the different data sets was achieved with K=2; the prediction accuracy of the Minkowski distance with different parameters showed the highest volatility in the test. The experimental results can serve as a reference for related practitioners.
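
For reference, the Minkowski distance whose parameter drives the reported volatility is the p-parameterized family sketched below (p=1 gives Manhattan, p=2 Euclidean). As an illustration, it can replace the Euclidean norm in the local-mean k-NN code shown under journal article 5 above; the function name is an assumption, not from the paper.

```python
import numpy as np

def minkowski(a, b, p=2.0):
    """Minkowski distance between vectors a and b; p tunes the norm."""
    return float(np.sum(np.abs(a - b) ** p) ** (1.0 / p))
```
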
6. Aggarwal, Ritu, and Suneet Kumar. "Missing Value Imputation and Estimation Methods for Arrhythmia Feature Selection Classification Using Machine Learning Algorithms." In Machine Learning Methods for Engineering Application Development. BENTHAM SCIENCE PUBLISHERS, 2022. http://dx.doi.org/10.2174/9879815079180122010013.

Abstract: Classifying cardiac arrhythmia from electrocardiogram (ECG) signals with machine learning methods is very difficult. ECG datasets normally come with multiple missing values, caused by faults or distortion. When performing data mining, missing value imputation is the biggest task in data preprocessing: incomplete medical datasets would result if the cases with missing values were simply removed from the original database. To produce a good quality dataset for better analysis of clinical trials, a suitable missing value imputation method is used. In this paper, we explore different machine-learning techniques for computing missing values in the electrocardiogram dataset. To estimate the missing values, the collected data contains feature dimensions with their attributes. The experiments to compute the missing values in the dataset are carried out using four feature selection methods and imputation methods. The results are shown for combined features using IG (information gain) and GA (genetic algorithm) with different machine learning classifiers: NB (naïve Bayes), KNN (k-nearest neighbor), MLP (multilayer perceptron), and RF (random forest). GA and IG are the best suited methods for obtaining results on lower-dimensional datasets with RMSE (root mean square error), which efficiently calculates the best results for missing values. These four classifiers are used to analyze the impact of the imputation methods. The best results for missing rates of 10% to 40% are obtained by NB: 0.657, 0.6541, 0.66, 0.657, and 0.657, as computed by RMSE. This means that the error is reduced most efficiently by the naïve Bayes classifier.
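
The two computations this abstract turns on, mean imputation of missing entries and the RMSE used to score imputations against ground truth, are short enough to sketch. This is a generic illustration, not the chapter's actual pipeline (which adds feature selection via IG and GA); the names are illustrative.

```python
import numpy as np

def mean_impute(X):
    """Replace each NaN with the observed mean of its column."""
    X = X.copy()
    col_means = np.nanmean(X, axis=0)   # per-feature means, ignoring NaNs
    rows, cols = np.where(np.isnan(X))
    X[rows, cols] = col_means[cols]     # fill every hole with its column mean
    return X

def rmse(estimated, true):
    """Root mean square error between imputed and ground-truth values."""
    return float(np.sqrt(np.mean((estimated - true) ** 2)))
```
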
7. Tyagi, Rinki, and Geetanjali. "RMSE Computation and Detection of Ring P. Falciparum." In Artificial Intelligence and Communication Technologies. Soft Computing Research Society, 2023. http://dx.doi.org/10.52458/978-81-955020-5-9-76.

Abstract: Malaria is a life-threatening disease caused by a protozoan parasite. Five species cause malaria: P. falciparum, P. vivax, P. malariae, P. ovale, and P. knowlesi. Common symptoms include fever, fatigue, headache, cough, abdominal pain, and nausea; routes of transmission include blood transfusion, maternal-foetal transmission, infected needles, and the bite of the female Anopheles mosquito. Earlier methods such as malaria microscopy and the RDT (Rapid Diagnostic Test) were very costly, so after their failure computer vision techniques such as image processing came into the limelight. Microscopic images of blood samples were processed using techniques such as image preprocessing, feature extraction, and segmentation, which were very popular in the last few years; nowadays, machine learning classification models such as the naïve Bayesian classifier, support vector machine classifier, k-nearest neighbor, and many more are very successful in detecting the malaria parasite. In this paper we detect the malaria parasite, and identify which parasite it is, from a pre-processed image using region filling, canny edge detection for smoothing of edges, and watershed segmentation using the distance transform; finally we predict the RMSE (Root Mean Square Error) using an SVM classifier. Essentially, this paper tries to decrease the RMSE while detecting the malaria parasite; we predict the error using an SVM classifier and an ensemble classifier, and compare which one is better using regression analysis.
8. Viswavardhan, Reddy K., B. Hemapriya, B. Roja Reddy, and B. S. Premananda. "Performance Evaluation of ML Algorithms for Disease Prediction Using DWT and EMD Techniques." In Computational Intelligence and Machine Learning Approaches in Biomedical Engineering and Health Care Systems. BENTHAM SCIENCE PUBLISHERS, 2022. http://dx.doi.org/10.2174/9781681089553122010009.

Abstract: Information and communication technology usage in the healthcare sector is not yet widespread, owing to various challenges alongside increased healthcare needs. With the outburst of COVID-19, when different countries announced lockdown and social distancing rules, it became crucial to predict a person's symptoms to aid early diagnosis. In such situations, tremendous growth has been seen in the usage of various technologies, such as remote health monitoring, Wireless Body Area Networks (WBANs), Machine Learning (ML), and Decision Support Systems (DSS). Hence, the chapter focuses on detecting diseases and associated symptoms using various ML algorithms. Data from a total of 3073 patients (heartbeat, snore, and body temperature) were collected. The collected data were preprocessed to remove empty cells and zero values by replacing them with the mean of the cells. The extracted features were then used in Empirical Mode Decomposition (EMD) and Discrete Wavelet Transformation (DWT). Optimized algorithms with threshold values were identified by consulting doctors for accurate disease prediction. The testing performance of various ML algorithms was compared: Decision Tree Classifier (DTC), K-Nearest Neighbor (KNN), Stochastic Gradient Descent (SGD), Naive Bayes (NB), Multilayer Perceptron (MLP), Support Vector Machine (SVM), and Random Forest (RF). Performance evaluation parameters are accuracy, precision, F1 score, and recall. The results showed an average of 100% accuracy with SGD and SVM with DWT, whereas with EMD, SVM and MLP outperformed the state-of-the-art algorithms with 99.83% accuracy.
9. Suchindran, Pavithra, Vanithamani R., and Judith Justin. "Breast Cancer Detection Using Random Forest Classifier." In Research Anthology on Medical Informatics in Breast and Cervical Cancer. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-6684-7136-4.ch030.

Abstract: Breast cancer is the second most prevalent type of cancer among women. Breast ultrasound (BUS) imaging is one of the most frequently used diagnostic tools to detect and classify abnormalities in the breast. To improve the diagnostic accuracy, a computer-aided diagnosis (CAD) system is helpful for breast cancer detection and classification. Normally, a CAD system consists of four stages: pre-processing, segmentation, feature extraction, and classification. In this chapter, the pre-processing step includes speckle noise removal using a speckle reducing anisotropic diffusion (SRAD) filter. The goal of segmentation is to locate the region of interest (ROI); active contour-based segmentation and fuzzy C means (FCM) segmentation are used in this work. The texture features are extracted and fed to a classifier to categorize the images as normal, benign, or malignant. In this work, three classifiers, namely the k-nearest neighbors (KNN) algorithm, decision tree algorithm, and random forest classifier, are used, and their performance is compared based on classification accuracy.
10. Suchindran, Pavithra, Vanithamani R., and Judith Justin. "Breast Cancer Detection Using Random Forest Classifier." In Handbook of Research on Deep Learning-Based Image Analysis Under Constrained and Unconstrained Environments. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-6690-9.ch005.

Abstract: Identical to entry 9 above; this is an earlier publication of the same chapter.

Conference papers on the topic "Nearest Mean Classifier"

1. Chen, Siying. "A Functional Data Classifier Based on Bonferroni Mean Fuzzy K-Nearest Centroid Neighbor." In 2024 6th International Conference on Communications, Information System and Computer Engineering (CISCE). IEEE, 2024. http://dx.doi.org/10.1109/cisce62493.2024.10653437.

2. Mukahar, Nordiana, and Bakhtiar Affendi Rosdi. "Local Mean k-General Nearest Neighbor Classifier." In ICSCA 2021: 2021 10th International Conference on Software and Computer Applications. ACM, 2021. http://dx.doi.org/10.1145/3457784.3457828.

3. Liu, Qingfeng, Ajit Puthenputhussery, and Chengjun Liu. "Novel general KNN classifier and general nearest mean classifier for visual classification." In 2015 IEEE International Conference on Image Processing (ICIP). IEEE, 2015. http://dx.doi.org/10.1109/icip.2015.7351113.

4. Jiang, Chunyu, Dian Liu, and Cheng Ma. "Interval-Valued Data Based Local Mean K-Nearest Neighbors Classifier." In 2023 IEEE 6th International Conference on Automation, Electronics and Electrical Engineering (AUTEEE). IEEE, 2023. http://dx.doi.org/10.1109/auteee60196.2023.10407827.

5. Liu, Fangfei, Chenqing He, and Zhiming Chang. "An Improved Local Weighted Mean-Based k-Nearest Neighbor Classifier." In 2023 IEEE 6th International Conference on Automation, Electronics and Electrical Engineering (AUTEEE). IEEE, 2023. http://dx.doi.org/10.1109/auteee60196.2023.10408698.

6. Sun, Guohao, Yana Sun, and Fenfen Luo. "Generalized Minkowski Distance-Based Local Mean k-Nearest Neighbor Classifier." In 2023 IEEE International Conference on Image Processing and Computer Applications (ICIPCA). IEEE, 2023. http://dx.doi.org/10.1109/icipca59209.2023.10257763.

7. Ma, Hongxing, Jianping Gou, Weihua Ou, Shaoning Zeng, Yunbo Rao, and Hebiao Yang. "A new nearest neighbor classifier based on multi-harmonic mean distances." In 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC). IEEE, 2017. http://dx.doi.org/10.1109/spac.2017.8304246.

8. Wang, Enguang, Xiaojing Fu, and Tianyue Jiang. "Local Mean k-Nearest Neighbor Classifier with Gaussian-Kernel Distance Function." In 2023 IEEE International Conference on Image Processing and Computer Applications (ICIPCA). IEEE, 2023. http://dx.doi.org/10.1109/icipca59209.2023.10257845.

9. Tran, Tuan Minh, Xuan-May Thi Le, Vo Thanh Vinh, Hien T. Nguyen, and Tuan M. Nguyen. "A Weighted Local Mean-Based k-Nearest Neighbors Classifier for Time Series." In ICMLC 2017: 2017 the 9th International Conference on Machine Learning and Computing. ACM, 2017. http://dx.doi.org/10.1145/3055635.3056594.

10. Lei, Xin, Sichang Yang, and Longsheng Zhou. "Extensions of Local-Mean K-Nearest Neighbor Classifier with Various Distance Metrics." In 2023 IEEE 5th International Conference on Power, Intelligent Computing and Systems (ICPICS). IEEE, 2023. http://dx.doi.org/10.1109/icpics58376.2023.10235730.
