
Journal articles on the topic 'Nearest neighbors classifier'

Consult the top 50 journal articles for your research on the topic 'Nearest neighbors classifier'.

1

Mehta, Sumet, Xiangjun Shen, Jiangping Gou, and Dejiao Niu. "A New Nearest Centroid Neighbor Classifier Based on K Local Means Using Harmonic Mean Distance." Information 9, no. 9 (2018): 234. http://dx.doi.org/10.3390/info9090234.

Abstract:
The K-nearest neighbour classifier is a very effective and simple non-parametric technique in pattern classification; however, it considers only distance closeness, not the geometrical placement of the k neighbors. Its classification performance is also highly influenced by the neighborhood size k and by outliers. In this paper, we propose a new local mean based k-harmonic nearest centroid neighbor (LMKHNCN) classifier in order to consider both distance-based proximity and the spatial distribution of the k neighbors. In our method, firstly the k nearest centroid neighbors in each class …
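The local-mean, harmonic-distance idea described in this abstract can be sketched roughly as follows. This is a hedged illustration of the general scheme (cumulative local means per class, scored by the harmonic mean of their distances to the query), not the authors' LMKHNCN implementation; the function name and parameters are invented for illustration.

```python
import math

def lm_khncn_predict(X, y, query, k=3):
    """Sketch: for each class, average the query's k nearest members into
    cumulative local mean vectors, then score the class by the harmonic
    mean of the distances from the query to those means."""
    best_class, best_score = None, float("inf")
    for c in set(y):
        members = [x for x, label in zip(X, y) if label == c]
        members.sort(key=lambda m: math.dist(m, query))
        nearest = members[:k]
        # cumulative local means: mean of the 1, 2, ..., k nearest members
        dim = len(query)
        means = [
            tuple(sum(m[d] for m in nearest[:i]) / i for d in range(dim))
            for i in range(1, len(nearest) + 1)
        ]
        dists = [math.dist(m, query) for m in means]
        # harmonic mean emphasizes the closest local mean
        hm = len(dists) / sum(1.0 / max(d, 1e-12) for d in dists)
        if hm < best_score:
            best_class, best_score = c, hm
    return best_class
```

The harmonic mean is dominated by the smallest distance, which is one way a classifier can be made less sensitive to a single far-away neighbor.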
2

Widyadhana, Arya, Cornelius Bagus Purnama Putra, Rarasmaya Indraswari, and Agus Zainal Arifin. "A Bonferroni Mean Based Fuzzy K Nearest Centroid Neighbor Classifier." Jurnal Ilmu Komputer dan Informasi 14, no. 1 (2021): 65–71. http://dx.doi.org/10.21609/jiki.v14i1.959.

Abstract:
K-nearest neighbor (KNN) is an effective nonparametric classifier that determines the neighbors of a point based only on distance proximity. The classification performance of KNN is disadvantaged by the presence of outliers in small sample size datasets, and it deteriorates on datasets with class imbalance. We propose a local Bonferroni Mean based Fuzzy K-Nearest Centroid Neighbor (BM-FKNCN) classifier that assigns the class label of a query sample based on the nearest local centroid mean vector, to better represent the underlying statistics of the dataset. The proposed classifier is …
3

Shaul, Hayim, Dan Feldman, and Daniela Rus. "Secure k-ish Nearest Neighbors Classifier." Proceedings on Privacy Enhancing Technologies 2020, no. 3 (2020): 42–61. http://dx.doi.org/10.2478/popets-2020-0045.

Abstract:
The k-nearest neighbors (kNN) classifier predicts the class of a query, q, by taking the majority class of its k neighbors in an existing (already classified) database, S. In secure kNN, q and S are owned by two different parties and q is classified without sharing data. In this work we present a classifier based on kNN that is more efficient to implement with homomorphic encryption (HE). The efficiency of our classifier comes from a relaxation we make to consider κ nearest neighbors for κ ≈ k, with probability that increases as the statistical distance between Gaussian and the distribution …
4

Onyezewe, Anozie, Armand F. Kana, Fatimah B. Abdullahi, and Aminu O. Abdulsalami. "An Enhanced Adaptive k-Nearest Neighbor Classifier Using Simulated Annealing." International Journal of Intelligent Systems and Applications 13, no. 1 (2021): 34–44. http://dx.doi.org/10.5815/ijisa.2021.01.03.

Abstract:
The k-Nearest Neighbor classifier is a simple and widely applied data classification algorithm that does well in real-world applications. Its overall classification accuracy largely depends on the choice of the number of nearest neighbors (k). A constant k value does not always yield the best solutions, especially for real-world datasets with irregular class and density distributions of data points, as it totally ignores the class and density distribution of a test point's k-environment or neighborhood. A resolution to this problem is to dynamically …
5

Mendes Júnior, Pedro R., Roberto M. de Souza, Rafael de O. Werneck, et al. "Nearest neighbors distance ratio open-set classifier." Machine Learning 106, no. 3 (2016): 359–86. http://dx.doi.org/10.1007/s10994-016-5610-8.

6

Puchkin, Nikita, and Vladimir Spokoiny. "An adaptive multiclass nearest neighbor classifier." ESAIM: Probability and Statistics 24 (2020): 69–99. http://dx.doi.org/10.1051/ps/2019021.

Abstract:
We consider a problem of multiclass classification, where the training sample S_n = {(X_i, Y_i)}_{i=1}^{n} is generated from the model P(Y = m | X = x) = η_m(x), 1 ≤ m ≤ M, and η_1(x), …, η_M(x) are unknown α-Hölder continuous functions. Given a test point X, our goal is to predict its label. A widely used k-nearest-neighbors classifier constructs estimates of η_1(X), …, η_M(X) and uses a plug-in rule for the prediction. However, it requires a proper choice of the smoothing parameter k, which may become tricky in some situations. We fix several integers n_1, …, n_K, compute corresponding n_k-nearest-neighbor estimates …
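The plug-in rule mentioned in this abstract estimates each η_m(X) by the class frequency among the k nearest neighbors and predicts the argmax. A minimal sketch of that estimator (not the paper's adaptive aggregation over several k values; the function name is invented):

```python
import math
from collections import Counter

def knn_class_probs(X, y, query, k):
    """Plug-in estimate of eta_m(query): the fraction of each class among
    the k nearest neighbors of the query."""
    order = sorted(range(len(X)), key=lambda i: math.dist(X[i], query))
    counts = Counter(y[i] for i in order[:k])
    return {c: counts.get(c, 0) / k for c in set(y)}
```

Prediction is then `max(probs, key=probs.get)`; the adaptive scheme in the paper addresses how to choose k when one fixed value is unreliable.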
7

MITICHE, A., and J. K. AGGARWAL. "PATTERN CATEGORY ASSIGNMENT BY NEURAL NETWORKS AND NEAREST NEIGHBORS RULE: A SYNOPSIS AND A CHARACTERIZATION." International Journal of Pattern Recognition and Artificial Intelligence 10, no. 05 (1996): 393–408. http://dx.doi.org/10.1142/s0218001496000268.

Abstract:
The purpose of this paper is two-fold: to give a synoptic description of favored neural networks and to characterize the potency of these neural networks as pattern classifiers, against the background of the familiar nearest neighbors classification. We limit the study to the neural network structures most commonly used for pattern classification: the multilayer perceptron, the Kohonen associative memory, and the Carpenter–Grossberg clustering network, for which we give a tutorial description with the aim of making the driving concepts apparent. The nearest neighbors rule is presented with …
8

Roh, Seok-Beom, and Tae-Chon Ahn. "Design of Lazy Classifier based on Fuzzy k-Nearest Neighbors and Reconstruction Error." Journal of Korean Institute of Intelligent Systems 20, no. 1 (2010): 101–8. http://dx.doi.org/10.5391/jkiis.2010.20.1.101.

9

WEN, Zhi-qiang, Yong-xiang HU, and Wen-qiu ZHU. "k-nearest neighbors classifier over manifolds." Journal of Computer Applications 32, no. 12 (2013): 3311–14. http://dx.doi.org/10.3724/sp.j.1087.2012.03311.

10

Ati, Indri, and Ari Kusyanti. "Metode Ensemble Classifier untuk Mendeteksi Jenis Attention Deficit Hyperactivity Disorder (SDHD) pada Anak Usia Dini." Jurnal Teknologi Informasi dan Ilmu Komputer 6, no. 3 (2019): 301. http://dx.doi.org/10.25126/jtiik.2019631313.

Abstract:
In early development, some children experience difficulties such as being unable to sit still, to concentrate, and to control their behavior; when a child has an attention deficit and struggles to control appropriate behavior, this is referred to as ADHD (Attention Deficit Hyperactivity Disorder). This is a serious problem, because children with ADHD experience social and emotional behavioral problems and learning difficulties at school, which affect their development into adulthood. It is therefore necessary to identify …
11

Kiss, Tamás, Stephen Morairty, Michael Schwartz, Thomas Kilduff, Derek Buhl, and Dmitri Volfson. "Automated Sleep Stage Scoring Using k-Nearest Neighbors Classifier." Journal of Open Source Software 5, no. 53 (2020): 2377. http://dx.doi.org/10.21105/joss.02377.

12

Anthony, Martin, and Joel Ratsaby. "A hybrid classifier based on boxes and nearest neighbors." Discrete Applied Mathematics 172 (July 2014): 1–11. http://dx.doi.org/10.1016/j.dam.2014.02.018.

13

LUO, FULIN, JIAMIN LIU, HONG HUANG, and YUMEI LIU. "HYPERSPECTRAL IMAGE CLASSIFICATION USING LOCAL SPECTRAL ANGLE-BASED MANIFOLD LEARNING." International Journal of Pattern Recognition and Artificial Intelligence 28, no. 06 (2014): 1450016. http://dx.doi.org/10.1142/s0218001414500165.

Abstract:
Locally linear embedding (LLE) depends on the Euclidean distance (ED) to select the k nearest neighbors. However, the ED may not reflect the actual geometric structure of the data, which can lead to the selection of ineffective neighbors. The aim of our work is to make full use of the local spectral angle (LSA) to find proper neighbors for dimensionality reduction (DR) and classification of hyperspectral remote sensing data. At first, we propose an improved LLE method, called local spectral angle LLE (LSA-LLE), for DR. It uses the ED of the data to obtain large-scale neighbors, then utilizes the spectral …
14

Taha Chicho, Bahzad, Adnan Mohsin Abdulazeez, Diyar Qader Zeebaree, and Dilovan Assad Zebari. "Machine Learning Classifiers Based Classification For IRIS Recognition." Qubahan Academic Journal 1, no. 2 (2021): 106–18. http://dx.doi.org/10.48161/qaj.v1n2a48.

Abstract:
Classification is the most widely applied machine learning problem today, with implementations in face recognition, flower classification, clustering, and other fields. The goal of this paper is to organize and identify a set of data objects. The study employs K-nearest neighbors, decision tree (J48), and random forest algorithms, and then compares their performance on the IRIS dataset. The comparison showed that K-nearest neighbors outperformed the other classifiers, and that the random forest classifier worked better than the decision tree (J48). Finally, the best …
15

Camacho-Urriolagoitia, Oscar, Itzamá López-Yáñez, Yenny Villuendas-Rey, Oscar Camacho-Nieto, and Cornelio Yáñez-Márquez. "Dynamic Nearest Neighbor: An Improved Machine Learning Classifier and Its Application in Finances." Applied Sciences 11, no. 19 (2021): 8884. http://dx.doi.org/10.3390/app11198884.

Abstract:
The presence of machine learning, data mining, and related disciplines is increasingly evident in everyday environments. Support for applying learning techniques to topics related to economic risk assessment, among other financial topics of interest, is relevant to us as human beings. This paper proposes a new supervised learning algorithm, called D1-NN (Dynamic 1-Nearest Neighbor), and applies it to real-world datasets related to finance. The D1-NN performance is competitive against the main state-of-the-art algorithms in solving finance-related …
16

Sokolov, Anton, Egor Dmitriev, Cyril Gengembre, and Hervé Delbarre. "Automated Classification of Regional Meteorological Events in a Coastal Area Using In Situ Measurements." Journal of Atmospheric and Oceanic Technology 37, no. 4 (2020): 723–39. http://dx.doi.org/10.1175/jtech-d-19-0120.1.

Abstract:
The problem of classifying atmospheric meteorological events, such as sea breezes, fogs, and high winds, in coastal areas is considered. In situ wind, temperature, humidity, pressure, radiance, and turbulence meteorological measurements are used as predictors. Local atmospheric events of 2013–14 were analyzed and classified manually using data from the measurement campaign in the coastal area of the English Channel in Dunkirk, France. The results of that categorization allowed the training of a few supervised classification algorithms using the data of an ultrasonic anemometer as predictors …
17

Mansor, Muhammad Naufal, Ahmad Kadri Junoh, Wan Suhana Wan Daud, Wan Zuki Azman Wan Muhamad, and Azrini Idris. "Nonlinear Fuzzy Robust PCA Algorithm for Pain Decision Support System." Advanced Materials Research 1016 (August 2014): 785–89. http://dx.doi.org/10.4028/www.scientific.net/amr.1016.785.

Abstract:
This paper describes the localization of pain events in infant face images using a feature extraction algorithm. Nonlinear Fuzzy Robust PCA (NFRPCA) feature extraction is implemented to test its effectiveness in recognizing pain in images. In this work, two classifiers, fuzzy k-nearest neighbors (fuzzy k-NN) and k-nearest neighbors (k-NN), are employed. Results show that NFRPCA with either classifier (fuzzy k-NN or k-NN) can be used for the recognition of infant pain images, with a best accuracy of 89.77%.
18

Wei, Jiangshu, Xiangjun Qi, and Mantao Wang. "Collaborative Representation Classifier Based on K Nearest Neighbors for Classification." Journal of Software Engineering 9, no. 1 (2014): 96–104. http://dx.doi.org/10.3923/jse.2015.96.104.

19

Akbulut, Yaman, Abdulkadir Sengur, Yanhui Guo, and Florentin Smarandache. "NS-k-NN: Neutrosophic Set-Based k-Nearest Neighbors Classifier." Symmetry 9, no. 9 (2017): 179. http://dx.doi.org/10.3390/sym9090179.

20

Na, Junying, Zhiping Wang, Siqi Lv, and Zhaohui Xu. "An Extended K Nearest Neighbors-Based Classifier for Epilepsy Diagnosis." IEEE Access 9 (2021): 73910–23. http://dx.doi.org/10.1109/access.2021.3081767.

21

Mukahar, Nordiana, and Bakhtiar Affendi Rosdi. "Interval valued fuzzy sets k-nearest neighbors classifier for finger vein recognition." Journal of Physics: Conference Series 890 (September 2017): 012069. http://dx.doi.org/10.1088/1742-6596/890/1/012069.

22

Glybovets, M. M., K. V. Salata, and N. A. Tkach. "Construction of diagnostic expert-medical system using neural networks." PROBLEMS IN PROGRAMMING, no. 2-3 (September 2020): 384–91. http://dx.doi.org/10.15407/pp2020.02-03.384.

Abstract:
The article discusses methods (decision trees, deep learning algorithms, k-nearest neighbors, neural networks) for creating diagnostic expert medical systems. For the practical part, a diagnostic API was developed based on chosen classifiers implementing these algorithms, and a study of their behavior was conducted: classifiers based on neural networks, decision trees, and the k-nearest neighbors method were compared, and the parameters of the selected classifier were optimized. As a result, the parameters on which the data were researched were selected. In addition, the dataset of information on patients …
23

Waykar, Sanjay B., and C. R. Bharathi. "Multimodal Features and Probability Extended Nearest Neighbor Classification for Content-Based Lecture Video Retrieval." Journal of Intelligent Systems 26, no. 3 (2017): 585–99. http://dx.doi.org/10.1515/jisys-2016-0041.

Abstract:
Due to the ever-increasing number of digital lecture libraries and lecture video portals, the challenge of retrieving lecture videos has become a very significant and demanding task in recent years. Accordingly, the literature presents different techniques for video retrieval that consider video contents as well as signal data. Here, we propose a lecture video retrieval system using multimodal features and probability extended nearest neighbor (PENN) classification. Two modalities are utilized for feature extraction. One is textual information, which is determined from the lecture …
24

K, Venkatachalam, and Karthikeyan NK. "Effective Feature Set Selection and Centroid Classifier Algorithm for Web Services Discovery." Indonesian Journal of Electrical Engineering and Computer Science 5, no. 2 (2017): 441. http://dx.doi.org/10.11591/ijeecs.v5.i2.pp441-450.

Abstract:
Text preprocessing and document classification play a vital role in web services discovery. Nearest centroid classifiers are mostly employed in high-dimensional applications, including genomics. Feature selection is a major problem in all classifiers, and in this paper we propose an effective feature selection procedure followed by web services discovery through a centroid classifier algorithm. The task in this problem statement is to effectively assign a document to one or more classes. Despite being simple and robust, the centroid classifier is not effectively used for document …
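For reference, the nearest centroid rule at the heart of such classifiers fits in a few lines. This is an illustrative version of the general rule only, not the paper's feature-selection pipeline; the function name is invented.

```python
import math

def nearest_centroid_predict(train, labels, query):
    """Nearest centroid rule: represent each class by the mean of its
    training vectors and assign the query to the closest centroid."""
    best, best_d = None, float("inf")
    for c in set(labels):
        members = [x for x, l in zip(train, labels) if l == c]
        dim = len(query)
        centroid = tuple(sum(m[d] for m in members) / len(members) for d in range(dim))
        d = math.dist(centroid, query)
        if d < best_d:
            best, best_d = c, d
    return best
```

Unlike kNN, this rule stores only one vector per class, which is why centroid classifiers scale well to high-dimensional data.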
25

Saadatfar, Hamid, Samiyeh Khosravi, Javad Hassannataj Joloudari, Amir Mosavi, and Shahaboddin Shamshirband. "A New K-Nearest Neighbors Classifier for Big Data Based on Efficient Data Pruning." Mathematics 8, no. 2 (2020): 286. http://dx.doi.org/10.3390/math8020286.

Abstract:
The K-nearest neighbors (KNN) machine learning algorithm is a well-known non-parametric classification method. However, like other traditional data mining methods, applying it to big data comes with computational challenges. Indeed, KNN determines the class of a new sample based on the class of its nearest neighbors; however, identifying the neighbors in a large amount of data imposes a large computational cost, so that it is no longer feasible on a single computing machine. One of the proposed techniques to make classification methods applicable to large datasets is pruning. LC-KNN is an improved …
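The pruning idea can be illustrated as follows. This sketch substitutes a crude seed-based partition for the paper's clustering step, and all names are invented; the point is only that kNN searches a single partition instead of the whole training set.

```python
import math
from collections import Counter

def prune_then_knn(train, labels, query, k=3, n_buckets=2):
    """Partition the training set around a few seed points (a stand-in for
    a clustering step), then run plain kNN only inside the partition whose
    seed is nearest to the query, so far fewer distances are computed."""
    # pick evenly spaced seeds from the data sorted by first coordinate
    ordered = sorted(range(len(train)), key=lambda i: train[i][0])
    step = max(1, len(train) // n_buckets)
    seeds = [train[ordered[i]] for i in range(0, len(train), step)][:n_buckets]
    # assign every training point to its nearest seed
    buckets = {s: [] for s in seeds}
    for x, l in zip(train, labels):
        s = min(seeds, key=lambda c: math.dist(c, x))
        buckets[s].append((x, l))
    # search only the query's bucket
    home = min(seeds, key=lambda c: math.dist(c, query))
    cand = sorted(buckets[home], key=lambda xl: math.dist(xl[0], query))[:k]
    return Counter(l for _, l in cand).most_common(1)[0][0]
```

The trade-off, which the abstract's method addresses more carefully, is that a query near a partition boundary may have true neighbors in another bucket.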
26

TOUSSAINT, GODFRIED. "GEOMETRIC PROXIMITY GRAPHS FOR IMPROVING NEAREST NEIGHBOR METHODS IN INSTANCE-BASED LEARNING AND DATA MINING." International Journal of Computational Geometry & Applications 15, no. 02 (2005): 101–50. http://dx.doi.org/10.1142/s0218195905001622.

Abstract:
In the typical nonparametric approach to classification in instance-based learning and data mining, random data (the training set of patterns) are collected and used to design a decision rule (classifier). One of the best known such rules is the k-nearest-neighbor decision rule (also known as lazy learning), in which an unknown pattern is classified into the majority class among its k nearest neighbors in the training set. Several questions related to this rule have received considerable attention over the years, including the following: How can the storage of the training set …
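The k-nearest-neighbor decision rule described in this abstract, in minimal form (a textbook sketch, with invented names):

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify the query into the majority class among its k nearest
    training patterns, using Euclidean distance."""
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))
    return Counter(labels[i] for i in order[:k]).most_common(1)[0][0]
```

The storage and speed questions the survey raises arise precisely because this rule keeps the entire training set and computes a distance to every pattern at query time.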
27

AN, JIAN, XIAOLIN GUI, JIANWEI YANG, JINHUA JIANG, and LING QI. "SEMI-SUPERVISED LEARNING OF K-NEAREST NEIGHBORS USING A NEAREST-NEIGHBOR SELF-CONTAINED CRITERION IN FOR MOBILE-AWARE SERVICE." International Journal of Pattern Recognition and Artificial Intelligence 27, no. 05 (2013): 1351001. http://dx.doi.org/10.1142/s0218001413510014.

Abstract:
We propose a new K-nearest neighbor (KNN) algorithm based on a nearest-neighbor self-contained criterion (NNscKNN) that utilizes unlabeled data information. Our algorithm incorporates other discriminant information to train the KNN classifier. This new KNN scheme is also applied in a community detection algorithm for mobile-aware services: first, as the edges of networks, the social relations between mobile nodes are quantified with social network theory; second, we construct the mobile nodes' optimal path tree and calculate the similarity index of adjacent nodes; finally, the community …
28

Udovychenko, Yevhenii, Anton Popov, and Illya Chaikovsky. "Binary Classification of Heart Failures Using k-NN with Various Distance Metrics." International Journal of Electronics and Telecommunications 61, no. 4 (2015): 339–44. http://dx.doi.org/10.2478/eletel-2015-0044.

Abstract:
Magnetocardiography is a sensitive technique for measuring the low magnetic fields generated by heart functioning, used for the diagnostics of a large number of cardiovascular diseases. In this paper, the k-nearest neighbor (k-NN) technique is used for binary classification of myocardium current density distribution maps (CDDM) from patients with negative T-peak, male and female patients with microvessel (diffuse) abnormalities, and sportsmen, which are compared with normal control subjects. The number of neighbors for the k-NN classifier was selected to obtain the highest classification characteristics.
29

Zhou, Gang, Zhongjie Han, Jin Fu, Guan Hua Xu, and Chengjin Ye. "Iterative Online Fault Identification Scheme for High-Voltage Circuit Breaker Utilizing a Lost Data Repair Technique." Energies 13, no. 13 (2020): 3311. http://dx.doi.org/10.3390/en13133311.

Abstract:
Most prior-art electrical noninvasive monitoring systems adopt Zigbee, Bluetooth, or other wireless communication infrastructure. These low-cost channels are often interrupted by strong electromagnetic interference, resulting in monitoring anomalies, particularly packet loss, which severely affects the precision of equipment fault identification. In this paper, an iterative online fault identification framework for a high-voltage circuit breaker utilizing a novel lost data repair technique is developed to adapt to low data quality conditions. Specifically, the improved efficient k-nearest …
30

Abou-Moustafa, Karim, and Frank P. Ferrie. "Local generalized quadratic distance metrics: application to the k-nearest neighbors classifier." Advances in Data Analysis and Classification 12, no. 2 (2017): 341–63. http://dx.doi.org/10.1007/s11634-017-0286-x.

31

Pangestu, Budi. "COMPATIBILITY OF SELECTION OF STUDENT DEPARTMENTS USING k-NEAREST NEIGHBOR AND NAÏVE BAYES CLASSIFIER IN INFORMATICS PRIVATE VOCATIONAL SCHOOL, SERANG CITY." JISA(Jurnal Informatika dan Sains) 4, no. 1 (2021): 33–39. http://dx.doi.org/10.31326/jisa.v4i1.893.

Abstract:
The selection of a major by prospective students when registering at a school, especially a vocational high school, is very vulnerable, because prospective students often choose a major that is not their own wish, and because of the increasing emergence of new schools in the cities and districts of each province in Indonesia, especially in the province of Banten. Prospective students who choose the wrong department, or one not of their own choosing, tend to have unsatisfactory grades, or grades that fluctuate each semester, especially in their productive lessons …
32

Schwenk, Holger. "The Diabolo Classifier." Neural Computation 10, no. 8 (1998): 2175–200. http://dx.doi.org/10.1162/089976698300017025.

Abstract:
We present a new classification architecture based on autoassociative neural networks that are used to learn discriminant models of each class. The proposed architecture has several interesting properties with respect to other model-based classifiers like nearest neighbors or radial basis functions: it has a low computational complexity and uses a compact distributed representation of the models. The classifier is also well suited for the incorporation of a priori knowledge by means of a problem-specific distance measure. In particular, we will show that tangent distance (Simard, Le Cun, …
33

Anagnostopoulos, Theodoros, and Christos Skourlas. "Ensemble majority voting classifier for speech emotion recognition and prediction." Journal of Systems and Information Technology 16, no. 3 (2014): 222–32. http://dx.doi.org/10.1108/jsit-01-2014-0009.

Abstract:
Purpose – The purpose of this paper is to understand the emotional state of a human being by capturing the speech utterances used during common conversation. Human beings, besides being thinking creatures, are also sentimental and emotional organisms. There are six universal basic emotions plus a neutral emotion: happiness, surprise, fear, sadness, anger, disgust, and neutral. Design/methodology/approach – It is shown that, given enough acoustic evidence, the emotional state of a person can be classified by an ensemble majority voting classifier. The proposed ensemble classifier is constructed …
34

Blachnik, Marcin, and Mirosław Kordos. "Comparison of Instance Selection and Construction Methods with Various Classifiers." Applied Sciences 10, no. 11 (2020): 3933. http://dx.doi.org/10.3390/app10113933.

Abstract:
Instance selection and construction methods were originally designed to improve the performance of the k-nearest neighbors classifier by increasing its speed and improving the classification accuracy. These goals were achieved by eliminating redundant and noisy samples, thus reducing the size of the training set. In this paper, the performance of instance selection methods is investigated in terms of classification accuracy and reduction of training set size. The classification accuracy of the following classifiers is evaluated: decision trees, random forest, Naive Bayes, linear model, support …
35

Dananjaya, Damar, Indah Werdiningsih, and Rini Semiati. "Decision Support System for Classification of Early Childhood Diseases Using Principal Component Analysis and K-Nearest Neighbors Classifier." Journal of Information Systems Engineering and Business Intelligence 5, no. 1 (2019): 13. http://dx.doi.org/10.20473/jisebi.5.1.13-22.

Abstract:
Background: Data on early childhood disease collected in clinics has accumulated into big data. Those data can be used for the classification of early childhood diseases to help medical staff diagnose diseases that attack young children. Objective: This study aims to apply Principal Component Analysis (PCA) and a K-Nearest Neighbor (K-NN) classifier to the classification of early childhood diseases. Methods: Data analysis was performed using PCA to obtain variables that have a major influence on the classification of early childhood diseases. PCA was done by observing the correlation between variables …
36

Lai, Yinglei, Baolin Wu, and Hongyu Zhao. "A permutation test approach to the choice of size k for the nearest neighbors classifier." Journal of Applied Statistics 38, no. 10 (2011): 2289–302. http://dx.doi.org/10.1080/02664763.2010.547565.

37

Subbarao, M. Venkata, and P. Samundiswary. "K- Nearest Neighbors based Automatic Modulation Classifier for Next Generation Adaptive Radio Systems." International Journal of Security and Its Applications 13, no. 4 (2019): 41–50. http://dx.doi.org/10.33832/ijsia.2019.13.4.04.

38

Abuzaraida, Mustafa Ali, Mohammed Elmehrek, and Esam Elsomadi. "Online handwriting Arabic recognition system using k-nearest neighbors classifier and DCT features." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 4 (2021): 3584. http://dx.doi.org/10.11591/ijece.v11i4.pp3584-3592.

Abstract:
With advances in machine learning techniques, handwriting recognition systems have gained a great deal of importance. Lately, the increasing popularity of handheld computers, digital notebooks, and smartphones has given the field of online handwriting recognition more interest. In this paper, we propose an enhanced method for the recognition of Arabic handwritten words using a direction-based segmentation technique and discrete cosine transform (DCT) coefficients as structural features. The main contribution of this research is combining a total of 18 structural features which were extracted by …
39

Nai-Arun, Nongyao, and Punnee Sittidech. "Ensemble Learning Model for Diabetes Classification." Advanced Materials Research 931-932 (May 2014): 1427–31. http://dx.doi.org/10.4028/www.scientific.net/amr.931-932.1427.

Abstract:
This paper proposes data mining techniques to improve efficiency and reliability in diabetes classification. The real dataset, collected from Sawanpracharak Regional Hospital, Thailand, was first analyzed using gain-ratio feature selection techniques. Three well-known algorithms (naïve Bayes, k-nearest neighbors, and decision tree) were used to construct classification models on the selected features. Then, the popular ensemble learning methods bagging and boosting were applied using the three base classifiers. The results revealed that the best model with the highest accuracy was bagging with base …
40

Haque, Rakib Ul, A. S. M. Touhidul Hasan, Qingshan Jiang, and Qiang Qu. "Privacy-Preserving K-Nearest Neighbors Training over Blockchain-Based Encrypted Health Data." Electronics 9, no. 12 (2020): 2096. http://dx.doi.org/10.3390/electronics9122096.

Abstract:
Numerous works focus on the data privacy issue of the Internet of Things (IoT) when training a supervised machine learning (ML) classifier. Most of the existing solutions assume that the classifier's training data can be obtained securely from different IoT data providers. The primary concern is data privacy when training a K-Nearest Neighbor (K-NN) classifier with IoT data from various entities. This paper proposes secure K-NN, which provides privacy-preserving K-NN training over IoT data. It employs blockchain technology with a partial homomorphic cryptosystem (PHC) known as Paillier in order to …
41

Gbenga, Fadare Oluwaseun, Adetunmbi Adebayo Olusola, Oyinloye Oghenerukevwe Eloho, and Mogaji Stephen Alaba. "Towards Optimization of Malware Detection using Chi-square Feature Selection on Ensemble Classifiers." International Journal of Engineering and Advanced Technology 10, no. 4 (2021): 254–62. http://dx.doi.org/10.35940/ijeat.d2359.0410421.

Abstract:
The multiplication of malware variants is probably the greatest problem in PC security, and the protection of information in the form of source code against unauthorized access is a central issue in computer security. In recent times, machine learning has been extensively researched for malware detection, and ensemble techniques have been established to be highly effective in terms of detection accuracy. This paper proposes a framework that combines Chi-square as the feature selection method with eight ensemble learning classifiers on five base learners: K-Nearest Neighbors …
42

Gweon, Hyukjun, Matthias Schonlau, and Stefan H. Steiner. "The k conditional nearest neighbor algorithm for classification and class probability estimation." PeerJ Computer Science 5 (May 13, 2019): e194. http://dx.doi.org/10.7717/peerj-cs.194.

Abstract:
The k nearest neighbor (kNN) approach is a simple and effective nonparametric algorithm for classification. One of the drawbacks of kNN is that the method can only give coarse estimates of class probabilities, particularly for low values of k. To avoid this drawback, we propose a new nonparametric classification method based on nearest neighbors conditional on each class: the proposed approach calculates the distance between a new instance and the kth nearest neighbor from each class, estimates posterior probabilities of class membership using the distances, and assigns the instance to the class …
43

Taguelmimt, Redha, and Rachid Beghdad. "DS-kNN." International Journal of Information Security and Privacy 15, no. 2 (2021): 131–44. http://dx.doi.org/10.4018/ijisp.2021040107.

Full text
Abstract:
On one hand, there are many proposed intrusion detection systems (IDSs) in the literature. On the other hand, many studies try to deduce the important features that can best detect attacks. This paper presents a new, easy-to-implement approach to intrusion detection, named distance sum-based k-nearest neighbors (DS-kNN), which is an improved version of the k-NN classifier. Given a data sample to classify, DS-kNN computes the distance sum of the k nearest neighbors of the data sample in each of the possible classes of the dataset. Then, the data sample is assigned to the class having the smallest …
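A minimal sketch of the distance-sum rule as the abstract describes it (Euclidean distance and the per-class neighbour grouping are assumptions on my part):

```python
import math
from collections import defaultdict

def ds_knn_predict(X, y, query, k=3):
    """DS-kNN sketch: for each class, sum the distances of the k nearest
    neighbours of the query within that class, then pick the class with
    the smallest distance sum."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    sums = {c: sum(sorted(math.dist(query, p) for p in pts)[:k])
            for c, pts in by_class.items()}
    return min(sums, key=sums.get)
```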
APA, Harvard, Vancouver, ISO, and other styles
44

Skryjomski, Przemysław, Bartosz Krawczyk, and Alberto Cano. "Speeding up k-Nearest Neighbors classifier for large-scale multi-label learning on GPUs." Neurocomputing 354 (August 2019): 10–19. http://dx.doi.org/10.1016/j.neucom.2018.06.095.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

WANG, HUI. "SUBSEQUENCE COUNTING AS A MEASURE OF SIMILARITY FOR SEQUENCES." International Journal of Pattern Recognition and Artificial Intelligence 21, no. 04 (2007): 745–58. http://dx.doi.org/10.1142/s021800140700565x.

Full text
Abstract:
The longest common subsequence is a well-known and popular method for measuring similarity between sequences. It advocates the use of the information contained in the longest common subsequence as an indication of similarity. In this paper we consider the count of all common subsequences as a measure of sequence similarity, with the view that all common information is captured. This measure is inspired by and derived from a generic similarity measure, the neighborhood counting metric. The close connection of the neighborhood counting metric with probability and the Bayes classifier helps gain insight …
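Counting all common subsequences admits a simple dynamic program. The recurrence below (doubling on a character match, inclusion–exclusion otherwise, with the empty subsequence included) is one standard formulation, offered as a sketch rather than as the paper's exact definition:

```python
def acs(a, b):
    """Count of all common subsequences of a and b, empty subsequence
    included, via a standard O(len(a) * len(b)) dynamic program."""
    m, n = len(a), len(b)
    # N[i][j] = count for prefixes a[:i], b[:j]; empty prefixes share only "".
    N = [[1] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                N[i][j] = 2 * N[i - 1][j - 1]
            else:
                N[i][j] = N[i - 1][j] + N[i][j - 1] - N[i - 1][j - 1]
    return N[m][n]
```

For sequences without repeated symbols this equals the number of distinct common subsequences, e.g. `acs("abc", "abc")` counts all 8 subsequences of "abc".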
APA, Harvard, Vancouver, ISO, and other styles
46

Stiawan, Deris, Somame Morianus Daely, Ahmad Heryanto, Nurul Afifah, Mohd Yazid Idris, and Rahmat Budiarto. "Ransomware Detection Based On Opcode Behavior Using K-Nearest Neighbors Algorithm." Information Technology and Control 50, no. 3 (2021): 495–506. http://dx.doi.org/10.5755/j01.itc.50.3.25816.

Full text
Abstract:
Ransomware is a malware that represents a serious threat to a user’s information privacy. By investigating how ransomware works, we may be able to recognise its atomic behaviour. In return, we will be able to detect the ransomware at an earlier stage with better accuracy. In this paper, we propose the Control Flow Graph (CFG) as an opcode-behaviour extraction technique, combined with 4-grams (sequences of 4 “words”) to extract opcode sequences, to be incorporated into a Trojan ransomware detection method using the K-Nearest Neighbors (K-NN) algorithm. The opcode CFG 4-gram can fully represent the detailed behaviour …
APA, Harvard, Vancouver, ISO, and other styles
47

Galluzzi, Valerie, Ted Herman, D. J. Shumaker, et al. "Electronic Recognition of Hand Hygiene Technique and Duration." Infection Control & Hospital Epidemiology 35, no. 10 (2014): 1298–300. http://dx.doi.org/10.1086/678059.

Full text
Abstract:
We captured 3-dimensional accelerometry data from the wrists of 116 healthcare professionals as they performed hand hygiene (HH). We then used these data to train a k-nearest-neighbors classifier to recognize specific aspects of HH technique (ie, fingertip scrub) and measure the duration of HH events. Infect Control Hosp Epidemiol 2014;35(10):1298–1300.
APA, Harvard, Vancouver, ISO, and other styles
48

Kelati, Amleset, Hossam Gaber, Juha Plosila, and Hannu Tenhunen. "Implementation of non-intrusive appliances load monitoring (NIALM) on k-nearest neighbors (k-NN) classifier." AIMS Electronics and Electrical Engineering 4, no. 3 (2020): 326–44. http://dx.doi.org/10.3934/electreng.2020.3.326.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Maillo, Jesus, Salvador Garcia, Julian Luengo, Francisco Herrera, and Isaac Triguero. "Fast and Scalable Approaches to Accelerate the Fuzzy k-Nearest Neighbors Classifier for Big Data." IEEE Transactions on Fuzzy Systems 28, no. 5 (2020): 874–86. http://dx.doi.org/10.1109/tfuzz.2019.2936356.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Djellali, Hayet, Nacira Ghoualmi-Zine, and Souad Guessoum. "Hybrid adapted fast correlation FCBF-support vector machine recursive feature elimination for feature selection." Intelligent Decision Technologies 14, no. 3 (2020): 269–79. http://dx.doi.org/10.3233/idt-190014.

Full text
Abstract:
This paper investigates feature selection methods based on a hybrid architecture using a feature selection algorithm called Adapted Fast Correlation Based Feature selection with Support Vector Machine Recursive Feature Elimination (AFCBF-SVMRFE). The AFCBF-SVMRFE has three stages and is composed of the SVMRFE embedded method with Correlation-based Feature Selection. The first stage is relevance analysis, the second is redundancy analysis, and the third is a performance evaluation and feature restoration stage. Experiments show that the proposed method, tested on different classifiers: Support Vector Machine, …
APA, Harvard, Vancouver, ISO, and other styles