Academic literature on the topic 'Nearest neighbors classifier'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Nearest neighbors classifier.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Nearest neighbors classifier"

1

Mehta, Sumet, Xiangjun Shen, Jiangping Gou, and Dejiao Niu. "A New Nearest Centroid Neighbor Classifier Based on K Local Means Using Harmonic Mean Distance." Information 9, no. 9 (2018): 234. http://dx.doi.org/10.3390/info9090234.

Abstract:
The K-nearest neighbour classifier is a very effective and simple non-parametric technique in pattern classification; however, it only considers distance closeness, not the geometrical placement of the k neighbors. Also, its classification performance is highly influenced by the neighborhood size k and by existing outliers. In this paper, we propose a new local mean based k-harmonic nearest centroid neighbor (LMKHNCN) classifier in order to consider both distance-based proximity and the spatial distribution of the k neighbors. In our method, firstly the k nearest centroid neighbors in each cl
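The local-mean idea described in the abstract above can be illustrated with a toy implementation. This is a simplified sketch, not the authors' exact LMKHNCN (their nearest centroid neighbors are replaced here by plain nearest same-class samples), but it shows how cumulative local means and a harmonic mean of distances combine into a class score; all names and data are hypothetical.

```python
import numpy as np

def local_mean_harmonic_predict(X_train, y_train, x, k=3):
    """Score each class by the harmonic mean of the distances from x to
    the cumulative local means of its k nearest same-class samples."""
    best_class, best_score = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        d = np.linalg.norm(Xc - x, axis=1)
        kk = min(k, len(Xc))
        nearest = Xc[np.argsort(d)[:kk]]
        # local means of the 1..kk nearest same-class samples
        local_means = np.cumsum(nearest, axis=0) / np.arange(1, kk + 1)[:, None]
        dists = np.linalg.norm(local_means - x, axis=1)
        hmean = len(dists) / np.sum(1.0 / np.maximum(dists, 1e-12))
        if hmean < best_score:
            best_class, best_score = c, hmean
    return best_class
```

Because the harmonic mean is dominated by the smallest distance, one close local mean is enough to favor a class, which is the outlier-robustness argued for in the abstract.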
2

Widyadhana, Arya, Cornelius Bagus Purnama Putra, Rarasmaya Indraswari, and Agus Zainal Arifin. "A Bonferroni Mean Based Fuzzy K Nearest Centroid Neighbor Classifier." Jurnal Ilmu Komputer dan Informasi 14, no. 1 (2021): 65–71. http://dx.doi.org/10.21609/jiki.v14i1.959.

Abstract:
K-nearest neighbor (KNN) is an effective nonparametric classifier that determines the neighbors of a point based only on distance proximity. The classification performance of KNN is disadvantaged by the presence of outliers in small sample size datasets, and its performance deteriorates on datasets with class imbalance. We propose a local Bonferroni Mean based Fuzzy K-Nearest Centroid Neighbor (BM-FKNCN) classifier that assigns the class label of a query sample based on the nearest local centroid mean vector, to better represent the underlying statistics of the dataset. The proposed classifier is
3

Shaul, Hayim, Dan Feldman, and Daniela Rus. "Secure k-ish Nearest Neighbors Classifier." Proceedings on Privacy Enhancing Technologies 2020, no. 3 (2020): 42–61. http://dx.doi.org/10.2478/popets-2020-0045.

Abstract:
The k-nearest neighbors (kNN) classifier predicts the class of a query, q, by taking the majority class of its k neighbors in an existing (already classified) database, S. In secure kNN, q and S are owned by two different parties and q is classified without sharing data. In this work we present a classifier based on kNN that is more efficient to implement with homomorphic encryption (HE). The efficiency of our classifier comes from a relaxation we make to consider κ nearest neighbors for κ ≈ k, with probability that increases as the statistical distance between Gaussian and the distributi
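The plain kNN rule that this work relaxes, taking the majority class among the k points of a database S nearest to a query q, can be sketched in a few lines. This is only the illustrative cleartext baseline; the paper's contribution is running such a classifier under homomorphic encryption, which is not shown here.

```python
from collections import Counter
import numpy as np

def knn_predict(S_X, S_y, q, k=3):
    """Classic kNN: return the majority class among the k points of the
    database (S_X, S_y) nearest to the query q (Euclidean distance)."""
    d = np.linalg.norm(S_X - q, axis=1)
    nearest_labels = S_y[np.argsort(d)[:k]]
    return Counter(nearest_labels.tolist()).most_common(1)[0][0]
```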
4

Onyezewe, Anozie, Armand F. Kana, Fatimah B. Abdullahi, and Aminu O. Abdulsalami. "An Enhanced Adaptive k-Nearest Neighbor Classifier Using Simulated Annealing." International Journal of Intelligent Systems and Applications 13, no. 1 (2021): 34–44. http://dx.doi.org/10.5815/ijisa.2021.01.03.

Abstract:
The k-Nearest Neighbor classifier is a non-complex and widely applied data classification algorithm which does well in real-world applications. The overall classification accuracy of the k-Nearest Neighbor algorithm largely depends on the choice of the number of nearest neighbors (k). Using a constant k value does not always yield the best solutions, especially for real-world datasets with an irregular class and density distribution of data points, as it totally ignores the class and density distribution of a test point's k-environment or neighborhood. A resolution to this problem is to dyna
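A much simpler alternative to the per-point adaptive scheme described above is to tune a single global k on the training data, for example by leave-one-out accuracy. The sketch below shows that baseline only; it does not implement the paper's simulated-annealing adaptation, and the function name is hypothetical.

```python
from collections import Counter
import numpy as np

def loo_best_k(X, y, candidate_ks):
    """Pick k by leave-one-out accuracy: classify each training point
    from its k nearest *other* points and keep the best-scoring k."""
    def predict(i, k):
        d = np.linalg.norm(X - X[i], axis=1)
        order = [j for j in np.argsort(d) if j != i][:k]  # exclude the point itself
        return Counter(y[order].tolist()).most_common(1)[0][0]
    scores = {k: np.mean([predict(i, k) == y[i] for i in range(len(X))])
              for k in candidate_ks}
    return max(scores, key=scores.get)
```

A global k tuned this way still ignores local class and density structure, which is exactly the limitation the cited paper targets.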
5

Mendes Júnior, Pedro R., Roberto M. de Souza, Rafael de O. Werneck, et al. "Nearest neighbors distance ratio open-set classifier." Machine Learning 106, no. 3 (2016): 359–86. http://dx.doi.org/10.1007/s10994-016-5610-8.

6

Puchkin, Nikita, and Vladimir Spokoiny. "An adaptive multiclass nearest neighbor classifier." ESAIM: Probability and Statistics 24 (2020): 69–99. http://dx.doi.org/10.1051/ps/2019021.

Abstract:
We consider a problem of multiclass classification, where the training sample S_n = {(X_i, Y_i)}_{i=1}^n is generated from the model P(Y = m | X = x) = η_m(x), 1 ≤ m ≤ M, and η_1(x), …, η_M(x) are unknown α-Hölder continuous functions. Given a test point X, our goal is to predict its label. A widely used k-nearest-neighbors classifier constructs estimates of η_1(X), …, η_M(X) and uses a plug-in rule for the prediction. However, it requires a proper choice of the smoothing parameter k, which may become tricky in some situations. We fix several integers n_1, …, n_K, compute corresponding n_k-nearest-neighbor estim
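The plug-in rule mentioned in the abstract, estimating each η_m(X) by the class frequency among the k nearest neighbors and predicting the argmax, can be sketched as follows. This is an illustration of the standard plug-in kNN rule only, not the authors' adaptive aggregation over several n_k.

```python
import numpy as np

def knn_plug_in(X_train, y_train, x, k, classes):
    """Plug-in rule: estimate eta_m(x) as the fraction of class-m points
    among the k nearest neighbors of x, then predict the argmax."""
    d = np.linalg.norm(X_train - x, axis=1)
    labels = y_train[np.argsort(d)[:k]]
    eta = {m: float(np.mean(labels == m)) for m in classes}
    return max(eta, key=eta.get), eta
```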
7

Mitiche, A., and J. K. Aggarwal. "Pattern Category Assignment by Neural Networks and Nearest Neighbors Rule: A Synopsis and a Characterization." International Journal of Pattern Recognition and Artificial Intelligence 10, no. 05 (1996): 393–408. http://dx.doi.org/10.1142/s0218001496000268.

Abstract:
The purpose of this paper is two-fold: to give a synoptic description of favored neural networks and to characterize the potency of these neural networks as pattern classifiers, against the background of the familiar nearest neighbors classification. We limit the study to those neural network structures most commonly used for pattern classification: the multilayer perceptron, the Kohonen associative memory, and the Carpenter–Grossberg clustering network, for which we give a tutorial description with the aim of making the driving concepts apparent. The nearest neighbors rule is presented with i
8

Roh, Seok-Beom, and Tae-Chon Ahn. "Design of Lazy Classifier based on Fuzzy k-Nearest Neighbors and Reconstruction Error." Journal of Korean Institute of Intelligent Systems 20, no. 1 (2010): 101–8. http://dx.doi.org/10.5391/jkiis.2010.20.1.101.

9

Wen, Zhi-qiang, Yong-xiang Hu, and Wen-qiu Zhu. "k-nearest neighbors classifier over manifolds." Journal of Computer Applications 32, no. 12 (2013): 3311–14. http://dx.doi.org/10.3724/sp.j.1087.2012.03311.

10

Ati, Indri, and Ari Kusyanti. "Metode Ensemble Classifier untuk Mendeteksi Jenis Attention Deficit Hyperactivity Disorder (SDHD) pada Anak Usia Dini." Jurnal Teknologi Informasi dan Ilmu Komputer 6, no. 3 (2019): 301. http://dx.doi.org/10.25126/jtiik.2019631313.

Abstract:
In the early developmental period, some children face obstacles such as difficulty staying still, concentrating, and controlling their behavior; a child who has trouble focusing attention and controlling appropriate behavior may have ADHD (Attention Deficit Hyperactivity Disorder). This is a serious problem, because children with ADHD experience social and emotional behavioral problems and learning difficulties at school, which affect their development into adulthood. It is therefore necessary to know the…

Dissertations / Theses on the topic "Nearest neighbors classifier"

1

Bernardina, Philipe Dalla. "PCA-tree: uma proposta para indexação multidimensional." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-29082007-114522/.

Abstract:
With the emergence of applications requiring representations in multidimensional spaces, the need arose to develop efficient access methods for data represented in R^d. Among the precursor applications of multidimensional access methods are geoprocessing systems, 3D applications, and simulators. Later, multidimensional access methods also proved to be an important tool in the design of classifiers, especially nearest-neighbor classifiers. With this, the representation space expanded…
2

Gupta, Nidhi. "Mutual k Nearest Neighbor based Classifier." University of Cincinnati / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1289937369.

3

Bermejo, Sánchez Sergio. "Learning with nearest neighbour classifiers." Doctoral thesis, Universitat Politècnica de Catalunya, 2000. http://hdl.handle.net/10803/6323.

Abstract:
Extraordinary award (ex aequo) in Electronics and Telecommunications, 1999–2000 call. Nearest Neighbour (NN) classifiers are one of the most celebrated algorithms in machine learning. In recent years, interest in these methods has flourished again in several fields (including statistics, machine learning and pattern recognition) since, in spite of their simplicity, they prove to be powerful non-parametric classification systems in real-world problems. The present work is mainly devoted to the development of new learning algorithms for these classifiers and is focused on the foll
4

Reeder, John. "Hilbert Space Filling Curve (HSFC) Nearest Neighbor Classifier." Honors in the Major Thesis, University of Central Florida, 2005. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/794.

Abstract:
This item is only available in print in the UCF Libraries. If this is your Honors Thesis, you can help us make it available online for use by researchers around the world by following the instructions on the distribution consent form at http://library.ucf. Bachelors; Engineering and Computer Science; Computer Engineering.
5

Kumar, Raja. Reducing the Computational Requirements of the Nearest Neighbor Classifier. München: GRIN Verlag, 2019. http://d-nb.info/1193490804/34.

6

LOPES, Marcus Vinicius de Sousa. "Aplicação de classificadores para determinação de conformidade de biodiesel." Universidade Federal do Maranhão, 2017. http://tedebc.ufma.br:8080/jspui/handle/tede/1896.

Abstract:
The growing demand for energy and the limitations of oil reserves have led to the search for renewable and sustainable energy sources to replace, even partially, fossil fuels. Biodiesel has become in recent decades the ma
7

Mestre, Ricardo Jorge Palheira. "Improvements on the KNN classifier." Master's thesis, Faculdade de Ciências e Tecnologia, 2013. http://hdl.handle.net/10362/10923.

Abstract:
Dissertation for the degree of Master in Computer Engineering. Object classification is an important area within artificial intelligence, with applications extending to many fields, scientific or otherwise. Among other classifiers, the K-nearest neighbor (KNN) is among the simplest and most accurate, especially in environments where the data distribution is unknown or apparently not parameterizable. This algorithm assigns to the element being classified the majority class among its K nearest neighbors. According to the original algorithm, this classification implies the
8

Hatko, Stan. "k-Nearest Neighbour Classification of Datasets with a Family of Distances." Thesis, Université d'Ottawa / University of Ottawa, 2015. http://hdl.handle.net/10393/33361.

Abstract:
The k-nearest neighbour (k-NN) classifier is one of the oldest and most important supervised learning algorithms for classifying datasets. Traditionally the Euclidean norm is used as the distance for the k-NN classifier. In this thesis we investigate the use of alternative distances for the k-NN classifier. We start by introducing some background notions in statistical machine learning. We define the k-NN classifier and discuss Stone's theorem and the proof that k-NN is universally consistent on the normed space R^d. We then prove that k-NN is universally consistent if we take a sequence of
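Swapping the Euclidean norm for another member of a family of distances is mechanically simple; the sketch below runs kNN under a Minkowski p-distance for an arbitrary p ≥ 1. This illustrates the general idea of alternative distances only, not the specific sequence of distances studied in the thesis.

```python
from collections import Counter
import numpy as np

def knn_predict_minkowski(X_train, y_train, x, k=3, p=2.0):
    """kNN under a Minkowski p-distance, d(a, b) = (sum_i |a_i - b_i|^p)^(1/p),
    instead of the usual Euclidean (p = 2) norm."""
    d = (np.abs(X_train - x) ** p).sum(axis=1) ** (1.0 / p)
    nearest = y_train[np.argsort(d)[:k]]
    return Counter(nearest.tolist()).most_common(1)[0][0]
```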
9

Neo, TohKoon. "A Direct Algorithm for the K-Nearest-Neighbor Classifier via Local Warping of the Distance Metric." Diss., CLICK HERE for online access, 2007. http://contentdm.lib.byu.edu/ETD/image/etd2168.pdf.

10

Neo, Toh Koon Charlie. "A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric /." Diss., CLICK HERE for online access, 2007. http://contentdm.lib.byu.edu/ETD/image/etd2168.pdf.


Books on the topic "Nearest neighbors classifier"

1

Baillo, Amparo, Antonio Cuevas, and Ricardo Fraiman. Classification methods for functional data. Edited by Frédéric Ferraty and Yves Romain. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199568444.013.10.

Abstract:
This article reviews the literature concerning supervised and unsupervised classification of functional data. It first explains the meaning of unsupervised classification vs. supervised classification before discussing the supervised classification problem in the infinite-dimensional case, showing that its formal statement generally coincides with that of discriminant analysis in the classical multivariate case. It then considers the optimal classifier and plug-in rules, empirical risk and empirical minimization rules, linear discrimination rules, the k nearest neighbor (k-NN) method, and kern

Book chapters on the topic "Nearest neighbors classifier"

1

Ayad, Hanan, and Mohamed Kamel. "Finding Natural Clusters Using Multi-clusterer Combiner Based on Shared Nearest Neighbors." In Multiple Classifier Systems. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/3-540-44938-8_17.

2

Shen, Xiang-Jun, Wen-Chao Zhang, Wei Cai, et al. "Building Locally Discriminative Classifier Ensemble Through Classifier Fusion Among Nearest Neighbors." In Lecture Notes in Computer Science. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-48890-5_21.

3

Mendialdua, I., B. Sierra, E. Lazkano, I. Irigoien, and E. Jauregi. "Surrounding Influenced K-Nearest Neighbors: A New Distance Based Classifier." In Advanced Data Mining and Applications. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-17316-5_26.

4

Hoque, Nazrul, Dhruba K. Bhattacharyya, and Jugal K. Kalita. "KNN-DK: A Modified K-NN Classifier with Dynamic k Nearest Neighbors." In Advances in Applications of Data-Driven Computing. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-33-6919-1_2.

5

Sokołowska, Beata, Teresa Sadura-Sieklucka, Leszek Czerwosz, Marta Hallay-Suszek, Bogdan Lesyng, and Krystyna Księżopolska-Orłowska. "Estimation of Posturographic Trajectory Using k-Nearest Neighbors Classifier in Patients with Rheumatoid Arthritis and Osteoarthritis." In Advances in Experimental Medicine and Biology. Springer International Publishing, 2018. http://dx.doi.org/10.1007/5584_2018_150.

6

Gupta, Subhash Chandra, and Noopur Goel. "Selection of Best K of K-Nearest Neighbors Classifier for Enhancement of Performance for the Prediction of Diabetes." In Advances in Intelligent Systems and Computing. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-33-4299-6_11.

7

Zhou, Yonglei, Changshui Zhang, and Jingchun Wang. "Tunable Nearest Neighbor Classifier." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-28649-3_25.

8

Kubat, Miroslav. "Similarities: Nearest-Neighbor Classifiers." In An Introduction to Machine Learning. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-63913-0_3.

9

Kubat, Miroslav. "Similarities: Nearest-Neighbor Classifiers." In An Introduction to Machine Learning. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20010-1_3.

10

Kubat, Miroslav. "Similarities: Nearest-Neighbor Classifiers." In An Introduction to Machine Learning. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81935-4_3.


Conference papers on the topic "Nearest neighbors classifier"

1

Derrac, Joaquin, Francisco Chiclana, Salvador Garcia, and Francisco Herrera. "An Interval Valued K-Nearest Neighbors Classifier." In 2015 Conference of the International Fuzzy Systems Association and the European Society for Fuzzy Logic and Technology (IFSA-EUSFLAT-15). Atlantis Press, 2015. http://dx.doi.org/10.2991/ifsa-eusflat-15.2015.55.

2

Gao, Yunlong, Jin-yan Pan, and Feng Gao. "Improved boosting algorithm through weighted k-nearest neighbors classifier." In 2010 3rd IEEE International Conference on Computer Science and Information Technology (ICCSIT 2010). IEEE, 2010. http://dx.doi.org/10.1109/iccsit.2010.5563551.

3

Yu, Xiao-gao, and Xiao-peng Yu. "The Research on An Adaptive K-Nearest Neighbors Classifier." In Proceedings of 2006 International Conference on Machine Learning and Cybernetics. IEEE, 2006. http://dx.doi.org/10.1109/icmlc.2006.258646.

4

Yu, Xiaopeng, and Xiaogao yu. "The Research on an Adaptive k-Nearest Neighbors Classifier." In 2006 5th IEEE International Conference on Cognitive Informatics. IEEE, 2006. http://dx.doi.org/10.1109/coginf.2006.365542.

5

Giri, Animesh, M. Vignesh V. Bhagavath, Bysani Pruthvi, and Naini Dubey. "A Placement Prediction System using k-nearest neighbors classifier." In 2016 Second International Conference on Cognitive Computing and Information Processing (CCIP). IEEE, 2016. http://dx.doi.org/10.1109/ccip.2016.7802883.

6

Allogba, Stephanie, and Christine Tremblay. "K-Nearest Neighbors Classifier for Field Bit Error Rate Data." In 2018 Asia Communications and Photonics Conference (ACP). IEEE, 2018. http://dx.doi.org/10.1109/acp.2018.8596133.

7

Shih, Yu-Hsin, and Chuan-Kang Ting. "Evolutionary Optimization on k-Nearest Neighbors Classifier for Imbalanced Datasets." In 2019 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2019. http://dx.doi.org/10.1109/cec.2019.8789921.

8

Kuo, B. C., H. H. Ho, T. W. Sheu, and S. C. Shih. "A Novel K Nearest Neighbors Classifier Based on Nonparametric Separability." In 2006 IEEE International Symposium on Geoscience and Remote Sensing. IEEE, 2006. http://dx.doi.org/10.1109/igarss.2006.704.

9

Liu, Hui, and Chuanyan Zhang. "A Web entity activity recognition approach based on k-nearest neighbors classifier." In 2012 2nd International Conference on Consumer Electronics, Communications and Networks (CECNet). IEEE, 2012. http://dx.doi.org/10.1109/cecnet.2012.6202019.

10

Rachmawanto, Eko Hari, Christy Atika Sari, Rivalda Villadelfiya, et al. "Eggs Classification based on Egg Shell Image using K-Nearest Neighbors Classifier." In 2020 International Seminar on Application for Technology of Information and Communication (iSemantic). IEEE, 2020. http://dx.doi.org/10.1109/isemantic50169.2020.9234305.


Reports on the topic "Nearest neighbors classifier"

1

Buhl, M. R., G. A. Clark, J. V. Candy, and G. H. Thomas. Detection of "single-leg separated" heart valves using statistical pattern recognition with the nearest neighbor classifier. Office of Scientific and Technical Information (OSTI), 1993. http://dx.doi.org/10.2172/10177333.

2

Buhl, M. R., G. A. Clark, J. V. Candy, and G. H. Thomas. Detection of "single-leg separated" heart valves using statistical pattern recognition with the nearest neighbor classifier. Revision 1. Office of Scientific and Technical Information (OSTI), 1993. http://dx.doi.org/10.2172/10117041.

3

Barnes, Christopher F. Feasibility Studies of Nearest Neighbor Residual Vector Quantizer Classifiers for a Collection of Signal and Sensor Waveforms. Defense Technical Information Center, 1997. http://dx.doi.org/10.21236/ada319600.

4

Barnes, Christopher F., and Byron M. Keel. Feasibility Studies of Nearest Neighbor Residual Vector Quantizer Classifiers for a Collection of Signal and Sensor Waveforms: Automatic Target Recognition in SAR Images. Defense Technical Information Center, 1998. http://dx.doi.org/10.21236/ada333408.
