Academic literature on the topic 'K-nearest-neighbor'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'K-nearest-neighbor.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "K-nearest-neighbor"

1

Peterson, Leif. "K-nearest neighbor." Scholarpedia 4, no. 2 (2009): 1883. http://dx.doi.org/10.4249/scholarpedia.1883.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wilujeng, Dian Tri, Mohamat Fatekurohman, and I. Made Tirta. "Analisis Risiko Kredit Perbankan Menggunakan Algoritma K-Nearest Neighbor dan Nearest Weighted K-Nearest Neighbor." Indonesian Journal of Applied Statistics 5, no. 2 (2023): 142. http://dx.doi.org/10.13057/ijas.v5i2.58426.

Full text
Abstract:
A bank is a business entity that collects public funds in the form of savings and distributes them back to the public as credit or in other forms. Credit risk analysis can be done in various ways, such as marketing analysis and big data analysis using machine learning. One example of a machine learning algorithm is K-Nearest Neighbor (KNN), and a development of the K-Nearest Neighbor algorithm is Neighbor Weighted K-Nearest Neighbor (NWKNN). The K-Nearest Neighbor (KNN) algorithm is one of the machine learning methods that can be used to facilitate the classification…
APA, Harvard, Vancouver, ISO, and other styles
3

Ertuğrul, Ömer Faruk, and Mehmet Emin Tağluk. "A novel version of k nearest neighbor: Dependent nearest neighbor." Applied Soft Computing 55 (June 2017): 480–90. http://dx.doi.org/10.1016/j.asoc.2017.02.020.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Syaliman, Khairul Umam, M. Zulfahmi, and Aldi Abdillah Nababan. "Perbandingan Rapid Centroid Estimation (RCE) — K Nearest Neighbor (K-NN) Dengan K Means — K Nearest Neighbor (K-NN)." InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 2, no. 1 (2017): 79–89. http://dx.doi.org/10.30743/infotekjar.v2i1.166.

Full text
Abstract:
Clustering techniques have been shown to improve classification accuracy, especially for the K-Nearest Neighbor (K-NN) algorithm. The data of each class form K clusters, and the final centroid value of each cluster in each class is then used as the reference data for classification with the K-NN algorithm. The drawback of many clustering techniques, however, is their high computational cost; Rapid Centroid Estimation (RCE) and K-Means are clustering techniques with a low computational cost. To see which of the two algorithms…
APA, Harvard, Vancouver, ISO, and other styles
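The two-stage scheme this entry compares (cluster each class, then classify against the resulting centroids rather than the raw data) can be sketched in plain Python. This is a minimal illustration under assumptions of my own: `kmeans` is a bare-bones Lloyd's iteration, not the RCE or K-Means implementations the paper benchmarks, and the function names and toy data are hypothetical.

```python
import random
from math import dist

def kmeans(points, k, iters=20, seed=0):
    """Bare-bones Lloyd's k-means; returns the final centroids."""
    centroids = random.Random(seed).sample(points, k)
    for _ in range(iters):
        # Assign each point to its closest current centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda c: dist(p, centroids[c]))].append(p)
        # Recompute each centroid as its cluster mean (keep old one if empty).
        centroids = [tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centroids[c]
                     for c, cl in enumerate(clusters)]
    return centroids

def centroid_reference_set(train, labels, k_per_class=2):
    """Replace each class's raw points with its cluster centroids."""
    ref, ref_labels = [], []
    for cls in sorted(set(labels)):
        pts = [p for p, lab in zip(train, labels) if lab == cls]
        for c in kmeans(pts, min(k_per_class, len(pts))):
            ref.append(c)
            ref_labels.append(cls)
    return ref, ref_labels

def classify(ref, ref_labels, query):
    """Nearest-neighbor vote against the reduced centroid set."""
    return ref_labels[min(range(len(ref)), key=lambda i: dist(ref[i], query))]
```

The point of the reduction is that the final K-NN step now scans a handful of centroids per class instead of the whole training set, which is where the cheap clustering cost of RCE or K-Means pays off.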
5

Wu, Yingquan, Krassimir Ianakiev, and Venu Govindaraju. "Improved k-nearest neighbor classification." Pattern Recognition 35, no. 10 (2002): 2311–18. http://dx.doi.org/10.1016/s0031-3203(01)00132-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chung, Yu-Chi, I.-Fang Su, Chiang Lee, and Pei-Chi Liu. "Multiple k nearest neighbor search." World Wide Web 20, no. 2 (2016): 371–98. http://dx.doi.org/10.1007/s11280-016-0392-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Yu, Zhiwen, Hantao Chen, Jiming Liu, Jane You, Hareton Leung, and Guoqiang Han. "Hybrid k-Nearest Neighbor Classifier." IEEE Transactions on Cybernetics 46, no. 6 (2016): 1263–75. http://dx.doi.org/10.1109/tcyb.2015.2443857.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Bezdek, James C., Siew K. Chuah, and David Leep. "Generalized k-nearest neighbor rules." Fuzzy Sets and Systems 18, no. 3 (1986): 237–56. http://dx.doi.org/10.1016/0165-0114(86)90004-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lee, Heesung. "K-Nearest Neighbor rule using Distance Information Fusion." Journal of Korean Institute of Intelligent Systems 28, no. 2 (2018): 160–63. http://dx.doi.org/10.5391/jkiis.2018.28.2.160.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Algamal, Zakariya, Shaimaa Mahmood, and Ghalia Basheer. "Classification of Chronic Kidney Disease Data via Three Algorithms." Journal of Al-Rafidain University College For Sciences (Print ISSN: 1681-6870, Online ISSN: 2790-2293), no. 1 (October 1, 2021): 414–20. http://dx.doi.org/10.55562/jrucs.v46i1.92.

Full text
Abstract:
Pattern recognition can be defined as the classification of data based on knowledge already gained or on statistical information extracted from patterns. The classification of objects is an important area for research and application in a variety of fields. In this paper, k-Nearest Neighbor, Fuzzy k-Nearest Neighbor and Modified k-Nearest Neighbor algorithms are used to classify the chronic kidney disease (CKD) data with different choices of the value k. The experiment results prove that the Fuzzy k-Nearest Neighbor and Modified k-Nearest Neighbor algorithms are very effective for classifying CKD…
APA, Harvard, Vancouver, ISO, and other styles
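The baseline that the fuzzy and modified variants in this paper build on is plain majority-vote k-NN. A minimal standard-library Python sketch (the function name, toy data, and default k=3 are illustrative assumptions, not taken from the paper):

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_classify(train, labels, query, k=3):
    """Majority vote among the k training points nearest to `query`."""
    nearest = sorted(range(len(train)), key=lambda i: dist(train[i], query))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]
```

With train = [(0, 0), (0, 1), (5, 5), (6, 5)] and labels = ["a", "a", "b", "b"], knn_classify(train, labels, (0.5, 0.5)) returns "a". Varying k, as the CKD study does, changes which neighbors get a vote and therefore the decision boundary.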

Dissertations / Theses on the topic "K-nearest-neighbor"

1

Gupta, Nidhi. "Mutual k Nearest Neighbor based Classifier." University of Cincinnati / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1289937369.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Olsson, Christoffer. "Love Thy Neighbor : The Connectivity of the k-nearest Neighbor Graph." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-149639.

Full text
Abstract:
The topic of this thesis is the connectivity of the k-nearest neighbor random geometric graph model. The main result is an expository proof of the fact that there is a critical constant for connectivity. In addition to this, other results related to the connectivity of the k-nearest neighbor model, as well as the closely related Gilbert disc model, are discussed.
APA, Harvard, Vancouver, ISO, and other styles
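The object studied here, the k-nearest neighbor graph, joins each point to its k nearest points. A minimal construction of the directed version in plain Python (a hypothetical sketch; connectivity questions like the thesis's concern the undirected graph obtained by ignoring edge direction):

```python
from math import dist

def knn_graph(points, k):
    """Directed k-NN graph: an edge i -> j iff j is among i's k nearest points."""
    edges = {}
    for i, p in enumerate(points):
        # Every other vertex, ordered by distance from point i.
        neighbors = sorted((j for j in range(len(points)) if j != i),
                           key=lambda j: dist(p, points[j]))
        edges[i] = neighbors[:k]
    return edges
```

Note that the relation is not symmetric: a point can be among another's k nearest without the converse holding, which is part of what makes the connectivity analysis nontrivial.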
3

Dixit, Siddharth. "Density Based Clustering using Mutual K-Nearest Neighbors." University of Cincinnati / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1447690719.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Chanzy, Philippe. "Range search and nearest neighbor search in k-d trees." Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=68164.

Full text
Abstract:
This thesis presents an analysis of the expected complexity of range searching and nearest neighbor searching in random 2-d trees. We show that range searching in a random rectangle $\Delta_x \times \Delta_y$ can be done in $O[\Delta_x \Delta_y n + (\Delta_x + \Delta_y)n^{\alpha} + \ln n]$ expected time. A matching lower bound is also obtained. We also show that nearest neighbor searching in random 2-d trees by any algorithm must take time bounded by $\Omega[n^{\alpha - 1/2}/(\ln n)^{\alpha}]$ where $\alpha = (\sqrt{17} - 3)/2$. This disproves a…
APA, Harvard, Vancouver, ISO, and other styles
5

Kuhlman, Caitlin Anne. "Pivot-based Data Partitioning for Distributed k Nearest Neighbor Mining." Digital WPI, 2017. https://digitalcommons.wpi.edu/etd-theses/1212.

Full text
Abstract:
This thesis addresses the need for a scalable distributed solution for k-nearest-neighbor (kNN) search, a fundamental data mining task. This unsupervised method poses particular challenges on shared-nothing distributed architectures, where global information about the dataset is not available to individual machines. The distance to search for neighbors is not known a priori, and therefore a dynamic data partitioning strategy is required to guarantee that exact kNN can be found autonomously on each machine. Pivot-based partitioning has been shown to facilitate bounding of partitions, however st
APA, Harvard, Vancouver, ISO, and other styles
6

Wong, Wing Sing. "K-nearest-neighbor queries with non-spatial predicates on range attributes /." View abstract or full-text, 2005. http://library.ust.hk/cgi/db/thesis.pl?COMP%202005%20WONGW.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

CALENDER, CHRISTOPHER R. "APPROXIMATE N-NEAREST NEIGHBOR CLUSTERING ON DISTRIBUTED DATABASES USING ITERATIVE REFINEMENT." University of Cincinnati / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1092929952.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Aikes, Junior Jorge. "Estudo da influência de diversas medidas de similaridade na previsão de séries temporais utilizando o algoritmo KNN-TSP." Universidade Estadual do Oeste do Parana, 2012. http://tede.unioeste.br:8080/tede/handle/tede/1084.

Full text
Abstract:
Time series can be understood as any set of observations which are time ordered. Among the many possible tasks applicable to temporal data, one that has attracted increasing interest, due to its various applications, is time series forecasting. The k-Nearest Neighbor Time Series Prediction (kNN-TSP) algorithm is a non-parametric method for forecasting time series. One of its advantages is…
APA, Harvard, Vancouver, ISO, and other styles
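The kNN-TSP idea this thesis varies: treat the most recent window of the series as a query, find the k past windows most similar to it under some similarity measure, and forecast from the values that followed them. A minimal sketch assuming Euclidean distance and a mean of successors (the thesis's point is precisely that other similarity measures can be substituted for `dist`; names and parameters here are illustrative):

```python
from math import dist

def knn_tsp_forecast(series, window, k):
    """Forecast the next value: average the successors of the k past
    windows most similar to the latest `window` observations."""
    query = tuple(series[-window:])
    # Score every past window that has a known successor value.
    candidates = sorted(
        (dist(tuple(series[s:s + window]), query), series[s + window])
        for s in range(len(series) - window)
    )
    return sum(succ for _, succ in candidates[:k]) / k
```

For a periodic series such as [1, 2, 3, 1, 2, 3, 1, 2] with window=2 and k=1, the closest past window to the trailing (1, 2) is an earlier (1, 2), so the forecast is 3.0.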
9

Ozsakabasi, Feray. "Classification Of Forest Areas By K Nearest Neighbor Method: Case Study, Antalya." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609548/index.pdf.

Full text
Abstract:
Among the various remote sensing methods that can be used to map forest areas, the K Nearest Neighbor (KNN) supervised classification method is becoming increasingly popular for creating forest inventories in some countries. In this study, the utility of the KNN algorithm is evaluated for forest/non-forest/water stratification. Antalya is selected as the study area. The data used are composed of Landsat TM and Landsat ETM satellite images, acquired in 1987 and 2002, respectively, SRTM 90 meters digital elevation model (DEM) and land use data from the year 2003. The accuracies of different modi
APA, Harvard, Vancouver, ISO, and other styles
10

Forsberg, Tom-Henrik, and Johan Sundström. "A comparative study of k nearest neighbor computations on GPUs and CPUs." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208362.

Full text
Abstract:
The problem of nearest neighbors search arises in many areas of computer science. This search could see a performance boost if implemented on a GPU, utilizing its multithreading capabilities, compared to the sequential execution on a CPU. The purpose of this report is to investigate how a simple GPU implementation of k nearest neighbors search compares to other common algorithms on a CPU. A brute force search on the GPU was implemented using NVIDIA's CUDA platform, and its performance was tested and compared to implementations from an existing library of both exact brute force search and approximate…
APA, Harvard, Vancouver, ISO, and other styles
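The common baseline in such comparisons is exhaustive search: compute the distance from the query to every point and keep the k smallest. A CPU reference version in plain Python, a hypothetical sketch of the idea rather than the thesis's implementation (the GPU variant parallelizes exactly this distance loop):

```python
import heapq
from math import dist

def knn_search(points, query, k):
    """Brute-force k-NN search: one distance per point; a heap keeps the k best.
    Returns the indices of the k nearest points, closest first."""
    scored = ((dist(p, query), i) for i, p in enumerate(points))
    return [i for _, i in heapq.nsmallest(k, scored)]
```

The work is O(n) distance evaluations per query with no index to build, which is why brute force maps so naturally onto a GPU thread per point.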

Books on the topic "K-nearest-neighbor"

1

Sidhu, Jagpreet, and Arvinder Kaur. Natural Language Processing: A Machine Learning Approach to Sense Tagged Words Using K-Nearest Neighbor. GRIN Verlag GmbH, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Baillo, Amparo, Antonio Cuevas, and Ricardo Fraiman. Classification methods for functional data. Edited by Frédéric Ferraty and Yves Romain. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199568444.013.10.

Full text
Abstract:
This article reviews the literature concerning supervised and unsupervised classification of functional data. It first explains the meaning of unsupervised classification vs. supervised classification before discussing the supervised classification problem in the infinite-dimensional case, showing that its formal statement generally coincides with that of discriminant analysis in the classical multivariate case. It then considers the optimal classifier and plug-in rules, empirical risk and empirical minimization rules, linear discrimination rules, the k nearest neighbor (k-NN) method, and kern
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "K-nearest-neighbor"

1

Li, Hang. "K-Nearest Neighbor." In Machine Learning Methods. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3917-6_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Huang, Xiaowei, Gaojie Jin, and Wenjie Ruan. "K-Nearest Neighbor." In Artificial Intelligence: Foundations, Theory, and Algorithms. Springer Nature Singapore, 2012. http://dx.doi.org/10.1007/978-981-19-6814-3_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Shekhar, Shashi, and Hui Xiong. "K-Nearest Neighbor Query." In Encyclopedia of GIS. Springer US, 2008. http://dx.doi.org/10.1007/978-0-387-35973-1_669.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Mucherino, Antonio, Petraq J. Papajorgji, and Panos M. Pardalos. "k-Nearest Neighbor Classification." In Data Mining in Agriculture. Springer New York, 2009. http://dx.doi.org/10.1007/978-0-387-88615-2_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Selle, Stefan. "Modellierung mit k-Nearest Neighbor." In Data Science Training - Supervised Learning. Springer Berlin Heidelberg, 2024. https://doi.org/10.1007/978-3-662-67960-9_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Steele, Brian, John Chandler, and Swarna Reddy. "k-Nearest Neighbor Prediction Functions." In Algorithms for Data Science. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-45797-0_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Chávez, Edgar, and Eric Sadit Tellez. "Navigating K-Nearest Neighbor Graphs to Solve Nearest Neighbor Searches." In Advances in Pattern Recognition. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15992-3_29.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Borgohain, Olimpia, Meghna Dasgupta, Piyush Kumar, and Gitimoni Talukdar. "Performance Analysis of Nearest Neighbor, K-Nearest Neighbor and Weighted K-Nearest Neighbor for the Classification of Alzheimer Disease." In Advances in Intelligent Systems and Computing. Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-7394-1_28.

Full text
APA, Harvard, Vancouver, ISO, and other styles
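The three classifiers compared in this chapter differ only in how neighbors vote: 1-NN takes the single closest label, k-NN counts each of the k neighbors equally, and weighted k-NN lets closer neighbors count for more. A common distance-weighted variant, sketched here with inverse-distance weights as an assumption (the chapter's exact weighting scheme is not reproduced):

```python
from math import dist

def weighted_knn(train, labels, query, k=3, eps=1e-9):
    """Each of the k nearest neighbors votes with weight 1 / (distance + eps)."""
    nearest = sorted(range(len(train)), key=lambda i: dist(train[i], query))[:k]
    scores = {}
    for i in nearest:
        w = 1.0 / (dist(train[i], query) + eps)  # eps guards against zero distance
        scores[labels[i]] = scores.get(labels[i], 0.0) + w
    return max(scores, key=scores.get)
```

With train = [(0, 0), (4, 0), (5, 0)] and labels = ["a", "b", "b"], a query at (1, 0) with k=3 returns "a": the single nearby "a" outweighs the two distant "b" votes, where a plain majority vote would decide differently.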
9

Wu, Yingquan, Krasimir G. Ianakiev, and Venu Govindaraju. "Improvements in K-Nearest Neighbor Classification." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44732-6_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Emrich, Tobias, Hans-Peter Kriegel, Peer Kröger, Johannes Niedermayer, Matthias Renz, and Andreas Züfle. "Reverse-k-Nearest-Neighbor Join Processing." In Advances in Spatial and Temporal Databases. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-40235-7_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "K-nearest-neighbor"

1

Li, Xinze, Hanan Al-Tous, Salah Eddine Hajri, and Olav Tirkkonen. "Enhanced Weighted K-Nearest Neighbor Positioning." In 2024 IEEE 99th Vehicular Technology Conference (VTC2024-Spring). IEEE, 2024. http://dx.doi.org/10.1109/vtc2024-spring62846.2024.10683493.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Nong, Jian, Jian Xu, Xi He, Wenge Li, and Hongben Huang. "Octree k-nearest neighbor parallel query." In Fourth International Conference on Computer Vision and Pattern Analysis (ICCPA 2024), edited by Ji Zhao and Yonghui Yang. SPIE, 2024. http://dx.doi.org/10.1117/12.3037976.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

B, Ramya, Abirami S, Ramya M, and Kulothunga Rajan R. "Fruit Calorie Measurement using K-Nearest Neighbor Algorithm." In 2024 3rd International Conference on Automation, Computing and Renewable Systems (ICACRS). IEEE, 2024. https://doi.org/10.1109/icacrs62842.2024.10841546.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Zhang, Hongqiu, Jinghui Yang, Qianyun Zhu, et al. "KNN (K-nearest neighbor) for G-LOC Prediction." In 2024 3rd International Conference on Health Big Data and Intelligent Healthcare (ICHIH). IEEE, 2024. https://doi.org/10.1109/ichih63459.2024.11064745.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Stap, David, and Christof Monz. "Multilingual k-Nearest-Neighbor Machine Translation." In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.emnlp-main.571.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bicego, M., and M. Loog. "Weighted K-Nearest Neighbor revisited." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7899872.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Saarinen, Sirpa, and George Cybenko. "Approximate k-nearest neighbor method." In Optical Engineering and Photonics in Aerospace Sensing, edited by Firooz A. Sadjadi. SPIE, 1993. http://dx.doi.org/10.1117/12.150579.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Okfalisa, Ikbal Gazalba, Mustakim, and Nurul Gayatri Indah Reza. "Comparative analysis of k-nearest neighbor and modified k-nearest neighbor algorithm for data classification." In 2017 2nd International Conferences on Information Technology, Information Systems and Electrical Engineering (ICITISEE). IEEE, 2017. http://dx.doi.org/10.1109/icitisee.2017.8285514.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Wang, Dexin, Kai Fan, Boxing Chen, and Deyi Xiong. "Efficient Cluster-Based k-Nearest-Neighbor Machine Translation." In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.acl-long.154.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Gao, Yunjun, Baihua Zheng, Gencai Chen, Wang-Chien Lee, Ken C. K. Lee, and Qing Li. "Visible Reverse k-Nearest Neighbor Queries." In 2009 IEEE 25th International Conference on Data Engineering (ICDE). IEEE, 2009. http://dx.doi.org/10.1109/icde.2009.201.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "K-nearest-neighbor"

1

Han, Euihong, George Karypis, and Vipin Kumar. Text Categorization Using Weight Adjusted k-Nearest Neighbor Classification. Defense Technical Information Center, 1999. http://dx.doi.org/10.21236/ada439688.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Searcy, Stephen W., and Kalman Peleg. Adaptive Sorting of Fresh Produce. United States Department of Agriculture, 1993. http://dx.doi.org/10.32747/1993.7568747.bard.

Full text
Abstract:
This project includes two main parts: development of a "Selective Wavelength Imaging Sensor" and an "Adaptive Classifier System," for adaptive imaging and sorting of agricultural products, respectively. Three different technologies were investigated for building a selectable wavelength imaging sensor: diffraction gratings, tunable filters and linear variable filters. Each technology was analyzed and evaluated as the basis for implementing the adaptive sensor. Acousto-optic tunable filters were found to be most suitable for the selective wavelength imaging sensor. Consequently, a selectable wave…
APA, Harvard, Vancouver, ISO, and other styles