Academic literature on the topic 'KNN classification'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'KNN classification.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "KNN classification"

1. Gweon, Hyukjun, Matthias Schonlau, and Stefan H. Steiner. "The k conditional nearest neighbor algorithm for classification and class probability estimation." PeerJ Computer Science 5 (May 13, 2019): e194. http://dx.doi.org/10.7717/peerj-cs.194.

Abstract:
The k nearest neighbor (kNN) approach is a simple and effective nonparametric algorithm for classification. One of the drawbacks of kNN is that the method can only give coarse estimates of class probabilities, particularly for low values of k. To avoid this drawback, we propose a new nonparametric classification method based on nearest neighbors conditional on each class: the proposed approach calculates the distance between a new instance and the kth nearest neighbor from each class, estimates posterior probabilities of class memberships using the distances, and assigns the instance to the cl…
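The kCNN idea summarized in this abstract — measure the distance from a new instance to the kth nearest neighbor within each class, then turn those per-class distances into posterior-like scores — can be sketched in a few lines. The inverse-distance normalization below is an illustrative assumption, not the estimator derived in the paper:

```python
import numpy as np

def kcnn_predict(X_train, y_train, x, k=3):
    """Sketch of k conditional nearest neighbor (kCNN) classification.

    For each class, compute the distance from x to its kth nearest member,
    then convert those distances into posterior-like scores (the scoring
    rule here is a simple illustrative choice, not the paper's formula).
    """
    classes = np.unique(y_train)
    kth_dist = {}
    for c in classes:
        d = np.sort(np.linalg.norm(X_train[y_train == c] - x, axis=1))
        kth_dist[c] = d[min(k, len(d)) - 1]  # distance to kth NN within class c
    # Smaller kth-NN distance -> stronger affinity to that class.
    inv = {c: 1.0 / (kth_dist[c] + 1e-12) for c in classes}
    total = sum(inv.values())
    probs = {c: v / total for c, v in inv.items()}
    return max(probs, key=probs.get), probs
```

Unlike plain kNN with a small k, this produces a smooth score for every class rather than a coarse vote count.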
2. Zhang, Shichao. "Cost-sensitive KNN classification." Neurocomputing 391 (May 2020): 234–42. http://dx.doi.org/10.1016/j.neucom.2018.11.101.
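Zhang's cost-sensitive constructions are developed in the article itself; as a minimal sketch of the general idea, one can estimate class probabilities from the k nearest neighbors and predict the class with the lowest expected misclassification cost. The cost-matrix convention (cost[i][j] = cost of predicting class i when class j is true) is an assumption for illustration:

```python
import numpy as np

def cost_sensitive_knn(X_train, y_train, x, cost, k=5):
    """Predict the class minimizing expected cost under kNN probability estimates.

    cost[i][j] is the (assumed) cost of predicting class i when class j is true.
    """
    idx = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    classes = np.unique(y_train)
    # Estimate P(class | x) as the fraction of each class among the k neighbors.
    p = np.array([np.mean(y_train[idx] == c) for c in classes])
    expected = cost @ p  # expected cost of each candidate prediction
    return classes[int(np.argmin(expected))]
```

With an asymmetric cost matrix, the minority class can win even when it loses the plain majority vote — the core behavior a cost-sensitive classifier is after.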
3. Zhao, Puning, and Lifeng Lai. "Efficient Classification with Adaptive KNN." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (2021): 11007–14. http://dx.doi.org/10.1609/aaai.v35i12.17314.

Abstract:
In this paper, we propose an adaptive kNN method for classification, in which different k are selected for different test samples. Our selection rule is easy to implement since it is completely adaptive and does not require any knowledge of the underlying distribution. The convergence rate of the risk of this classifier to the Bayes risk is shown to be minimax optimal for various settings. Moreover, under some special assumptions, the convergence rate is especially fast and does not decay with the increase of dimensionality.
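To make "different k for different test samples" concrete, here is a toy per-sample rule — emphatically not the selection rule from Zhao and Lai's paper — that counts every neighbor within a multiple of the distance to the nearest one and votes among them:

```python
import numpy as np
from collections import Counter

def adaptive_knn_predict(X_train, y_train, x, c=2.0, k_max=25):
    """Toy per-sample choice of k (a hypothetical rule for illustration only):
    include every neighbor within c times the distance to the nearest one,
    then take a majority vote among those neighbors."""
    d = np.linalg.norm(X_train - x, axis=1)
    order = np.argsort(d)
    radius = c * d[order[0]]
    k = int(np.clip(np.sum(d <= radius), 1, k_max))  # at least 1, at most k_max
    votes = y_train[order[:k]].tolist()
    return Counter(votes).most_common(1)[0][0]
```

A point deep inside a dense cluster ends up with a large k, while an isolated point falls back to its single nearest neighbor.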
4. Zhang, Shichao, Xuelong Li, Ming Zong, Xiaofeng Zhu, and Debo Cheng. "Learning k for kNN Classification." ACM Transactions on Intelligent Systems and Technology 8, no. 3 (2017): 1–19. http://dx.doi.org/10.1145/2990508.
5. Khairina, Nurul, Theofil Tri Saputra Sibarani, Rizki Muliono, Zulfikar Sembiring, and Muhathir Muhathir. "Identification of Pneumonia using The K-Nearest Neighbors Method using HOG Fitur Feature Extraction." JOURNAL OF INFORMATICS AND TELECOMMUNICATION ENGINEERING 5, no. 2 (2022): 562–68. http://dx.doi.org/10.31289/jite.v5i2.6216.

Abstract:
Pneumonia is a wet lung disease. Pneumonia is generally caused by viruses, bacteria or fungi. Not infrequently Pneumonia can cause death. The K-Nearest Neighbors method is a classification method that uses the majority value from the closest k value category. At this time people are not too worried about pneumonia because this pneumonia has symptoms like a normal cough. However, fast and accurate information from health experts is also very necessary so that pneumonia symptoms can be recognized early and how to deal with them can also be done faster. In this study, researchers will diagnose pn…
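The majority-vote rule this abstract describes is the core of plain kNN and fits in a few lines. This is a generic sketch: the paper's actual pipeline first extracts HOG features from the images before applying it:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=5):
    # Plain kNN: take the majority label among the k nearest training points.
    nearest = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]
```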
6. Raeisi Shahraki, Hadi, Saeedeh Pourahmad, and Najaf Zare. "K Important Neighbors: A Novel Approach to Binary Classification in High Dimensional Data." BioMed Research International 2017 (2017): 1–9. http://dx.doi.org/10.1155/2017/7560807.

Abstract:
K nearest neighbors (KNN) are known as one of the simplest nonparametric classifiers but in high dimensional setting accuracy of KNN are affected by nuisance features. In this study, we proposed the K important neighbors (KIN) as a novel approach for binary classification in high dimensional problems. To avoid the curse of dimensionality, we implemented smoothly clipped absolute deviation (SCAD) logistic regression at the initial stage and considered the importance of each feature in construction of dissimilarity measure with imposing features contribution as a function of SCAD coefficients on…
7. Yang, Zhida, Peng Liu, and Yi Yang. "Convective/Stratiform Precipitation Classification Using Ground-Based Doppler Radar Data Based on the K-Nearest Neighbor Algorithm." Remote Sensing 11, no. 19 (2019): 2277. http://dx.doi.org/10.3390/rs11192277.

Abstract:
Stratiform and convective rain types are associated with different cloud physical processes, vertical structures, thermodynamic influences and precipitation types. Distinguishing convective and stratiform systems is beneficial to meteorology research and weather forecasting. However, there is no clear boundary between stratiform and convective precipitation. In this study, a machine learning algorithm, K-nearest neighbor (KNN), is used to classify precipitation types. Six Doppler radar (WSR-98D/SA) data sets from Jiangsu, Guangzhou and Anhui Provinces in China were used as training and classif…
8. Lu, Jiaxuan, and Hyukjun Gweon. "Random k conditional nearest neighbor for high-dimensional data." PeerJ Computer Science 11 (January 24, 2025): e2497. https://doi.org/10.7717/peerj-cs.2497.

Abstract:
The k nearest neighbor (kNN) approach is a simple and effective algorithm for classification and a number of variants have been proposed based on the kNN algorithm. One of the limitations of kNN is that the method may be less effective when data contains many noisy features due to their non-informative influence in calculating distance. Additionally, information derived from nearest neighbors may be less meaningful in high-dimensional data. To address the limitation of nearest-neighbor based approaches in high-dimensional data, we propose to extend the k conditional nearest neighbor (kCNN) met…
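The paper's randomized extension of kCNN is defined in the article itself; the general random-subspace idea it builds on — vote a base learner over random feature subsets so that noisy features influence only some of the votes — can be sketched as follows, using plain kNN as the base learner for simplicity:

```python
import numpy as np
from collections import Counter

def random_subspace_knn(X_train, y_train, x, k=3, n_estimators=25,
                        n_feats=3, seed=0):
    """Random-subspace ensemble sketch: vote plain kNN over random feature
    subsets, diluting the influence of noisy features in high dimensions."""
    rng = np.random.default_rng(seed)
    votes = []
    for _ in range(n_estimators):
        feats = rng.choice(X_train.shape[1], size=n_feats, replace=False)
        d = np.linalg.norm(X_train[:, feats] - x[feats], axis=1)
        nearest = np.argsort(d)[:k]
        votes.append(Counter(y_train[nearest].tolist()).most_common(1)[0][0])
    return Counter(votes).most_common(1)[0][0]
```

Each base vote sees only `n_feats` of the coordinates, so a handful of noisy features cannot dominate every distance computation.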
9. Su, Yixin, and Sheng-Uei Guan. "Density and Distance Based KNN Approach to Classification." International Journal of Applied Evolutionary Computation 7, no. 2 (2016): 45–60. http://dx.doi.org/10.4018/ijaec.2016040103.

Abstract:
KNN algorithm is a simple and efficient algorithm developed to solve classification problems. However, it encounters problems when classifying datasets with non-uniform density distributions. The existing KNN voting mechanism may lose essential information by considering majority only and get degraded performance when a dataset has uneven distribution. The other drawback comes from the way that KNN treat all the participating candidates equally when judging upon one test datum. To overcome the weaknesses of KNN, a Region of Influence Based KNN (RI-KNN) is proposed. RI-KNN computes for each tra…
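RI-KNN itself computes a region of influence per training point (the details are in the paper); the simpler remedy for "treating all participating candidates equally" that the abstract alludes to is distance-weighted voting, sketched here as a generic illustration:

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=5):
    # Weight each of the k nearest neighbors by inverse distance,
    # so closer neighbors count for more than farther ones.
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    scores = {}
    for i in nearest:
        scores[y_train[i]] = scores.get(y_train[i], 0.0) + 1.0 / (d[i] + 1e-12)
    return max(scores, key=scores.get)
```

A single very close neighbor can now outvote two distant ones, which plain majority voting cannot express.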
10. Ganatra, Dhimant. "Improving classification accuracy: The KNN approach." International Journal of Advanced Trends in Computer Science and Engineering 9, no. 4 (2020): 6147–50. http://dx.doi.org/10.30534/ijatcse/2020/287942020.

Dissertations / Theses on the topic "KNN classification"

1. Mestre, Ricardo Jorge Palheira. "Improvements on the KNN classifier." Master's thesis, Faculdade de Ciências e Tecnologia, 2013. http://hdl.handle.net/10362/10923.

Abstract:
Dissertation submitted for the degree of Master in Computer Engineering. Object classification is an important area within artificial intelligence, and its applications extend to various fields, whether or not within the sciences. Among classifiers, the K-nearest neighbor (KNN) is one of the simplest and most accurate, especially in environments where the data distribution is unknown or apparently not parameterizable. This algorithm assigns to the element being classified the majority class among its K nearest neighbors. According to the original algorithm, this classification implies the…
2. Hanson, Sarah Elizabeth. "Classification of ADHD Using Heterogeneity Classes and Attention Network Task Timing." Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/83610.

Abstract:
Throughout the 1990s ADHD diagnosis and medication rates have increased rapidly, and this trend continues today. These sharp increases have been met with both public and clinical criticism, detractors stating over-diagnosis is a problem and healthy children are being unnecessarily medicated and labeled as disabled. However, others say that ADHD is being under-diagnosed in some populations. Critics often state that there are multiple factors that introduce subjectivity into the diagnosis process, meaning that a final diagnosis may be influenced by more than the desire to protect a patient's wel…
3. Bel Haj Ali, Wafa. "Minimisation de fonctions de perte calibrée pour la classification des images." PhD thesis, Université Nice Sophia Antipolis, 2013. http://tel.archives-ouvertes.fr/tel-00934062.

Abstract (translated from French):
Image classification is today a challenge of great magnitude, since it concerns on the one hand the millions or even billions of images found everywhere on the web, and on the other hand images for critical real-time applications. This classification generally relies on learning methods and on classifiers that must deliver both accuracy and speed. These learning problems today affect a large number of application domains, namely the web (profiling, targeting, social networks, search engines), "Big D…
4. Lopez Marcano, Juan L. "Classification of ADHD and non-ADHD Using AR Models and Machine Learning Algorithms." Thesis, Virginia Tech, 2016. http://hdl.handle.net/10919/73688.

Abstract:
As of 2016, diagnosis of ADHD in the US is controversial. Diagnosis of ADHD is based on subjective observations, and treatment is usually done through stimulants, which can have negative side-effects in the long term. Evidence shows that the probability of diagnosing a child with ADHD not only depends on the observations of parents, teachers, and behavioral scientists, but also on state-level special education policies. In light of these facts, unbiased, quantitative methods are needed for the diagnosis of ADHD. This problem has been tackled since the 1990s, and has resulted in methods that ha…
5. Li, Sichu. "Application of Machine Learning Techniques for Real-time Classification of Sensor Array Data." ScholarWorks@UNO, 2009. http://scholarworks.uno.edu/td/913.

Abstract:
There is a significant need to identify approaches for classifying chemical sensor array data with high success rates that would enhance sensor detection capabilities. The present study attempts to fill this need by investigating six machine learning methods to classify a dataset collected using a chemical sensor array: K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Classification and Regression Trees (CART), Random Forest (RF), Naïve Bayes Classifier (NB), and Principal Component Regression (PCR). A total of 10 predictors that are associated with the response from 10 sensor channels…
6. Do, Cao Tri. "Apprentissage de métrique temporelle multi-modale et multi-échelle pour la classification robuste de séries temporelles par plus proches voisins." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAM028/document.

Abstract (translated from French):
Defining a metric between time series is an important element for many tasks in data analysis and data mining, such as clustering, classification, or prediction. Time series naturally present different characteristics, which we call modalities, on which they can be compared, such as their values, their shapes, or their frequency content. These characteristics can be expressed with variable delays and at different temporal granularities or localizations, expressed globally or locally. Combining…
7. Villa Medina, Joe Luis. "Reliability of classification and prediction in k-nearest neighbours." Doctoral thesis, Universitat Rovira i Virgili, 2013. http://hdl.handle.net/10803/127108.

Abstract (translated from Spanish):
In this doctoral thesis, the calculation of classification reliability and prediction reliability has been developed using the k-nearest neighbours (kNN) method and bootstrap-based resampling strategies. In addition, two new classification methods have been developed, Probabilistic Bootstrap k-Nearest Neighbours (PBkNN) and Bagged k-Nearest Neighbours (BaggedkNN), as well as a new prediction method, the Direct Orthogonalization kNN (DOkNN). In all cases, the results obtained with the new methods have been comparable to or better than those obtained usi…
8. Ozsakabasi, Feray. "Classification Of Forest Areas By K Nearest Neighbor Method: Case Study, Antalya." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609548/index.pdf.

Abstract:
Among the various remote sensing methods that can be used to map forest areas, the K Nearest Neighbor (KNN) supervised classification method is becoming increasingly popular for creating forest inventories in some countries. In this study, the utility of the KNN algorithm is evaluated for forest/non-forest/water stratification. Antalya is selected as the study area. The data used are composed of Landsat TM and Landsat ETM satellite images, acquired in 1987 and 2002, respectively, SRTM 90 meters digital elevation model (DEM) and land use data from the year 2003. The accuracies of different modi…
9. Joseph, Katherine Amanda. "Comparison of Segment and Pixel Based Non-Parametric Classification of Land Cover in the Amazon Region of Brazil Using Multitemporal Landsat TM/ETM+ Imagery." Thesis, Virginia Tech, 2005. http://hdl.handle.net/10919/32802.

Abstract:
This study evaluated the ability of segment-based classification paired with non-parametric methods (CART and kNN) to classify a chronosequence of Landsat TM/ETM+ imagery spanning from 1992 to 2002 within the state of Rondônia, Brazil. Pixel-based classification was also implemented for comparison. Interannual multitemporal composites were used in each classification in an attempt to increase the separation of primary forest, cleared, and re-vegetated classes within a given year. The kNN and CART classification methods, with the integration of multitemporal data, performed equally well wit…
10. Buani, Bruna Elisa Zanchetta. "Aplicação da Lógica Fuzzy kNN e análises estatísticas para seleção de características e classificação de abelhas." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/3/3141/tde-10012011-085835/.

Abstract (translated from Portuguese):
This work proposes an alternative for the problem of classifying bee species, based on the implementation of an algorithm grounded in Geometric Morphometrics and in the study of the shapes of anatomical landmarks in images obtained from bee wings. The algorithm implemented for this purpose is based on the k-Nearest Neighbors (kNN) algorithm and on Fuzzy kNN (Fuzzy k-Nearest Neighbor) logic, applied to analyzed and selected data of two-dimensional points relating to the characteristics generated by anatomical landmarks. The study presented involves methods of selection and or…

Books on the topic "KNN classification"

1. Chōsakai, Kanagawa-ken Shokubutsushi. Kanagawa-ken shokubutsushi 1988. Kanagawa Kenritsu Hakubutsukan, 1988.

2. Zhongguo tu shu guan tu shu fen lei fa bian ji wei yuan hui. Zhongguo tu shu guan tu shu fen lei fa: Qi kan fen lei fa. Shu mu wen xian chu ban she, 1987.

3. Kankyōka, Shimane-ken (Japan) Shizen. Kaitei Shimane reddo dēta bukku 2013: Shimane-ken no zetsumetsu no osore no aru yasei shokubutsu: Shokubutsu-hen = Shimane red data book 2013. Shimane-ken Kankyō Seikatsubu Shizen Kankyōka, 2013.

4. Kankyōka, Shimane-ken (Japan) Shizen. Kaitei Shimane reddo dēta bukku 2014: Shimane-ken no zetsumetsu no osore no aru yasei dōbutsu: Dōbutsu hen = Shimane red data book 2014. Shimane-ken Kankyō Seikatsubu Shizen Kankyōka, 2014.

5. Rajaguru, Harikumar. Knn Classifier and K-Means Clustering for Robust Classification of Epilepsy from Eeg Signals: A Detailed Analysis. Anchor Academic Publishing, an imprint of Diplomica Verlag GmbH, 2017.

6. Scheffler, Harold W. Australian Kin Classification. Cambridge University Press, 2009.

7. Scheffler, Harold W. Australian Kin Classification. Cambridge University Press, 2011.

8. Scheffler, Harold W. Australian Kin Classification (Cambridge Studies in Social and Cultural Anthropology). Cambridge University Press, 2007.

9. Taisetsu ni shitai Nara-ken no yasei dōshokubutsu: Nara-kenban reddo dēta bukku: 2008. Nara-ken Nōrinbu Shinrin Hozenka, 2008.

10. "Zhongguo tu shu guan tu shu fen lei fa, qi kan fen lei biao" shi yong zhi nan. Beijing tu shu guan chu ban she, 1998.

Book chapters on the topic "KNN classification"

1. Kamath, Surekha. "Sleep Apnea Classification Using KNN." In Lecture Notes in Networks and Systems. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-5412-0_1.

2. Ishii, Naohiro, Tsuyoshi Murai, Takahiro Yamada, and Yongguang Bao. "Classification by Weighting, Similarity and kNN." In Intelligent Data Engineering and Automated Learning – IDEAL 2006. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11875581_7.

3. Guo, Gongde, Hui Wang, David Bell, Yaxin Bi, and Kieran Greer. "KNN Model-Based Approach in Classification." In On The Move to Meaningful Internet Systems 2003: CoopIS, DOA, and ODBASE. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39964-3_62.

4. Zhuang, Jiaxin, Jiabin Cai, Ruixuan Wang, Jianguo Zhang, and Wei-Shi Zheng. "Deep kNN for Medical Image Classification." In Medical Image Computing and Computer Assisted Intervention – MICCAI 2020. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59710-8_13.

5. Ishii, Naohiro, Yuichi Morioka, Hiroaki Kimura, and Yongguang Bao. "Classification by Multiple Reducts-kNN with Confidence." In Intelligent Data Engineering and Automated Learning – IDEAL 2010. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15381-5_12.

6. Ashai, Mariyam, Rhea Gautam Mukherjee, Sanjana P. Mundharikar, Vinayak Dev Kuanr, and R. Harikrishnan. "Classification of Astronomical Objects using KNN Algorithm." In Smart Intelligent Computing and Applications, Volume 1. Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9669-5_34.

7. Gaće, Marin, Tomislav Galba, Alfonzo Baumgartner, and Časlav Livada. "Dataset Ratio Influence on kNN Classification Results." In Lecture Notes in Networks and Systems. Springer Nature Switzerland, 2024. https://doi.org/10.1007/978-3-031-80597-4_6.

8. Costa, Bruno G., Jean Carlos Arouche Freire, Hamilton S. Cavalcante, et al. "Fault Classification on Transmission Lines Using KNN-DTW." In Computational Science and Its Applications – ICCSA 2017. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-62392-4_13.

9. Beryl Princess, P. Joyce, Salaja Silas, and Elijah Blessing Rajsingh. "Classification of Road Accidents Using SVM and KNN." In Advances in Intelligent Systems and Computing. Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-3514-7_3.

10. Orczyk, Tomasz, Rafal Doroz, and Piotr Porwik. "Combined kNN Classifier for Classification of Incomplete Data." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-19738-4_3.

Conference papers on the topic "KNN classification"

1. Santhosh Kumar, B. N., and G. N. K. Suresh Babu. "Ocular Disease Identification and Classification Using LBP - KNN." In 2024 International Conference on Knowledge Engineering and Communication Systems (ICKECS). IEEE, 2024. http://dx.doi.org/10.1109/ickecs61492.2024.10617177.

2. Musgrave, John, and Anca Ralescu. "kNN Classification of Malware Data Dependency Graph Features." In NAECON 2024 - IEEE National Aerospace and Electronics Conference. IEEE, 2024. http://dx.doi.org/10.1109/naecon61878.2024.10670673.

3. Yamsani, Nagendar, Deepak Banerjee, Suraj Thakur, and Mohammed I. Habelalmateen. "Revolutionizing Gemstone Classification with Hybrid CNN-KNN Models." In 2025 IEEE International Conference on Interdisciplinary Approaches in Technology and Management for Social Innovation (IATMSI). IEEE, 2025. https://doi.org/10.1109/iatmsi64286.2025.10985167.

4. Camargo, Catalina, Elsa Cruz, Camila Gordillo, Angie Rangel, and Manuel Franco. "Comparison of KNN and SVM Methods for Melanoma Classification." In 2024 3rd International Congress of Biomedical Engineering and Bioengineering (CIIBBI). IEEE, 2024. https://doi.org/10.1109/ciibbi63846.2024.10784961.

5. Banerjee, Deepak. "Robust Car Damage Classification using CNN and KNN Algorithms." In 2024 Second International Conference on Intelligent Cyber Physical Systems and Internet of Things (ICoICI). IEEE, 2024. http://dx.doi.org/10.1109/icoici62503.2024.10696430.

6. Deivasikamani, Ganeshkumar, Akshay C, Ananthakrishnan T, and Rohith C. Manoj. "Covid Cough Classification using KNN Classification Algorithm." In 2022 International Conference on Applied Artificial Intelligence and Computing (ICAAIC). IEEE, 2022. http://dx.doi.org/10.1109/icaaic53929.2022.9793198.

7. Wang, Zonghu, and Zhijing Liu. "Graph-based KNN text classification." In 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD). IEEE, 2010. http://dx.doi.org/10.1109/fskd.2010.5569866.

8. Pichardo-Morales, Francisco D., Marco A. Acevedo-Mosqueda, and Sandra L. Gomez-Coronel. "Classification of Gunshots with KNN Classifier." In EATIS '18: Euro American Conference on Telematics and Information Systems. ACM, 2018. http://dx.doi.org/10.1145/3293614.3293656.

9. Anagnostou, Panagiotis, Petros Barbas, Aristidis G. Vrahatis, and Sotiris K. Tasoulis. "Approximate kNN Classification for Biomedical Data." In 2020 IEEE International Conference on Big Data (Big Data). IEEE, 2020. http://dx.doi.org/10.1109/bigdata50022.2020.9378126.

10. Thejaswini, B. M., T. Y. Satheesha, and Sathish Bhairannawar. "EEG Classification Using Modified KNN Algorithm." In 2023 International Conference on Applied Intelligence and Sustainable Computing (ICAISC). IEEE, 2023. http://dx.doi.org/10.1109/icaisc58445.2023.10200104.

Reports on the topic "KNN classification"

1. Rodriguez, Ian. Hybrid SVM-KNN Model for Urban Air Quality Classification in Jakarta. ResearchHub Technologies, Inc., 2025. https://doi.org/10.55277/researchhub.l4t0ynwl.

2. Ferdaus, Md Meftahul, Mahdi Abdelguerfi, Elias Ioup, et al. KANICE: Kolmogorov-Arnold networks with interactive convolutional elements. Engineer Research and Development Center (U.S.), 2025. https://doi.org/10.21079/11681/49791.

Abstract:
We introduce KANICE, a novel neural architecture that combines Convolutional Neural Networks (CNNs) with Kolmogorov-Arnold Network (KAN) principles. KANICE integrates Interactive Convolutional Blocks (ICBs) and KAN linear layers into a CNN framework. This leverages KANs' universal approximation capabilities and ICBs' adaptive feature learning. KANICE captures complex, non-linear data relationships while enabling dynamic, context-dependent feature extraction based on the Kolmogorov-Arnold representation theorem. We evaluated KANICE on four datasets: MNIST, Fashion-MNIST, EMNIST, and SVHN, compa…