Journal articles on the topic 'Ensemble Based Classification'

Consult the top 50 journal articles for your research on the topic 'Ensemble Based Classification.'

1

Gui, Wenli, Liping Jing, Liu Yang, and Jian Yu. "Unsupervised Cross-Language Classification with Stratified Sampling-Based Cluster Ensemble." International Journal of Machine Learning and Computing 5, no. 3 (June 2015): 165–71. http://dx.doi.org/10.7763/ijmlc.2015.v5.502.

2

Jurek, Anna, Yaxin Bi, Shengli Wu, and Chris Nugent. "A survey of commonly used ensemble-based classification techniques." Knowledge Engineering Review 29, no. 5 (May 3, 2013): 551–81. http://dx.doi.org/10.1017/s0269888913000155.

Abstract:
The combination of multiple classifiers, commonly referred to as a classifier ensemble, has previously demonstrated the ability to improve classification accuracy in many application domains. As a result, this area has attracted a significant amount of research in recent years. The aim of this paper has therefore been to provide a state-of-the-art review of the most well-known ensemble techniques, with the main focus on bagging, boosting and stacking, and to trace the recent attempts which have been made to improve their performance. Within this paper, we present and compare an updated view on the different modifications of these techniques which have specifically aimed to address some of their drawbacks, namely the low-diversity problem in bagging or the over-fitting problem in boosting. In addition, we provide a review of different ensemble selection methods based on both static and dynamic approaches. We present some new directions which have been adopted in the area of classifier ensembles from a range of recently published studies. In order to provide a deeper insight into the ensembles themselves, a range of existing theoretical studies have been reviewed in the paper.
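
For readers who want a concrete starting point, the sketch below (not taken from the survey; synthetic data and off-the-shelf scikit-learn estimators) shows the three families the review focuses on side by side: bagging, boosting and stacking.

```python
# Not from the survey: a minimal scikit-learn comparison of bagging, boosting
# and stacking on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

ensembles = {
    # Bagging: train one tree per bootstrap replicate of the data, then vote.
    "bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0),
    # Boosting: reweight hard examples and combine weak learners sequentially.
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Stacking: a meta-learner combines the base learners' predictions.
    "stacking": StackingClassifier(
        estimators=[("tree", DecisionTreeClassifier(max_depth=3)), ("nb", GaussianNB())],
        final_estimator=LogisticRegression(),
    ),
}

for name, model in ensembles.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:8s} mean accuracy = {scores.mean():.3f}")
```
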
3

Kilimci, Zeynep H., and Selim Akyokus. "Deep Learning- and Word Embedding-Based Heterogeneous Classifier Ensembles for Text Classification." Complexity 2018 (October 9, 2018): 1–10. http://dx.doi.org/10.1155/2018/7130146.

Abstract:
The use of ensemble learning, deep learning, and effective document representation methods is currently one of the most common trends for improving the overall accuracy of a text classification/categorization system. Ensemble learning is an approach to raise the overall accuracy of a classification system by utilizing multiple classifiers. Deep learning-based methods provide better results in many applications when compared with the other conventional machine learning algorithms. Word embeddings enable representation of words learned from a corpus as vectors that provide a mapping of words with similar meaning to have similar representation. In this study, we use different document representations with the benefit of word embeddings and an ensemble of base classifiers for text classification. The ensemble of base classifiers includes traditional machine learning algorithms such as naïve Bayes, support vector machine, and random forest and a deep learning-based conventional network classifier. We analysed the classification accuracy of different document representations by employing an ensemble of classifiers on eight different datasets. Experimental results demonstrate that the usage of heterogeneous ensembles together with deep learning methods and word embeddings enhances the classification performance of texts.
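
A minimal sketch of the heterogeneous-ensemble idea is shown below, assuming a tiny made-up corpus and TF-IDF features as a stand-in for the word-embedding representations studied in the paper; naïve Bayes, a linear SVM and a random forest vote on the final label.

```python
# Hedged sketch: a heterogeneous text ensemble (naive Bayes + linear SVM +
# random forest) combined by majority voting. The toy documents and labels are
# made up, and TF-IDF stands in for the word-embedding representations.
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

texts = ["good camera, sharp photos", "battery died quickly",
         "great value for the price", "arrived broken and late"]
labels = [1, 0, 1, 0]  # toy stand-in for a real labelled corpus

model = make_pipeline(
    TfidfVectorizer(),
    VotingClassifier(
        estimators=[
            ("nb", MultinomialNB()),
            ("svm", SVC(kernel="linear")),
            ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ],
        voting="hard",  # majority vote across the heterogeneous members
    ),
)
model.fit(texts, labels)
print(model.predict(["the screen is great but the battery is poor"]))
```
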
4

Wang, Bo, Yu Kai Yao, Xiao Ping Wang, and Xiao Yun Chen. "PB-SVM Ensemble: A SVM Ensemble Algorithm Based on SVM." Applied Mechanics and Materials 701-702 (December 2014): 58–62. http://dx.doi.org/10.4028/www.scientific.net/amm.701-702.58.

Abstract:
As one of the most popular and effective classification algorithms, the Support Vector Machine (SVM) has attracted much attention in recent years. The classifier ensemble is a research direction in machine learning and statistics; it often gives higher classification accuracy than a single classifier. This paper proposes a new ensemble algorithm based on SVM. The proposed classification algorithm, PB-SVM Ensemble, consists of SVM classifiers produced by PCAenSVM and fifty classifiers trained using Bagging; the results are combined to make the final decision on the testing set using majority voting. The performance of PB-SVM Ensemble is evaluated on six datasets drawn from the UCI repository, Statlog, or other well-known research. The results of the experiment are compared with LibSVM, PCAenSVM and Bagging. PB-SVM Ensemble outperforms the other three algorithms in classification accuracy and at the same time keeps a higher confidence of accuracy than Bagging.
5

Alsawalqah, Hamad, Neveen Hijazi, Mohammed Eshtay, Hossam Faris, Ahmed Al Radaideh, Ibrahim Aljarah, and Yazan Alshamaileh. "Software Defect Prediction Using Heterogeneous Ensemble Classification Based on Segmented Patterns." Applied Sciences 10, no. 5 (March 3, 2020): 1745. http://dx.doi.org/10.3390/app10051745.

Abstract:
Software defect prediction is a promising approach aiming to improve software quality and testing efficiency by providing timely identification of defect-prone software modules before the actual testing process begins. These prediction results help software developers to effectively allocate their limited resources to the modules that are more prone to defects. In this paper, a hybrid heterogeneous ensemble approach is proposed for the purpose of software defect prediction. Heterogeneous ensembles consist of a set of classifiers built with different base learning methods, each of which has its own strengths and weaknesses. The main idea of the proposed approach is to develop expert and robust heterogeneous classification models. Two versions of the proposed approach are developed and evaluated experimentally. The first is based on simple classifiers, and the second is based on ensemble ones. For evaluation, 21 publicly available benchmark datasets are selected to conduct the experiments and benchmark the proposed approach. The evaluation results show the superiority of the ensemble version over other well-regarded basic and ensemble classifiers.
6

KO, ALBERT HUNG-REN, ROBERT SABOURIN, and ALCEU DE SOUZA BRITTO. "COMPOUND DIVERSITY FUNCTIONS FOR ENSEMBLE SELECTION." International Journal of Pattern Recognition and Artificial Intelligence 23, no. 04 (June 2009): 659–86. http://dx.doi.org/10.1142/s021800140900734x.

Abstract:
An effective way to improve a classification method's performance is to create ensembles of classifiers. Two elements are believed to be important in constructing an ensemble: (a) the performance of each individual classifier; and (b) diversity among the classifiers. Nevertheless, most works based on diversity suggest that there exists only weak correlation between classifier performance and ensemble accuracy. We propose compound diversity functions which combine the diversities with the performance of each individual classifier, and show that there is a strong correlation between the proposed functions and ensemble accuracy. Calculation of the correlations with different ensemble creation methods, different problems and different classification algorithms on 0.624 million ensembles suggests that most compound diversity functions are better than traditional diversity measures. The population-based Genetic Algorithm was used to search for the best ensembles on a handwritten numerals recognition problem and to evaluate 42.24 million ensembles. The statistical results indicate that compound diversity functions perform better than traditional diversity measures, and are helpful in selecting the best ensembles.
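
The notion of a compound diversity function can be illustrated with a small sketch: the score below combines mean member accuracy with mean pairwise disagreement, where the particular combination (a simple product) is an illustrative choice rather than one of the paper's specific functions.

```python
# Illustrative compound diversity score: mean member accuracy combined with mean
# pairwise disagreement (the product is an illustrative choice, not one of the
# paper's specific compound functions).
from itertools import combinations

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=15, random_state=1)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=1)

# A small pool of trees made diverse through bootstrap resampling.
rng = np.random.default_rng(1)
members = []
for _ in range(7):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    members.append(DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx]))

preds = np.array([m.predict(X_val) for m in members])   # shape (n_members, n_val)
accuracies = (preds == y_val).mean(axis=1)

# Pairwise disagreement: fraction of validation points on which two members differ.
disagreement = np.mean([np.mean(preds[i] != preds[j])
                        for i, j in combinations(range(len(members)), 2)])

compound = accuracies.mean() * disagreement
print(f"mean accuracy={accuracies.mean():.3f}  disagreement={disagreement:.3f}  "
      f"compound score={compound:.3f}")
```
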
7

Alizadeh Moghaddam, S. H., M. Mokhtarzade, and S. A. Alizadeh Moghaddam. "A NEW MULTIPLE CLASSIFIER SYSTEM BASED ON A PSO ALGORITHM FOR THE CLASSIFICATION OF HYPERSPECTRAL IMAGES." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4/W18 (October 18, 2019): 71–75. http://dx.doi.org/10.5194/isprs-archives-xlii-4-w18-71-2019.

Abstract:
Multiple classifier systems (MCSs) have shown great performance for the classification of hyperspectral images. The requirements for a successful MCS are 1) diversity between the ensemble members and 2) good classification accuracy of each member. In this paper, we develop a new MCS method based on a particle swarm optimization (PSO) algorithm. Firstly, for each member of the proposed method, called PSO-MCS, PSO identifies a subset of the spectral bands with a high J2 value, which is a measure of class separability. Then, an SVM classifier is used to classify the input image, applying the features selected for that member. Finally, the classification results of all the members are integrated using a majority voting strategy. Having the benefit of the PSO algorithm, PSO-MCS selects appropriate features. In addition, because different features are selected in different runs of PSO, diversity between the members is provided. Experimental results on an AVIRIS Indian Pines image show the superiority of the proposed method over its competitor, the random feature selection method.
8

Hu, Ruihan, Songbin Zhou, Yisen Liu, and Zhiri Tang. "Margin-Based Pareto Ensemble Pruning: An Ensemble Pruning Algorithm That Learns to Search Optimized Ensembles." Computational Intelligence and Neuroscience 2019 (June 3, 2019): 1–12. http://dx.doi.org/10.1155/2019/7560872.

Abstract:
The ensemble pruning system is an effective machine learning framework that combines several learners as experts to classify a test set. Generally, ensemble pruning systems aim to define a region of competence based on the validation set to select the most competent ensembles from the ensemble pool with respect to the test set. However, the size of the ensemble pool is usually fixed, and the performance of an ensemble pool heavily depends on the definition of the region of competence. In this paper, a dynamic pruning framework called margin-based Pareto ensemble pruning is proposed for ensemble pruning systems. The framework explores the optimized ensemble pool size during the overproduction stage and finetunes the experts during the pruning stage. The Pareto optimization algorithm is used to explore the size of the overproduction ensemble pool that can result in better performance. Considering the information entropy of the learners in the indecision region, the marginal criterion for each learner in the ensemble pool is calculated using margin criterion pruning, which prunes the experts with respect to the test set. The effectiveness of the proposed method for classification tasks is assessed using datasets. The results show that margin-based Pareto ensemble pruning can achieve smaller ensemble sizes and better classification performance in most datasets when compared with state-of-the-art models.
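
As a rough illustration of pruning an over-produced pool, the sketch below uses plain greedy reduce-error pruning on a validation set; it stands in for, and is much simpler than, the margin-based Pareto procedure proposed in the paper.

```python
# Much-simplified ensemble pruning sketch: greedy reduce-error pruning on a
# validation set, standing in for the paper's margin-based Pareto procedure.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=800, n_features=20, random_state=2)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=2)

# Overproduction stage: a pool of bootstrap-trained trees.
rng = np.random.default_rng(2)
pool = []
for _ in range(30):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    pool.append(DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr[idx], y_tr[idx]))
votes = np.array([m.predict(X_val) for m in pool])          # (n_members, n_val)

def ensemble_accuracy(member_ids):
    majority = (votes[member_ids].mean(axis=0) >= 0.5).astype(int)
    return (majority == y_val).mean()

# Pruning stage: greedily add the member that most improves validation accuracy.
selected, remaining = [], list(range(len(pool)))
while remaining:
    best = max(remaining, key=lambda i: ensemble_accuracy(selected + [i]))
    if selected and ensemble_accuracy(selected + [best]) <= ensemble_accuracy(selected):
        break                                               # stop once accuracy no longer improves
    selected.append(best)
    remaining.remove(best)

print("pruned ensemble size:", len(selected))
print("full pool accuracy  :", ensemble_accuracy(list(range(len(pool)))))
print("pruned accuracy     :", ensemble_accuracy(selected))
```
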
9

Onan, Aytug. "Hybrid supervised clustering based ensemble scheme for text classification." Kybernetes 46, no. 2 (February 6, 2017): 330–48. http://dx.doi.org/10.1108/k-10-2016-0300.

Abstract:
Purpose: The immense quantity of available unstructured text documents serves as one of the largest sources of information. Text classification can be an essential task for many purposes in information retrieval, such as document organization, text filtering and sentiment analysis. Ensemble learning has been extensively studied to construct efficient text classification schemes with higher predictive performance and generalization ability. The purpose of this paper is to provide diversity among the classification algorithms of the ensemble, which is a key issue in ensemble design.

Design/methodology/approach: An ensemble scheme based on hybrid supervised clustering is presented for text classification. In the presented scheme, supervised hybrid clustering, which is based on the cuckoo search algorithm and k-means, is introduced to partition the data samples of each class into clusters so that training subsets with higher diversities can be provided. Each classifier is trained on the diversified training subsets and the predictions of individual classifiers are combined by the majority voting rule. The predictive performance of the proposed classifier ensemble is compared to conventional classification algorithms (such as naïve Bayes, logistic regression, support vector machines and the C4.5 algorithm) and ensemble learning methods (such as AdaBoost, bagging and random subspace) using 11 text benchmarks.

Findings: The experimental results indicate that the presented classifier ensemble outperforms the conventional classification algorithms and ensemble learning methods for text classification.

Originality/value: The presented ensemble scheme is the first to use supervised clustering to obtain a diverse ensemble for text classification.
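
A simplified sketch of the scheme is given below: plain k-means stands in for the paper's cuckoo-search/k-means hybrid, each training subset collects one cluster per class, and the members are combined by majority voting.

```python
# Hedged sketch of a supervised-clustering-based ensemble: each class is split
# into k clusters (plain k-means stands in for the cuckoo-search/k-means hybrid),
# the i-th clusters of all classes form the i-th training subset, and one
# classifier per subset is combined by majority voting.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

k = 3
X, y = make_classification(n_samples=900, n_features=20, n_informative=8, random_state=3)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=3)

# Cluster the samples of each class separately, then assemble k diverse subsets.
subsets = {i: ([], []) for i in range(k)}
for label in np.unique(y_tr):
    Xc = X_tr[y_tr == label]
    cluster_ids = KMeans(n_clusters=k, n_init=10, random_state=3).fit_predict(Xc)
    for i in range(k):
        subsets[i][0].append(Xc[cluster_ids == i])
        subsets[i][1].append(np.full((cluster_ids == i).sum(), label))

members = [GaussianNB().fit(np.vstack(xs), np.concatenate(ys)) for xs, ys in subsets.values()]

# Majority vote over the members trained on the diversified subsets.
votes = np.array([m.predict(X_te) for m in members])
majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
print("ensemble accuracy:", (majority == y_te).mean())
```
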
10

Ku Abd. Rahim, Ku, I. Elamvazuthi, Lila Izhar, and Genci Capi. "Classification of Human Daily Activities Using Ensemble Methods Based on Smartphone Inertial Sensors." Sensors 18, no. 12 (November 26, 2018): 4132. http://dx.doi.org/10.3390/s18124132.

Abstract:
Increasing interest in analyzing human gait using various wearable sensors, which is known as Human Activity Recognition (HAR), can be found in recent research. Sensors such as accelerometers and gyroscopes are widely used in HAR. Recently, high interest has been shown in the use of wearable sensors in numerous applications such as rehabilitation, computer games, animation, filmmaking, and biomechanics. In this paper, classification of human daily activities using Ensemble Methods based on data acquired from smartphone inertial sensors involving about 30 subjects with six different activities is discussed. The six daily activities are walking, walking upstairs, walking downstairs, sitting, standing and lying. It involved three stages of activity recognition; namely, data signal processing (filtering and segmentation), feature extraction and classification. Five types of ensemble classifiers utilized are Bagging, Adaboost, Rotation forest, Ensembles of nested dichotomies (END) and Random subspace. These ensemble classifiers employed Support vector machine (SVM) and Random forest (RF) as the base learners of the ensemble classifiers. The data classification is evaluated with the holdout and 10-fold cross-validation evaluation methods. The performance of each human daily activity was measured in terms of precision, recall, F-measure, and receiver operating characteristic (ROC) curve. In addition, the performance is also measured based on the comparison of overall accuracy rate of classification between different ensemble classifiers and base learners. It was observed that overall, SVM produced better accuracy rate with 99.22% compared to RF with 97.91% based on a random subspace ensemble classifier.
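
The core comparison can be sketched as follows, with synthetic features standing in for the smartphone inertial-sensor features: a random-subspace ensemble is built on either an SVM or a random-forest base learner and scored with 10-fold cross-validation.

```python
# Hedged sketch of the reported comparison: a random-subspace ensemble built on
# either SVM or random-forest base learners, scored with 10-fold cross-validation.
# Synthetic features stand in for the smartphone inertial-sensor features.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=40, n_informative=12,
                           n_classes=6, n_clusters_per_class=1, random_state=4)

base_learners = {"SVM": SVC(kernel="rbf", gamma="scale"),
                 "RF": RandomForestClassifier(n_estimators=50, random_state=4)}

for name, base in base_learners.items():
    # bootstrap=False plus max_features<1.0 turns BaggingClassifier into a
    # random-subspace ensemble: each member sees a random subset of the features.
    subspace = BaggingClassifier(base, n_estimators=10, bootstrap=False,
                                 max_features=0.5, random_state=4)
    scores = cross_val_score(subspace, X, y, cv=10)
    print(f"random subspace + {name}: mean accuracy = {scores.mean():.3f}")
```
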
11

Yıldırım, Pelin, Ulaş K. Birant, and Derya Birant. "EBOC: Ensemble-Based Ordinal Classification in Transportation." Journal of Advanced Transportation 2019 (March 24, 2019): 1–17. http://dx.doi.org/10.1155/2019/7482138.

Abstract:
Learning the latent patterns of historical data in an efficient way to model the behaviour of a system is a major need for making right decisions. For this purpose, machine learning solutions have already made a promising mark in transportation, as well as in many other areas such as marketing, finance, education, and health. However, many classification algorithms in the literature assume that the target attribute values in the datasets are unordered, so they lose the inherent order between the class values. To overcome this problem, this study proposes a novel ensemble-based ordinal classification (EBOC) approach which suggests bagging and boosting (the AdaBoost algorithm) methods as a solution for the ordinal classification problem in the transportation sector. This article also compares the proposed EBOC approach with the ordinal class classifier and traditional tree-based classification algorithms (i.e., C4.5 decision tree, RandomTree, and REPTree) in terms of accuracy. The results indicate that the proposed EBOC approach achieves better classification performance than the conventional solutions.
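
A hedged sketch of ensemble-based ordinal classification in the Frank and Hall style is shown below: one bagged binary ensemble per threshold estimates P(y > k), and the ordinal class is recovered from the differences of those estimates (synthetic ordered labels stand in for transportation data).

```python
# Hedged sketch of ensemble-based ordinal classification: a bagged binary
# ensemble per threshold "y > k", with class probabilities recovered from the
# differences of the threshold estimates (Frank & Hall style decomposition).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with ordered labels 0 < 1 < 2 < 3 standing in for, e.g., severity levels.
X, y = make_classification(n_samples=1200, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=5)
classes = np.sort(np.unique(y_tr))

# One bagged-tree ensemble per threshold.
threshold_models = {}
for k in classes[:-1]:
    model = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=5)
    threshold_models[k] = model.fit(X_tr, (y_tr > k).astype(int))

# P(y > k) for every threshold, then P(y = k) = P(y > k-1) - P(y > k).
p_greater = np.column_stack([m.predict_proba(X_te)[:, 1] for m in threshold_models.values()])
p_greater = np.hstack([np.ones((len(X_te), 1)), p_greater, np.zeros((len(X_te), 1))])
p_class = p_greater[:, :-1] - p_greater[:, 1:]
pred = classes[np.argmax(p_class, axis=1)]
print("ordinal ensemble accuracy:", (pred == y_te).mean())
```
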
12

De Bock, Koen W., Kristof Coussement, and Dirk Van den Poel. "Ensemble classification based on generalized additive models." Computational Statistics & Data Analysis 54, no. 6 (June 2010): 1535–46. http://dx.doi.org/10.1016/j.csda.2009.12.013.

13

Duan, Rui, Sabah Mohammed, and Jinan Fiaidhi. "Ensemble Methods for ECG-Based Heartbeat Classification." International Journal of Control and Automation 12, no. 4 (April 30, 2019): 29–46. http://dx.doi.org/10.33832/ijca.2019.12.4.03.

14

Rathor, Sandeep, and R. S. Jadon. "Acoustic domain classification and recognition through ensemble based multilevel classification." Journal of Ambient Intelligence and Humanized Computing 10, no. 9 (October 11, 2018): 3617–27. http://dx.doi.org/10.1007/s12652-018-1087-6.

15

Osareh, Alireza, and Bita Shadgar. "An Efficient Ensemble Learning Method for Gene Microarray Classification." BioMed Research International 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/478410.

Abstract:
Gene microarray analysis and classification have demonstrated an effective way for the accurate diagnosis of diseases and cancers. However, it has also been revealed that the basic classification techniques have intrinsic drawbacks in achieving accurate gene classification and cancer diagnosis. On the other hand, classifier ensembles have received increasing attention in various applications. Here, we address the gene classification issue using the RotBoost ensemble methodology. This method is a combination of the Rotation Forest and AdaBoost techniques which in turn preserves both desirable features of an ensemble architecture, that is, accuracy and diversity. To select a concise subset of informative genes, 5 different feature selection algorithms are considered. To assess the efficiency of RotBoost, other non-ensemble/ensemble techniques including Decision Trees, Support Vector Machines, Rotation Forest, AdaBoost, and Bagging are also deployed. Experimental results have revealed that the combination of the fast correlation-based feature selection method with the ICA-based RotBoost ensemble is highly effective for gene classification. In fact, the proposed method can create ensemble classifiers which outperform not only the classifiers produced by conventional machine learning but also the classifiers generated by two widely used conventional ensemble learning methods, that is, Bagging and AdaBoost.
16

Alzami, Farrikh, Aries Jehan Tamamy, Ricardus Anggi Pramunendar, and Zaenal Arifin. "FUSION OF BAGGING BASED ENSEMBLE FRAMEWORK FOR EPILEPTIC SEIZURE CLASSIFICATION." Transmisi 22, no. 3 (August 17, 2020): 102–6. http://dx.doi.org/10.14710/transmisi.22.3.102-106.

Abstract:
Ensemble learning, especially for classification, has been widely applied and is successful in many domains, but few ensemble approaches have been used for the detection and classification of epilepsy from biomedical signals. Rather than a simple bagging ensemble framework, we propose a fusion bagging-based ensemble framework (FBEF) that uses three weak learners in each oracle; through fusion rules, the weak learners provide the predictors of the oracle. All oracle predictors are then combined through a trust factor to obtain a better prediction and classification. Compared to traditional ensemble bagging and single-learner ensemble bagging, our framework outperforms similar approaches on epileptic seizure classification, reaching 98.11±0.68, as well as on several real-world datasets.
17

Wei, Yan Yan, and Tao Sheng Li. "An Empirical Study on Feature Subsampling-Based Ensembles." Applied Mechanics and Materials 239-240 (December 2012): 848–52. http://dx.doi.org/10.4028/www.scientific.net/amm.239-240.848.

Abstract:
Feature subsampling techniques help to create diversity for classifier ensembles. In this article we investigate two feature subsampling-based ensemble methods - the Random Subspace Method (RSM) and the Rotation Forest Method (RFM) - to explore their usability with different learning algorithms and their robustness on noisy data. The experiments show that RSM with IBK works better than RFM and AdaBoost, and that RFM with tree and rule classifiers achieves more prominent improvements than the others. We also find that the Logistic algorithm is not suitable for any of the three ensembles. When classification noise is added to the original data sets, ensembles outperform single classifiers at lower noise levels but fail to maintain this superiority at higher noise levels.
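
The noise experiment described above can be sketched as follows: a fraction of training labels is flipped and a single kNN classifier is compared against a random-subspace kNN ensemble (a BaggingClassifier with feature subsampling and no bootstrap).

```python
# Sketch of the noise experiment: flip a fraction of the training labels and
# compare a single kNN classifier against a random-subspace kNN ensemble.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=800, n_features=30, n_informative=10, random_state=6)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=6)
rng = np.random.default_rng(6)

for noise in (0.0, 0.1, 0.3):
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < noise          # inject classification noise
    y_noisy[flip] = 1 - y_noisy[flip]

    single = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_noisy)
    rsm = BaggingClassifier(KNeighborsClassifier(n_neighbors=5), n_estimators=25,
                            bootstrap=False, max_features=0.5,   # random subspace method
                            random_state=6).fit(X_tr, y_noisy)
    print(f"noise={noise:.1f}  single kNN={single.score(X_te, y_te):.3f}  "
          f"RSM ensemble={rsm.score(X_te, y_te):.3f}")
```
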
18

Liu, Kun-Hong, Muchenxuan Tong, Shu-Tong Xie, and Vincent To Yee Ng. "Genetic Programming Based Ensemble System for Microarray Data Classification." Computational and Mathematical Methods in Medicine 2015 (2015): 1–11. http://dx.doi.org/10.1155/2015/193406.

Abstract:
Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
19

ZHANG, BAILING. "RELIABLE IMAGE CLASSIFICATION BY COMBINING FEATURES AND RANDOM SUBSPACE SUPPORT VECTOR MACHINE ENSEMBLE." International Journal of Pattern Recognition and Artificial Intelligence 28, no. 03 (May 2014): 1450005. http://dx.doi.org/10.1142/s0218001414500050.

Abstract:
We investigate the implementation of image categorization algorithms with a reject option, as a means to enhance the system reliability and to attain a higher classification accuracy. A reject option is desired in many image-classification applications for which the system should abstain from making decisions on the most uncertain images. Based on the random subspace (RS) ensemble learning model, a highly reliable image classification scheme is proposed by applying an RS support vector machine (SVM) ensemble. Unlike previous classifier ensembles, which focus exclusively on increasing classification accuracy, the objective of the proposed SVM ensemble is to provide classification confidence and implement a reject option to accommodate situations where no decision should be made. The ensemble is created with four different feature descriptions, including local binary pattern (LBP), pyramid histogram of oriented gradient (PHOG), Gabor filtering and curvelet transform. The consensus degree from the ensemble's voting conforms to the confidence measure and the rejection option is applied accordingly when the confidence falls below a threshold. The reliable recognition scheme is empirically evaluated on three image categorization benchmark databases, including the face database created by Aleix Martinez and Robert Benavente (AR faces), a subset of Caltech-101 images for object classification, and 15 natural scene categories, all of which yielded consistently reliable results, thus demonstrating the effectiveness of the proposed approach. For example, a 99.9% accuracy was obtained with a rejection rate of 2.5% for the AR faces, which exhibits promising potential for real-world applications.
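
A hedged sketch of the reject-option idea is given below: SVM members are trained on random feature subsets (standing in for the LBP, PHOG, Gabor and curvelet descriptors), and a prediction is rejected when the voting consensus falls below a threshold.

```python
# Hedged sketch of an SVM ensemble with a reject option: members trained on
# random feature subsets (a stand-in for the four feature descriptors), with a
# prediction rejected when voting consensus is too low.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=40, n_informative=15,
                           n_classes=3, n_clusters_per_class=1, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=7)

rng = np.random.default_rng(7)
feature_sets = [rng.choice(X.shape[1], size=20, replace=False) for _ in range(8)]
members = [SVC(kernel="rbf", gamma="scale").fit(X_tr[:, f], y_tr) for f in feature_sets]

votes = np.array([m.predict(X_te[:, f]) for m, f in zip(members, feature_sets)])
counts = np.apply_along_axis(lambda col: np.bincount(col, minlength=3), 0, votes)
consensus = counts.max(axis=0) / len(members)    # fraction of members agreeing
prediction = counts.argmax(axis=0)

threshold = 0.75                                 # reject when fewer than 75% agree
accepted = consensus >= threshold
print(f"rejected {np.mean(~accepted):.1%} of test samples")
print(f"accuracy on accepted samples: {(prediction[accepted] == y_te[accepted]).mean():.3f}")
print(f"accuracy without rejection  : {(prediction == y_te).mean():.3f}")
```
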
20

Zhang, Peiming. "Ensemble Classification Restricted Boltzmann Machines: A Deep Learning Based Classification Method." Journal of Information and Computational Science 12, no. 14 (September 20, 2015): 5299–307. http://dx.doi.org/10.12733/jics20106538.

21

Kumar, Gulshan, and Krishan Kumar. "The Use of Artificial-Intelligence-Based Ensembles for Intrusion Detection: A Review." Applied Computational Intelligence and Soft Computing 2012 (2012): 1–20. http://dx.doi.org/10.1155/2012/850160.

Abstract:
In supervised learning-based classification, ensembles have been successfully employed in different application domains. In the literature, many researchers have proposed different ensembles by considering different combination methods, training datasets, base classifiers, and many other factors. Artificial intelligence (AI) based techniques play a prominent role in the development of ensembles for intrusion detection (ID) and have many benefits over other techniques. However, there is no comprehensive review of ensembles in general, or of AI-based ensembles for ID in particular, that examines their current research status for solving the ID problem. Here, an updated general review of ensembles and their taxonomies is presented. The paper also presents an updated review of various AI-based ensembles for ID in particular during the last decade. The related studies of AI-based ensembles are compared using a set of evaluation metrics derived from (1) the architecture and approach followed; (2) the different methods utilized in different phases of ensemble learning; (3) other measures used to evaluate the classification performance of the ensembles. The paper also provides future directions for research in this area. The paper will help readers better understand the different directions in which research on ensembles has been carried out, in general and specifically in the field of intrusion detection systems (IDSs).
22

Gu, Zheng Gang, and Kun Hong Liu. "Microarray Data Classification Based on Evolutionary Multiple Classifier System." Applied Mechanics and Materials 130-134 (October 2011): 2077–80. http://dx.doi.org/10.4028/www.scientific.net/amm.130-134.2077.

Abstract:
Designing an evolutionary multiple classifier system (MCS) is a relatively new research area. In this paper, we propose a genetic algorithm (GA) based MCS for microarray data classification. We first construct a feature pool with different feature selection methods, and then a multi-objective GA is applied to implement the ensemble feature selection process so as to generate a set of classifiers. When the GA stops, a set of base classifiers has been generated. Here we use all the nondominated individuals in the last generation to build an ensemble system, and we compare the proposed ensemble method with a method that applies a classifier selection process to select proper classifiers from all the individuals in the last generation. The experimental results show that the proposed ensemble method is robust and can lead to promising results.
23

Choi, Do-Yeon, Kwang-Mo Jeong, and Dong Hoon Lim. "Breast Cancer Classification using Deep Learning-based Ensemble." Journal of Health Informatics and Statistics 43, no. 2 (May 31, 2018): 140–47. http://dx.doi.org/10.21032/jhis.2018.43.2.140.

24

Yu, Guo-xian, Guo-ji Zhang, Jia Wei, and Ya-zhou Ren. "A Multi Graphs Based Transductive Ensemble Classification Method." Journal of Electronics & Information Technology 33, no. 8 (September 9, 2011): 1883–88. http://dx.doi.org/10.3724/sp.j.1146.2010.01424.

25

WANG, Xinyue, and Liping JING. "Stratified sampling based ensemble classification for imbalanced data." Journal of Shenzhen University Science and Engineering 36, no. 1 (2019): 24. http://dx.doi.org/10.3724/sp.j.1249.2019.01024.

26

Anshary, Muhammad Adi Khairul, and Bambang Riyanto Trilaksono. "Tweet-based Target Market Classification Using Ensemble Method." Journal of ICT Research and Applications 10, no. 2 (August 31, 2016): 123–39. http://dx.doi.org/10.5614/itbj.ict.res.appl.2016.10.2.3.

27

Alıguliyev, Ramiz M., and Makrufa Sh Hajirahimova. "Classification Ensemble Based Anomaly Detection in Network Traffic." Review of Computer Engineering Research 6, no. 1 (2019): 12–23. http://dx.doi.org/10.18488/journal.76.2019.61.12.23.

28

Markatopoulou, Fotini, Grigorios Tsoumakas, and Ioannis Vlahavas. "Dynamic ensemble pruning based on multi-label classification." Neurocomputing 150 (February 2015): 501–12. http://dx.doi.org/10.1016/j.neucom.2014.07.063.

29

Thenmozhi, K., and M. Rajesh Babu. "Classification of skin disease using ensemble-based classifier." International Journal of Biomedical Engineering and Technology 28, no. 4 (2018): 377. http://dx.doi.org/10.1504/ijbet.2018.095985.

30

Rajesh Babu, M., and K. Thenmozhi. "Classification of skin disease using ensemble-based classifier." International Journal of Biomedical Engineering and Technology 28, no. 4 (2018): 377. http://dx.doi.org/10.1504/ijbet.2018.10017204.

31

Yu, Kai, Lihong Wang, and Yanwei Yu. "Ordering-Based Kalman Filter Selective Ensemble for Classification." IEEE Access 8 (2020): 9715–27. http://dx.doi.org/10.1109/access.2020.2964849.

32

SINGH, Sinam Ajitkumar, and Swanirbhar MAJUMDER. "Short unsegmented PCG classification based on ensemble classifier." TURKISH JOURNAL OF ELECTRICAL ENGINEERING & COMPUTER SCIENCES 28, no. 2 (March 28, 2020): 875–89. http://dx.doi.org/10.3906/elk-1905-165.

33

Soares, Rodrigo G. F., Huanhuan Chen, and Xin Yao. "A Cluster-Based Semisupervised Ensemble for Multiclass Classification." IEEE Transactions on Emerging Topics in Computational Intelligence 1, no. 6 (December 2017): 408–20. http://dx.doi.org/10.1109/tetci.2017.2743219.

34

Wu, Hao. "Solder joint defect classification based on ensemble learning." Soldering & Surface Mount Technology 29, no. 3 (June 5, 2017): 164–70. http://dx.doi.org/10.1108/ssmt-08-2016-0016.

Abstract:
Purpose: This paper aims to inspect the defects of solder joints of printed circuit boards in a real-time production line; simple computing and high accuracy are the primary considerations for the feature extraction and classification algorithm.

Design/methodology/approach: In this study, the author presents an ensemble method for the classification of solder joint defects. The new method is based on extracting the color and geometry features after solder image acquisition and using decision trees to guarantee the algorithm's execution efficiency. To improve accuracy, the author proposes a random forest ensemble which combines several trees for the classification of solder joints.

Findings: The proposed method has been tested using 280 samples of solder joints, including good joints and various defect types. The results show that the proposed method has a high accuracy.

Originality/value: The author extracted the color and geometry features and used decision trees to guarantee the algorithm's execution efficiency. To improve the accuracy, the author proposes using a random forest ensemble which combines several trees for the classification of solder joints. The results show that the proposed method has a high accuracy.
35

Ahmed, Mahreen, Asma Ghulam Rasool, Hammad Afzal, and Imran Siddiqi. "Improving handwriting based gender classification using ensemble classifiers." Expert Systems with Applications 85 (November 2017): 158–68. http://dx.doi.org/10.1016/j.eswa.2017.05.033.

36

Efendi, Emre, and Berkan Dulek. "Online EM-Based Ensemble Classification With Correlated Agents." IEEE Signal Processing Letters 28 (2021): 294–98. http://dx.doi.org/10.1109/lsp.2021.3052135.

37

Xu, Jian, and Yuqing Zhai. "A Toxic Comment Classification Model Based on Ensemble." Journal of Physics: Conference Series 1873, no. 1 (April 1, 2021): 012080. http://dx.doi.org/10.1088/1742-6596/1873/1/012080.

38

Thammasiri, Dech, and Phayung Meesad. "Ensemble Data Classification based on Diversity of Classifiers Optimized by Genetic Algorithm." Advanced Materials Research 433-440 (January 2012): 6572–78. http://dx.doi.org/10.4028/www.scientific.net/amr.433-440.6572.

Abstract:
In this research we propose an ensemble classification technique based on creating classifiers from a variety of techniques, such as decision trees, support vector machines and neural networks, then selecting the appropriate classifiers with a genetic algorithm and combining them by majority vote in order to increase classification accuracy. From classification accuracy tests on the Australian Credit, German Credit and Bankruptcy datasets, we found that the proposed ensemble classification models selected by the genetic algorithm yield the highest performance and that our algorithm is effective in building ensembles.
39

Thammasiri, Dech, and Phayung Meesad. "Adaboost Ensemble Data Classification Based on Diversity of Classifiers." Advanced Materials Research 403-408 (November 2011): 3682–87. http://dx.doi.org/10.4028/www.scientific.net/amr.403-408.3682.

Abstract:
In this research we propose an ensemble classification technique based on decision tree, artificial neural network and support vector machine models, with the classifiers weighted by AdaBoost, in order to increase classification accuracy. We used a total of 30 classifiers; the technique generates the training data using bootstrap sampling. Classification accuracy tests on the Diabetes data from UCI found that the proposed ensemble classification model with classifiers weighted by AdaBoost yields better performance than that of a single model with the same type of classifier; the best performance on the Diabetes data was 75.21%. We can conclude that there are two essential requirements in the model. The first is that the ensemble members or learning agents must be diverse or complementary, i.e., agents must exhibit different properties. The other is that an optimal ensemble strategy is also required to fuse the set of diverse agents, which is provided here by AdaBoost.
40

Mikryukov, A. A., A. V. Babash, and V. A. Sizov. "Classification of events in information security systems based on neural networks." Open Education 23, no. 1 (March 21, 2019): 57–63. http://dx.doi.org/10.21686/1818-4243-2019-1-57-63.

Abstract:
Purpose of the research. The aim of the study is to increase the effectiveness of information security and to enhance the accuracy and promptness of the classification of security events, security incidents, and threats in information security systems. To respond to this challenge, neural network technologies were suggested as a classification tool for information security systems. These technologies allow accommodating incomplete, inaccurate and unidentified raw data, as well as utilizing previously accumulated information on security issues. To address the problem more effectively, collective methods based on neural network ensembles, aligned with an advanced complex approach, were implemented.

Materials and methods. When solving complex classification problems, often none of the classification algorithms provides the required accuracy. In such cases, it seems reasonable to build compositions of algorithms that mutually compensate for the errors of the individual algorithms. The study also gives an insight into the application of neural network ensembles to address security issues in the corporate information system and provides a brief review of existing approaches to the construction of neural network ensembles and of methods to shape problem solving with neural network classifiers. An advanced integrated approach is proposed to tackle the problem of security event classification based on neural network ensembles (neural network committees). The approach is based on a three-step procedure, and the stages of its implementation are described. It is shown that the use of this approach improves the efficiency of solving the problem.

Results. An advanced integrated approach to security event classification based on neural network ensembles (neural network committees) is proposed. This approach applies adaptive reduction of the neural network ensemble (the best classifiers are selected based on an assessment of the degree of compliance of the competence area of each private neural network classifier and the convergence of the results of the private classifiers), as well as the selection and rationale of the voting method (composition or aggregation of the outputs of the private classifiers). The results of numerical experiments support the effectiveness of the proposed approach.

Conclusion. Collectively used artificial neural networks in the form of neural network ensembles (committees of neural networks) will provide more accurate and reliable results of security event classification in the corporate information network. Moreover, an advanced integrated approach to the construction of a neural network ensemble is proposed to improve the effectiveness of the classification process. The approach is based on the application of an adaptive reduction procedure for the results of the private classifiers and a procedure for selecting the method of aggregation of the results of the private classifiers. These outcomes will enable better system control over information security incidents. Finally, the paper outlines tendencies and directions for the development of collective solution methods applying neural network ensembles (committees of neural networks).
41

Venugopal, K. R., D. R. Sowmya, and P. Deepa Shenoy. "Post classification change detection based on feature-based ensemble classifiers." International Journal of Spatio-Temporal Data Science 1, no. 2 (2021): 149. http://dx.doi.org/10.1504/ijstds.2021.10040051.

42

Sowmya, D. R., P. Deepa Shenoy, and K. R. Venugopal. "Post classification change detection based on feature-based ensemble classifiers." International Journal of Spatio-Temporal Data Science 1, no. 2 (2021): 149. http://dx.doi.org/10.1504/ijstds.2021.116958.

43

Shadman Roodposhti, Majid, Arko Lucieer, Asim Anees, and Brett Bryan. "A Robust Rule-Based Ensemble Framework Using Mean-Shift Segmentation for Hyperspectral Image Classification." Remote Sensing 11, no. 17 (September 1, 2019): 2057. http://dx.doi.org/10.3390/rs11172057.

Abstract:
This paper assesses the performance of DoTRules—a dictionary of trusted rules—as a supervised rule-based ensemble framework based on the mean-shift segmentation for hyperspectral image classification. The proposed ensemble framework consists of multiple rule sets with rules constructed based on different class frequencies and sequences of occurrences. Shannon entropy was derived for assessing the uncertainty of every rule and the subsequent filtering of unreliable rules. DoTRules is not only a transparent approach for image classification but also a tool to map rule uncertainty, where rule uncertainty assessment can be applied as an estimate of classification accuracy prior to image classification. In this research, the proposed image classification framework is implemented using three world reference hyperspectral image datasets. We found that the overall accuracy of classification using the proposed ensemble framework was superior to state-of-the-art ensemble algorithms, as well as two non-ensemble algorithms, at multiple training sample sizes. We believe DoTRules can be applied more generally to the classification of discrete data such as hyperspectral satellite imagery products.
44

Chaudhary, Poonam, and Rashmi Agrawal. "Sensory motor imagery EEG classification based on non-dyadic wavelets using dynamic weighted majority ensemble classification." Intelligent Decision Technologies 15, no. 1 (March 24, 2021): 33–43. http://dx.doi.org/10.3233/idt-200005.

Abstract:
Classification accuracy has become a significant challenge and an important task in sensory motor imagery (SMI) electroencephalogram (EEG) based brain-computer interface (BCI) systems. This paper compares an ensemble classification framework with individual classifiers. The main objective is to reduce the influence of non-stationary and transient information and improve the classification decision in the BCI system. The framework comprises three phases: (1) the EEG signal is first decomposed into three frequency bands, using low-pass, band-pass and high-pass filters, to localize the α, β and high γ frequency bands within the EEG signals; (2) the Common Spatial Pattern (CSP) algorithm is then applied to the frequency bands extracted in phase 1 to bring out the important features of the EEG signal; (3) an existing Dynamic Weighted Majority (DWM) ensemble classification algorithm is implemented using the features extracted in phase 2 for the final class label decision. J48, Naive Bayes, Support Vector Machine, and K-Nearest Neighbor classifiers are used as base classifiers to form a diverse ensemble. A comparative study between individual classifiers and the ensemble framework is included in the paper. Experimental evaluation and assessment of the performance of the proposed model is done on the publicly available datasets BCI Competition IV dataset IIa and BCI Competition III dataset IVa. The ensemble-based learning method gave the highest accuracy among all. An average sensitivity, specificity, and accuracy of 85.4%, 86.5%, and 85.6% were achieved, with a kappa value of 0.59, using DWM classification.
45

Abuassba, Adnan O. M., Dezheng Zhang, Xiong Luo, Ahmad Shaheryar, and Hazrat Ali. "Improving Classification Performance through an Advanced Ensemble Based Heterogeneous Extreme Learning Machines." Computational Intelligence and Neuroscience 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/3405463.

Abstract:
Extreme Learning Machine (ELM) is a fast-learning algorithm for a single-hidden layer feedforward neural network (SLFN). It often has good generalization performance. However, there are chances that it might overfit the training data due to having more hidden nodes than needed. To address the generalization performance, we use a heterogeneous ensemble approach. We propose an Advanced ELM Ensemble (AELME) for classification, which includes Regularized-ELM, L2-norm-optimized ELM (ELML2), and Kernel-ELM. The ensemble is constructed by training a randomly chosen ELM classifier on a subset of training data selected through random resampling. The proposed AELM-Ensemble is evolved by employing an objective function of increasing diversity and accuracy among the final ensemble. Finally, the class label of unseen data is predicted using majority vote approach. Splitting the training data into subsets and incorporation of heterogeneous ELM classifiers result in higher prediction accuracy, better generalization, and a lower number of base classifiers, as compared to other models (Adaboost, Bagging, Dynamic ELM ensemble, data splitting ELM ensemble, and ELM ensemble). The validity of AELME is confirmed through classification on several real-world benchmark datasets.
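
A minimal ELM and a small ELM ensemble with majority voting are sketched below; the regularized and kernel ELM variants and the diversity-driven construction of AELME are not reproduced here.

```python
# Minimal extreme learning machine (ELM) and a small ELM ensemble with majority
# voting. This is a sketch of the general idea only, not the AELME algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

class SimpleELM:
    """Single-hidden-layer ELM: random input weights, least-squares output weights."""
    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        T = (y[:, None] == self.classes_[None, :]).astype(float)    # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        self.beta = np.linalg.pinv(self._hidden(X)) @ T              # closed-form output weights
        return self

    def predict(self, X):
        return self.classes_[np.argmax(self._hidden(X) @ self.beta, axis=1)]

X, y = make_classification(n_samples=1000, n_features=20, random_state=8)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=8)

# Ensemble: each ELM gets different random hidden weights and a resampled subset.
rng = np.random.default_rng(8)
members = []
for seed in range(9):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    members.append(SimpleELM(n_hidden=200, seed=seed).fit(X_tr[idx], y_tr[idx]))

votes = np.array([m.predict(X_te) for m in members])
majority = (votes.mean(axis=0) >= 0.5).astype(int)                   # binary majority vote
print("single ELM accuracy  :", (members[0].predict(X_te) == y_te).mean())
print("ELM ensemble accuracy:", (majority == y_te).mean())
```
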
46

Chen, Wen, Xinyu Li, Liang Gao, and Weiming Shen. "Improving Computer-Aided Cervical Cells Classification Using Transfer Learning Based Snapshot Ensemble." Applied Sciences 10, no. 20 (October 19, 2020): 7292. http://dx.doi.org/10.3390/app10207292.

Abstract:
Cervical cells classification is a crucial component of computer-aided cervical cancer detection. Fine-grained classification is of great clinical importance when guiding clinical decisions on diagnosis and treatment, and it remains very challenging. Recently, convolutional neural networks (CNN) provide a novel way to classify cervical cells by using automatically learned features. Although an ensemble of CNN models can increase model diversity and potentially boost the classification accuracy, it is a multi-step process, as several CNN models need to be trained separately and then be selected for the ensemble. On the other hand, due to the small training samples, the advantages of powerful CNN models may not be effectively leveraged. In order to address such a challenging issue, this paper proposes a transfer learning based snapshot ensemble (TLSE) method by integrating snapshot ensemble learning with transfer learning in a unified and coordinated way. Snapshot ensemble provides ensemble benefits within a single model training procedure, while transfer learning focuses on the small sample problem in cervical cells classification. Furthermore, a new training strategy is proposed for guaranteeing the combination. The TLSE method is evaluated on a pap-smear dataset called the Herlev dataset and is shown to have advantages over existing methods. It demonstrates that TLSE can improve the accuracy in an ensemble manner with only one single training process for small-sample, fine-grained cervical cells classification.
47

Alhayali, Royida A. Ibrahem, Munef Abdullah Ahmed, Yasmin Makki Mohialden, and Ahmed H. Ali. "Efficient method for breast cancer classification based on ensemble hoffeding tree and naïve Bayes." Indonesian Journal of Electrical Engineering and Computer Science 18, no. 2 (May 1, 2020): 1074. http://dx.doi.org/10.11591/ijeecs.v18.i2.pp1074-1080.

Abstract:
The most dangerous type of cancer suffered by women above 35 years of age is breast cancer. Breast cancer datasets are normally characterized by missing data, high dimensionality, non-normal distribution, class imbalance, noise, and inconsistency. Classification is a machine learning (ML) process which has a significant role in the prediction of outcomes, and one of the outstanding supervised classification methods in data mining is naïve Bayes classification (NBC). Naïve Bayes classification is good at predicting outcomes and often outperforms other classification techniques. One of the reasons behind this strong performance of NBC is the assumption of conditional independence among the initial parameters and the predictors. However, this assumption is not always true and can cause loss of accuracy. Hoeffding trees assume the suitability of using a small sample to select the optimal splitting attribute. This study proposes a new method for improving the accuracy of classification of breast cancer datasets. The method proposes the use of Hoeffding trees for normal classification and naïve Bayes for reducing data dimensionality.
48

Saifan, Ahmad A., and Lina Abu-wardih. "Software Defect Prediction Based on Feature Subset Selection and Ensemble Classification." ECTI Transactions on Computer and Information Technology (ECTI-CIT) 14, no. 2 (October 9, 2020): 213–28. http://dx.doi.org/10.37936/ecti-cit.2020142.224489.

Abstract:
Two primary issues have emerged in the machine learning and data mining community: how to deal with imbalanced data and how to choose appropriate features. These are of particular concern in the software engineering domain, and more specifically the field of software defect prediction. This research highlights a procedure which includes a feature selection technique to single out relevant attributes, and an ensemble technique to handle the class-imbalance issue. In order to determine the advantages of feature selection and ensemble methods we look at two potential scenarios: (1) Ensemble models constructed from the original datasets, without feature selection; (2) Ensemble models constructed from the reduced datasets after feature selection has been applied. Four feature selection techniques are employed: Principal Component Analysis (PCA), Pearson’s correlation, Greedy Stepwise Forward selection, and Information Gain (IG). The aim of this research is to assess the effectiveness of feature selection techniques using ensemble techniques. Five datasets, obtained from the PROMISE software depository, are analyzed; tentative results indicate that ensemble methods can improve the model's performance without the use of feature selection techniques. PCA feature selection and bagging based on K-NN perform better than both bagging based on SVM and boosting based on K-NN and SVM, and feature selection techniques including Pearson’s correlation, Greedy stepwise, and IG weaken the ensemble models’ performance.
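
The best-performing combination reported above (PCA feature reduction followed by bagging with a k-NN base learner) can be sketched in one pipeline, with an imbalanced synthetic dataset standing in for the PROMISE defect data.

```python
# Sketch of the PCA + bagged k-NN combination reported as the strongest setup.
# Synthetic imbalanced data stands in for the PROMISE defect datasets.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=30, n_informative=10,
                           weights=[0.85, 0.15], random_state=9)  # defect data is imbalanced

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),                                   # feature reduction step
    BaggingClassifier(KNeighborsClassifier(n_neighbors=5),  # ensemble step
                      n_estimators=30, random_state=9),
)
print("F1 (5-fold CV):", cross_val_score(model, X, y, cv=5, scoring="f1").mean().round(3))
```
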
49

Naz, Mehreen, Kashif Zafar, and Ayesha Khan. "Ensemble Based Classification of Sentiments Using Forest Optimization Algorithm." Data 4, no. 2 (May 23, 2019): 76. http://dx.doi.org/10.3390/data4020076.

Abstract:
Feature subset selection is a process to choose a set of relevant features from a high-dimensionality dataset to improve the performance of classifiers. The meaningful words extracted from data form a set of features for sentiment analysis. Many evolutionary algorithms, like the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), have been applied to the feature subset selection problem, and computational performance can still be improved. This research presents a solution to the feature subset selection problem for classification of sentiments using ensemble-based classifiers. It consists of a hybrid technique of minimum redundancy and maximum relevance (mRMR) and Forest Optimization Algorithm (FOA)-based feature selection. Ensemble-based classification is implemented to optimize the results of individual classifiers. The Forest Optimization Algorithm as a feature selection technique has been applied to various classification datasets from the UCI machine learning repository. The classifiers used for ensemble methods for the UCI repository datasets are the k-Nearest Neighbor (k-NN) and Naïve Bayes (NB). For the classification of sentiments, a 15–20% improvement has been recorded. The dataset used for classification of sentiments is Blitzer's dataset consisting of reviews of electronic products. The results are further improved by an ensemble of k-NN, NB, and Support Vector Machine (SVM) with an accuracy of 95% for the sentiment classification task.
50

Guedj, Benjamin, and Bhargav Srinivasa Desikan. "Kernel-Based Ensemble Learning in Python." Information 11, no. 2 (January 25, 2020): 63. http://dx.doi.org/10.3390/info11020063.

Abstract:
We propose a new supervised learning algorithm for classification and regression problems where two or more preliminary predictors are available. We introduce KernelCobra, a non-linear learning strategy for combining an arbitrary number of initial predictors. KernelCobra builds on the COBRA algorithm introduced by Biau et al. (2016), which combined estimators based on a notion of proximity of predictions on the training data. While the COBRA algorithm used a binary threshold to declare which training data were close and to be used, we generalise this idea by using a kernel to better encapsulate the proximity information. Such a smoothing kernel provides more representative weights to each of the training points which are used to build the aggregate and final predictor, and KernelCobra systematically outperforms the COBRA algorithm. While COBRA is intended for regression, KernelCobra deals with classification and regression. KernelCobra is included as part of the open source Python package Pycobra (0.2.4 and onward), introduced by Srinivasa Desikan (2018). Numerical experiments were undertaken to assess the performance (in terms of pure prediction and computational complexity) of KernelCobra on real-life and synthetic datasets.
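
A rough numpy sketch of the kernel-weighted COBRA idea for regression is given below; it is not the Pycobra API, just the aggregation rule: training points whose base-learner predictions are close to the query's predictions receive larger weights in the final aggregate.

```python
# Rough sketch of kernel-weighted COBRA-style aggregation for regression
# (not the Pycobra API): weight aggregation-set responses by the proximity of
# base-learner predictions, measured with a Gaussian kernel.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=600, n_features=10, noise=5.0, random_state=10)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, train_size=0.5, random_state=10)
X_agg, X_test, y_agg, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=10)

# Two preliminary predictors trained on the first half of the data.
machines = [Ridge().fit(X_train, y_train),
            DecisionTreeRegressor(max_depth=5).fit(X_train, y_train)]
P_agg = np.column_stack([m.predict(X_agg) for m in machines])    # predictions on aggregation set
P_test = np.column_stack([m.predict(X_test) for m in machines])  # predictions on test queries

# Gaussian kernel on the distance between prediction vectors (query vs aggregation points).
dists = np.linalg.norm(P_test[:, None, :] - P_agg[None, :, :], axis=2)
bandwidth = np.quantile(dists, 0.05)             # data-driven kernel width for this sketch
weights = np.exp(-(dists / bandwidth) ** 2)
weights /= weights.sum(axis=1, keepdims=True)

y_pred = weights @ y_agg                         # kernel-smoothed aggregate of training responses
rmse = np.sqrt(np.mean((y_pred - y_test) ** 2))
print("KernelCobra-style aggregate RMSE:", round(rmse, 2))
```
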