
Journal articles on the topic 'Boosting and bagging'



Consult the top 50 journal articles for your research on the topic 'Boosting and bagging.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Machova, Kristina, Miroslav Puszta, Frantisek Barcak, and Peter Bednar. "A comparison of the bagging and the boosting methods using the decision trees classifiers." Computer Science and Information Systems 3, no. 2 (2006): 57–72. http://dx.doi.org/10.2298/csis0602057m.

Abstract:
In this paper we present an improvement of the precision of classification algorithm results. Two different approaches are known: bagging and boosting. This paper describes a set of experiments with the bagging and boosting methods, applied to classification algorithms that generate decision trees. Results of performance tests focused on the use of the bagging and boosting methods in connection with binary decision trees are presented. The minimum number of decision trees that enables an improvement of the classification performed by the bagging and boosting methods was found.
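The contrast this abstract draws can be illustrated with a minimal, self-contained sketch (pure Python, decision stumps standing in for the decision trees; the toy setup is illustrative, not the authors' experimental code): bagging trains each stump on a bootstrap resample and combines them by majority vote, while boosting (AdaBoost) trains stumps sequentially on re-weighted data.

```python
import math
import random

def stump_fit(points):
    """Weighted 1-D decision stump: choose the threshold and orientation
    with the smallest weighted error on (x, label, weight) triples,
    labels in {-1, +1}."""
    best = None
    for thr in sorted({x for x, _, _ in points}):
        for sign in (+1, -1):
            err = sum(w for x, y, w in points
                      if (sign if x >= thr else -sign) != y)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best  # (weighted error, threshold, orientation)

def stump_predict(model, x):
    _, thr, sign = model
    return sign if x >= thr else -sign

def bagging(data, rounds=11, seed=0):
    """Bagging: each stump sees a bootstrap resample of the data;
    predictions are combined by unweighted majority vote."""
    rng = random.Random(seed)
    models = [stump_fit([(x, y, 1.0) for x, y in
                         [rng.choice(data) for _ in data]])
              for _ in range(rounds)]
    return lambda x: 1 if sum(stump_predict(m, x) for m in models) >= 0 else -1

def adaboost(data, rounds=10):
    """Boosting (AdaBoost): stumps are fitted sequentially, and the sample
    is re-weighted so that misclassified points gain weight."""
    n = len(data)
    w, ensemble = [1.0 / n] * n, []
    for _ in range(rounds):
        model = stump_fit([(x, y, wi) for (x, y), wi in zip(data, w)])
        if model[0] >= 0.5:          # weak learner no better than chance
            break
        err = max(model[0], 1e-10)   # clip to avoid log(0) on perfect fits
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, model))
        w = [wi * math.exp(-alpha * y * stump_predict(model, x))
             for (x, y), wi in zip(data, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return lambda x: 1 if sum(a * stump_predict(m, x)
                              for a, m in ensemble) >= 0 else -1
```

On a separable toy sample such as `[(float(i), 1 if i > 4 else -1) for i in range(10)]`, both ensembles recover the class boundary; the paper's question of how few trees suffice for an improvement corresponds to varying `rounds`.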
2

Taser, Pelin Yildirim. "Application of Bagging and Boosting Approaches Using Decision Tree-Based Algorithms in Diabetes Risk Prediction." Proceedings 74, no. 1 (2021): 6. http://dx.doi.org/10.3390/proceedings2021074006.

Abstract:
Diabetes is a serious condition that leads to high blood sugar, and the prediction of this disease at an early stage is of great importance for reducing the risk of some significant diabetes complications. In this study, bagging and boosting approaches using six different decision tree-based (DTB) classifiers were implemented on experimental data for diabetes prediction. This paper also compares applied individual implementation, bagging, and boosting of DTB classifiers in terms of accuracy rates. The results indicate that the bagging and boosting approaches outperform the individual DTB classifiers …
3

Li, Xiao Bo. "Contrast Research of Two Kinds of Integrated Sorting Algorithms." Advanced Materials Research 433-440 (January 2012): 4025–31. http://dx.doi.org/10.4028/www.scientific.net/amr.433-440.4025.

Abstract:
Boosting and Bagging are two important voting-based sorting (classification) algorithms. Boosting generates multiple classifiers serially by adjusting sample weights, while Bagging generates multiple classifiers in parallel. The combined algorithms differ in their losses and integration modes; by integrating the Bagging and Boosting algorithms with the naïve Bayes algorithm, the Bagging NB and AdaBoost NB algorithms are constructed. An experimental comparison on UCI data sets shows that the Bagging NB algorithm is relatively stable and can produce the sorting …
4

Ćwiklińska-Jurkowska, Małgorzata M. "Performance of Resampling Methods Based on Decision Trees, Parametric and Nonparametric Bayesian Classifiers for Three Medical Datasets." Studies in Logic, Grammar and Rhetoric 35, no. 1 (2013): 71–86. http://dx.doi.org/10.2478/slgr-2013-0045.

Abstract:
The figures visualizing single and combined classifiers coming from the decision tree group and from Bayesian parametric and nonparametric discriminant functions show the importance of diversity in bagging or boosting combined models and confirm some theoretical outcomes suggested by other authors. For the three medical sets examined, decision trees, as well as linear and quadratic discriminant functions, are useful for bagging and boosting. Classifiers which do not show an increasing tendency for resubstitution errors in subsequent loops of the deterministic boosting procedure are not useful for …
5

Martínez-Muñoz, Gonzalo, and Alberto Suárez. "Using boosting to prune bagging ensembles." Pattern Recognition Letters 28, no. 1 (2007): 156–65. http://dx.doi.org/10.1016/j.patrec.2006.06.018.

6

Anctil, F., and N. Lauzon. "Generalisation for neural networks through data sampling and training procedures, with applications to streamflow predictions." Hydrology and Earth System Sciences 8, no. 5 (2004): 940–58. http://dx.doi.org/10.5194/hess-8-940-2004.

Abstract:
Since the 1990s, neural networks have been applied to many studies in hydrology and water resources. Extensive reviews on neural network modelling have identified the major issues affecting modelling performance; one of the most important is generalisation, which refers to building models that can infer the behaviour of the system under study for conditions represented not only in the data employed for training and testing but also for those conditions not present in the data sets but inherent to the system. This work compares five generalisation approaches: stop training, Bayesian regularisation, …
7

Sadorsky, Perry. "Predicting Gold and Silver Price Direction Using Tree-Based Classifiers." Journal of Risk and Financial Management 14, no. 5 (2021): 198. http://dx.doi.org/10.3390/jrfm14050198.

Abstract:
Gold is often used by investors as a hedge against inflation or adverse economic times. Consequently, it is important for investors to have accurate forecasts of gold prices. This paper uses several machine learning tree-based classifiers (bagging, stochastic gradient boosting, random forests) to predict the price direction of gold and silver exchange traded funds. Decision tree bagging, stochastic gradient boosting, and random forests predictions of gold and silver price direction are much more accurate than those obtained from logit models. For a 20-day forecast horizon, tree bagging, stochastic gradient boosting, …
8

Akhand, M. A. H., Pintu Chandra Shill, and Kazuyuki Murase. "Hybrid Ensemble Construction with Selected Neural Networks." Journal of Advanced Computational Intelligence and Intelligent Informatics 15, no. 6 (2011): 652–61. http://dx.doi.org/10.20965/jaciii.2011.p0652.

Abstract:
A Neural Network Ensemble (NNE) is convenient for improving classification task performance. Among the remarkable number of methods based on different techniques for constructing NNEs, Negative Correlation Learning (NCL), bagging, and boosting are the most popular. None of them, however, could show better performance for all problems. To improve performance combining the complementary strengths of the individual methods, we propose two different ways to construct hybrid ensembles combining NCL with bagging and boosting. One produces a pool of predefined numbers of networks using standard NCL …
9

Arrahimi, Ahmad Rusadi, Muhammad Khairi Ihsan, Dwi Kartini, Mohammad Reza Faisal, and Fatma Indriani. "Teknik Bagging Dan Boosting Pada Algoritma CART Untuk Klasifikasi Masa Studi Mahasiswa." Jurnal Sains dan Informatika 5, no. 1 (2019): 21–30. http://dx.doi.org/10.34128/jsi.v5i1.171.

Abstract:
Undergraduate student data in academic information systems increase every year. The collected data can be processed using data mining to gain new knowledge. The authors mine undergraduate student data to classify the study period as on time or not on time. The data are analyzed using CART with the bagging technique and CART with the boosting technique. Using 49 testing records, the CART algorithm with the bagging technique classified 13 records (26.531%) as on time and 36 records (73.469%) as not on time. In the CART algorithm …
10

Islam, M. M., Xin Yao, S. M. Shahriar Nirjon, M. A. Islam, and K. Murase. "Bagging and Boosting Negatively Correlated Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 38, no. 3 (2008): 771–84. http://dx.doi.org/10.1109/tsmcb.2008.922055.

11

Xiao, Tong, Jingbo Zhu, and Tongran Liu. "Bagging and Boosting statistical machine translation systems." Artificial Intelligence 195 (February 2013): 496–527. http://dx.doi.org/10.1016/j.artint.2012.11.005.

12

Zhang, Chun-Xia, Jiang-She Zhang, and Gai-Ying Zhang. "Using Boosting to prune Double-Bagging ensembles." Computational Statistics & Data Analysis 53, no. 4 (2009): 1218–31. http://dx.doi.org/10.1016/j.csda.2008.10.040.

13

Borra, Simone, and Agostino Di Ciaccio. "Improving nonparametric regression methods by bagging and boosting." Computational Statistics & Data Analysis 38, no. 4 (2002): 407–20. http://dx.doi.org/10.1016/s0167-9473(01)00068-8.

14

Wang, Boyu, and Joelle Pineau. "Online Bagging and Boosting for Imbalanced Data Streams." IEEE Transactions on Knowledge and Data Engineering 28, no. 12 (2016): 3353–66. http://dx.doi.org/10.1109/tkde.2016.2609424.

15

Lemmens, Aurélie, and Christophe Croux. "Bagging and Boosting Classification Trees to Predict Churn." Journal of Marketing Research 43, no. 2 (2006): 276–86. http://dx.doi.org/10.1509/jmkr.43.2.276.

16

Gweon, Hyukjun, Shu Li, and Rogemar Mamon. "AN EFFECTIVE BIAS-CORRECTED BAGGING METHOD FOR THE VALUATION OF LARGE VARIABLE ANNUITY PORTFOLIOS." ASTIN Bulletin 50, no. 3 (2020): 853–71. http://dx.doi.org/10.1017/asb.2020.28.

Abstract:
To evaluate a large portfolio of variable annuity (VA) contracts, many insurance companies rely on Monte Carlo simulation, which is computationally intensive. To address this computational challenge, machine learning techniques have been adopted in recent years to estimate the fair market values (FMVs) of a large number of contracts. It is shown that bootstrapped aggregation (bagging), one of the most popular machine learning algorithms, performs well in valuing VA contracts using related attributes. In this article, we highlight the presence of prediction bias of bagging and use the …
17

Opitz, D., and R. Maclin. "Popular Ensemble Methods: An Empirical Study." Journal of Artificial Intelligence Research 11 (August 1, 1999): 169–98. http://dx.doi.org/10.1613/jair.614.

Abstract:
An ensemble consists of a set of individually trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying novel instances. Previous research has shown that an ensemble is often more accurate than any of the single classifiers in the ensemble. Bagging (Breiman, 1996c) and Boosting (Freund & Schapire, 1996; Schapire, 1990) are two relatively new but popular methods for producing ensembles. In this paper we evaluate these methods on 23 data sets using both neural networks and decision trees as our classification algorithm. Our results clearly …
18

Ghasemian, N., and M. Akhoondzadeh. "FUSION OF NON-THERMAL AND THERMAL SATELLITE IMAGES BY BOOSTED SVM CLASSIFIERS FOR CLOUD DETECTION." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4/W4 (September 26, 2017): 83–89. http://dx.doi.org/10.5194/isprs-archives-xlii-4-w4-83-2017.

Abstract:
The goal of ensemble learning methods like Bagging and Boosting is to improve the classification results of some weak classifiers gradually. Usually, Boosting algorithms show better results than Bagging. In this article, we have examined the possibility of fusion of non-thermal and thermal bands of Landsat 8 satellite images for cloud detection by using the boosting method. We used SVM as a base learner and the performance of two kinds of Boosting methods including AdaBoost.M1 and σ Boost was compared on remote sensing images of Landsat 8 satellite. We first extracted the co-occurrence matrix …
19

Subasi, Abdulhamit, Asalah Fllatah, Kholoud Alzobidi, Tayeb Brahimi, and Akila Sarirete. "Smartphone-Based Human Activity Recognition Using Bagging and Boosting." Procedia Computer Science 163 (2019): 54–61. http://dx.doi.org/10.1016/j.procs.2019.12.086.

20

Ridgeway, Greg. "Looking for lumps: boosting and bagging for density estimation." Computational Statistics & Data Analysis 38, no. 4 (2002): 379–92. http://dx.doi.org/10.1016/s0167-9473(01)00066-4.

21

Ditzler, Gregory, Joseph LaBarck, James Ritchie, Gail Rosen, and Robi Polikar. "Extensions to Online Feature Selection Using Bagging and Boosting." IEEE Transactions on Neural Networks and Learning Systems 29, no. 9 (2018): 4504–9. http://dx.doi.org/10.1109/tnnls.2017.2746107.

22

Kotsiantis, Sotiris. "Combining bagging, boosting, rotation forest and random subspace methods." Artificial Intelligence Review 35, no. 3 (2010): 223–40. http://dx.doi.org/10.1007/s10462-010-9192-8.

23

Gupta, Surbhi, and Munish Kumar. "Forensic document examination system using boosting and bagging methodologies." Soft Computing 24, no. 7 (2019): 5409–26. http://dx.doi.org/10.1007/s00500-019-04297-5.

24

Tuysuzoglu, Goksu, and Derya Birant. "Enhanced Bagging (eBagging): A Novel Approach for Ensemble Learning." International Arab Journal of Information Technology 17, no. 4 (2020): 515–28. http://dx.doi.org/10.34028/iajit/17/4/10.

Abstract:
Bagging is one of the well-known ensemble learning methods, which combines several classifiers trained on different subsamples of the dataset. However, a drawback of bagging is its random selection, where the classification performance depends on chance to choose a suitable subset of training objects. This paper proposes a novel modified version of bagging, named enhanced Bagging (eBagging), which uses a new mechanism (error-based bootstrapping) when constructing training sets in order to cope with this problem. In the experimental setting, the proposed eBagging technique was tested on 33 well-known …
25

Yaman, Emine, and Abdulhamit Subasi. "Comparison of Bagging and Boosting Ensemble Machine Learning Methods for Automated EMG Signal Classification." BioMed Research International 2019 (October 31, 2019): 1–13. http://dx.doi.org/10.1155/2019/9152506.

Abstract:
The neuromuscular disorders are diagnosed using electromyographic (EMG) signals. Machine learning algorithms are employed as a decision support system to diagnose neuromuscular disorders. This paper compares bagging and boosting ensemble learning methods to classify EMG signals automatically. Even though ensemble classifiers’ efficacy in relation to real-life issues has been presented in numerous studies, there are almost no studies which focus on the feasibility of bagging and boosting ensemble classifiers to diagnose the neuromuscular disorders. Therefore, the purpose of this paper is to assess …
26

Kotsiantis, Sotiris B. "Bagging and boosting variants for handling classifications problems: a survey." Knowledge Engineering Review 29, no. 1 (2013): 78–100. http://dx.doi.org/10.1017/s0269888913000313.

Abstract:
Bagging and boosting are two of the most well-known ensemble learning methods due to their theoretical performance guarantees and strong experimental results. Since bagging and boosting are an effective and open framework, several researchers have proposed their variants, some of which have turned out to have lower classification error than the original versions. This paper tried to summarize these variants and categorize them into groups. We hope that the references cited cover the major theoretical issues, and provide access to the main branches of the literature dealing with such methods …
27

Bryndin, Evgeniy, and Irina Bryndina. "Communicative-Associative Transition to Smart Artificial Intelligence by Criteria with Help of Ensembles of Diversified Agents." Budapest International Research in Exact Sciences (BirEx) Journal 2, no. 4 (2020): 418–35. http://dx.doi.org/10.33258/birex.v2i4.1256.

Abstract:
Cognitive virtual smart artificial intelligence can be formed by ensembles of diversified agents with strong artificial intelligence based on communicative-associative logic by recurring development of professional skills, increasing visual, sound, and subject, spatial and temporal sensitivity. Several diversifiable agents that try to get the same conclusion will give a more accurate result, so several diversifiable agents are combined into an ensemble. Then, based on the criteria of utility and preference, the final result is obtained based on the conclusions of diversifying agents. This approach …
28

Yılmaz Isıkhan, Selen, Erdem Karabulut, and Celal Reha Alpar. "Determining Cutoff Point of Ensemble Trees Based on Sample Size in Predicting Clinical Dose with DNA Microarray Data." Computational and Mathematical Methods in Medicine 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/6794916.

Abstract:
Background/Aim. Evaluating the success of dose prediction based on genetic or clinical data has substantially advanced recently. The aim of this study is to predict various clinical dose values from DNA gene expression datasets using data mining techniques. Materials and Methods. Eleven real gene expression datasets containing dose values were included. First, important genes for dose prediction were selected using iterative sure independence screening. Then, the performances of regression trees (RTs), support vector regression (SVR), RT bagging, SVR bagging, and RT boosting were examined. Results. …
29

Ćwiklińska-Jurkowska, Małgorzata. "Boosting, Bagging and Fixed Fusion Methods Performance for Aiding Diagnosis." Biocybernetics and Biomedical Engineering 32, no. 2 (2012): 17–31. http://dx.doi.org/10.1016/s0208-5216(12)70034-7.

30

Sawarn, Aman, Ankit, and Monika Gupta. "Comparative Analysis of Bagging and Boosting Algorithms for Sentiment Analysis." Procedia Computer Science 173 (2020): 210–15. http://dx.doi.org/10.1016/j.procs.2020.06.025.

31

Khoshgoftaar, Taghi M., Jason Van Hulse, and Amri Napolitano. "Comparing Boosting and Bagging Techniques With Noisy and Imbalanced Data." IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans 41, no. 3 (2011): 552–68. http://dx.doi.org/10.1109/tsmca.2010.2084081.

32

Skurichina, Marina, and Robert P. W. Duin. "Bagging, Boosting and the Random Subspace Method for Linear Classifiers." Pattern Analysis & Applications 5, no. 2 (2002): 121–35. http://dx.doi.org/10.1007/s100440200011.

33

Nguyen, Kieu Anh, Walter Chen, Bor-Shiun Lin, and Uma Seeboonruang. "Comparison of Ensemble Machine Learning Methods for Soil Erosion Pin Measurements." ISPRS International Journal of Geo-Information 10, no. 1 (2021): 42. http://dx.doi.org/10.3390/ijgi10010042.

Abstract:
Although machine learning has been extensively used in various fields, it has only recently been applied to soil erosion pin modeling. To improve upon previous methods of quantifying soil erosion based on erosion pin measurements, this study explored the possible application of ensemble machine learning algorithms to the Shihmen Reservoir watershed in northern Taiwan. Three categories of ensemble methods were considered in this study: (a) Bagging, (b) boosting, and (c) stacking. The bagging method in this study refers to bagged multivariate adaptive regression splines (bagged MARS) and random forests …
34

Hou, Shuai, Fuan Hua, Wu Lv, Zhaodong Wang, Yujia Liu, and Guodong Wang. "Hybrid Modeling of Flotation Height in Air Flotation Oven Based on Selective Bagging Ensemble Method." Mathematical Problems in Engineering 2013 (2013): 1–9. http://dx.doi.org/10.1155/2013/281523.

Abstract:
The accurate prediction of the flotation height is very necessary for the precise control of the air flotation oven process, thereby avoiding scratches and improving production quality. In this paper, a hybrid flotation height prediction model is developed. Firstly, a simplified mechanism model is introduced for capturing the main dynamic behavior of the process. Thereafter, to compensate for the modeling errors between the actual system and the mechanism model, an error compensation model based on the proposed selective bagging ensemble method is introduced for boosting …
35

Schwenk, Holger, and Yoshua Bengio. "Boosting Neural Networks." Neural Computation 12, no. 8 (2000): 1869–87. http://dx.doi.org/10.1162/089976600300015178.

Abstract:
Boosting is a general method for improving the performance of learning algorithms. A recently proposed boosting algorithm, AdaBoost, has been applied with great success to several benchmark machine learning problems using mainly decision trees as base classifiers. In this article we investigate whether AdaBoost also works as well with neural networks, and we discuss the advantages and drawbacks of different versions of the AdaBoost algorithm. In particular, we compare training methods based on sampling the training set and weighting the cost function. The results suggest that random resampling …
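The comparison this abstract describes, sampling the training set versus weighting the cost function, concerns how AdaBoost's example weights reach a learner that may or may not handle weights directly. A minimal illustration of the two schemes (function names and the toy data are hypothetical, not from the paper):

```python
import random

def weighted_resample(data, weights, seed=0):
    """Sampling scheme: draw a training set of the same size with
    replacement, each example picked in proportion to its boosting weight."""
    rng = random.Random(seed)
    return rng.choices(data, weights=weights, k=len(data))

def weighted_cost(data, weights, loss):
    """Weighting scheme: keep the data fixed and scale each example's
    contribution to the cost function by its boosting weight."""
    return sum(w * loss(x, y) for (x, y), w in zip(data, weights))

# Toy 1-D sample with labels in {-1, +1}; boosting has concentrated
# weight on the outer (harder) examples.
data = [(0.0, -1), (1.0, -1), (2.0, 1), (3.0, 1)]
w = [0.4, 0.1, 0.1, 0.4]

sq_loss = lambda x, y: (y - (1 if x >= 2 else -1)) ** 2
print(weighted_cost(data, w, sq_loss))   # 0.0: this stump fits perfectly
print(len(weighted_resample(data, w)))   # 4
```

Which scheme works better for neural networks is exactly the paper's empirical question; the sketch only shows the mechanics of the two options.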
36

Javidi, Mohammad Masoud. "Learning from Imbalanced Multi-label Data Sets by Using Ensemble Strategies." Computer Engineering and Applications Journal 4, no. 1 (2015): 61–81. http://dx.doi.org/10.18495/comengapp.v4i1.109.

Abstract:
Multi-label classification is an extension of conventional classification in which a single instance can be associated with multiple labels. Problems of this type are ubiquitous in everyday life; for example, a movie can be categorized as action, crime, and thriller. Most algorithms on multi-label classification learning are designed for balanced data and don’t work well on imbalanced data. On the other hand, in real applications, most datasets are imbalanced. Therefore, we focused on improving multi-label classification performance on imbalanced datasets. In this paper, a state-of-the-art multi-label …
37

A, Ramaswamyreddy, Shiva Prasad S, K. V. Rangarao, and A. Saranya. "Efficient datamining model for prediction of chronic kidney disease using wrapper methods." International Journal of Informatics and Communication Technology (IJ-ICT) 8, no. 2 (2019): 63. http://dx.doi.org/10.11591/ijict.v8i2.pp63-70.

Abstract:
At present, a majority of people are affected by kidney diseases. Among them, chronic kidney disease is the most common life-threatening disease, which can be prevented by early detection. Histological grade in chronic kidney disease provides clinically important prognostic information. Therefore, machine learning techniques are applied to the information collected from previously diagnosed patients in order to discover the knowledge and patterns for making precise predictions. A large number of features exist in the raw data, some of which may cause low information and error; hence …
38

Yun, Yeboon, and Hirotaka Nakayama. "1103 Effective Learning in Support Vector Machines using Bagging and Boosting." Proceedings of the Optimization Symposium 2014.11 (2014): 1103-1–1103-5. http://dx.doi.org/10.1299/jsmeopt.2014.11.0__1103-1_.

39

Abubacker, Nirase Fathima, Ibrahim Abaker Targio Hashem, and Lim Kun Hui. "Mammographic Classification Using Stacked Ensemble Learning with Bagging and Boosting Techniques." Journal of Medical and Biological Engineering 40, no. 6 (2020): 908–16. http://dx.doi.org/10.1007/s40846-020-00567-y.

40

Baumgartner, Dustin, and Gursel Serpen. "Performance of global–local hybrid ensemble versus boosting and bagging ensembles." International Journal of Machine Learning and Cybernetics 4, no. 4 (2012): 301–17. http://dx.doi.org/10.1007/s13042-012-0094-8.

41

Nai-Arun, Nongyao, and Punnee Sittidech. "Ensemble Learning Model for Diabetes Classification." Advanced Materials Research 931-932 (May 2014): 1427–31. http://dx.doi.org/10.4028/www.scientific.net/amr.931-932.1427.

Abstract:
This paper proposed data mining techniques to improve efficiency and reliability in diabetes classification. The real data set, collected from Sawanpracharak Regional Hospital, Thailand, was first analyzed using gain-ratio feature selection techniques. Three well-known algorithms (naïve Bayes, k-nearest neighbors, and decision tree) were used to construct classification models on the selected features. Then, the popular ensemble learning methods, bagging and boosting, were applied using the three base classifiers. The results revealed that the best model with the highest accuracy was bagging with base …
42

Goudman, Lisa, Jean-Pierre Van Buyten, Ann De Smedt, et al. "Predicting the Response of High Frequency Spinal Cord Stimulation in Patients with Failed Back Surgery Syndrome: A Retrospective Study with Machine Learning Techniques." Journal of Clinical Medicine 9, no. 12 (2020): 4131. http://dx.doi.org/10.3390/jcm9124131.

Abstract:
Despite the proven clinical value of spinal cord stimulation (SCS) for patients with failed back surgery syndrome (FBSS), factors related to a successful SCS outcome are not yet clearly understood. This study aimed to predict responders for high frequency SCS at 10 kHz (HF-10). Data before implantation and the last available data was extracted for 119 FBSS patients treated with HF-10 SCS. Correlations, logistic regression, linear discriminant analysis, classification and regression trees, random forest, bagging, and boosting were applied. Based on feature selection, trial pain relief, …
43

Pal, Raj Kumar, Jugal Chaturvedi, V. Sai Teja, and Leena Shibu. "Applying Ensemble Approach on U.S. Census Data Classification." International Journal of Computer Science and Mobile Computing 10, no. 9 (2021): 1–11. http://dx.doi.org/10.47760/ijcsmc.2021.v10i09.001.

Abstract:
In this paper, we examine the adult income dataset available at the UC Irvine Machine Learning Repository. The aim is to predict whether an individual's income will be greater than $50,000 per year using different boosting and bagging strategies, and to compare models based on several attributes from the census data.
44

Ramraj, S., S. Saranya, and K. Yashwant. "Comparative study of bagging, boosting and convolutional neural network for text classification." Indian Journal of Public Health Research & Development 9, no. 9 (2018): 1041. http://dx.doi.org/10.5958/0976-5506.2018.01138.5.

45

Ng, Wing W. Y., Xiancheng Zhou, Xing Tian, Xizhao Wang, and Daniel S. Yeung. "Bagging–boosting-based semi-supervised multi-hashing with query-adaptive re-ranking." Neurocomputing 275 (January 2018): 916–23. http://dx.doi.org/10.1016/j.neucom.2017.09.042.

46

Kuncheva, L. I., M. Skurichina, and R. P. W. Duin. "An experimental study on diversity for bagging and boosting with linear classifiers." Information Fusion 3, no. 4 (2002): 245–58. http://dx.doi.org/10.1016/s1566-2535(02)00093-3.

47

Shrivastava, Santosh, P. Mary Jeyanthi, Sarbjit Singh, and David McMillan. "Failure prediction of Indian Banks using SMOTE, Lasso regression, bagging and boosting." Cogent Economics & Finance 8, no. 1 (2020): 1729569. http://dx.doi.org/10.1080/23322039.2020.1729569.

48

Wang, Guan-Wei, Chun-Xia Zhang, and Gao Guo. "Investigating the Effect of Randomly Selected Feature Subsets on Bagging and Boosting." Communications in Statistics - Simulation and Computation 44, no. 3 (2014): 636–46. http://dx.doi.org/10.1080/03610918.2013.788705.

49

Shrestha, D. L., and D. P. Solomatine. "Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression." Neural Computation 18, no. 7 (2006): 1678–710. http://dx.doi.org/10.1162/neco.2006.18.7.1678.

Abstract:
The application of the boosting technique to regression problems has received relatively little attention in contrast to research aimed at classification problems. This letter describes a new boosting algorithm, AdaBoost.RT, for regression problems. Its idea is to filter out the examples whose relative estimation error is higher than a preset threshold value and then follow the AdaBoost procedure. Thus, it requires selecting the suboptimal value of the error threshold to demarcate examples as poorly or well predicted. Some experimental results using the M5 model tree as a weak learner …
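The thresholding idea the abstract describes can be sketched as follows (a simplified reading of AdaBoost.RT with a regression-stump weak learner; the weight update and output combination follow common presentations and are assumptions, not the paper's exact formulation):

```python
import math

def stump_fit(xs, ys, w):
    """Weighted 1-D regression stump: split at a threshold and predict
    the weighted mean of each side."""
    def wmean(idx):
        sw = sum(w[i] for i in idx)
        return sum(w[i] * ys[i] for i in idx) / sw if sw else 0.0
    best = None
    for thr in sorted(set(xs)):
        left = [i for i, x in enumerate(xs) if x < thr]
        right = [i for i, x in enumerate(xs) if x >= thr]
        lm, rm = wmean(left), wmean(right)
        sse = sum(w[i] * (ys[i] - (lm if x < thr else rm)) ** 2
                  for i, x in enumerate(xs))
        if best is None or sse < best[0]:
            best = (sse, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda x: lm if x < thr else rm

def adaboost_rt(xs, ys, phi=0.05, rounds=10, power=2):
    """AdaBoost.RT sketch: examples whose *relative* estimation error
    exceeds the threshold phi count as misclassified; their weight grows
    relative to the well-predicted examples."""
    n = len(xs)
    w = [1.0 / n] * n
    models = []
    for _ in range(rounds):
        f = stump_fit(xs, ys, w)
        rel = [abs(f(x) - y) / (abs(y) or 1.0) for x, y in zip(xs, ys)]
        eps = sum(wi for wi, e in zip(w, rel) if e > phi)
        beta = max(eps, 1e-10) ** power
        models.append((math.log(1.0 / beta), f))
        # shrink the weight of well-predicted examples, then renormalise
        w = [wi * (beta if e <= phi else 1.0) for wi, e in zip(w, rel)]
        total = sum(w)
        w = [wi / total for wi in w]
    z = sum(a for a, _ in models)
    return lambda x: sum(a * f(x) for a, f in models) / z
```

On a step-shaped target the stump is already exact, so every round agrees; the algorithm's interest lies in how `phi` trades off which examples get re-emphasised when the weak learner is imperfect.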
50

Jurek, Anna, Yaxin Bi, Shengli Wu, and Chris Nugent. "A survey of commonly used ensemble-based classification techniques." Knowledge Engineering Review 29, no. 5 (2013): 551–81. http://dx.doi.org/10.1017/s0269888913000155.

Abstract:
The combination of multiple classifiers, commonly referred to as a classifier ensemble, has previously demonstrated the ability to improve classification accuracy in many application domains. As a result this area has attracted a significant amount of research in recent years. The aim of this paper has therefore been to provide a state-of-the-art review of the most well-known ensemble techniques, with the main focus on bagging, boosting and stacking, and to trace the recent attempts which have been made to improve their performance. Within this paper, we present and compare an updated view …