Journal articles on the topic 'Boosting and bagging'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic 'Boosting and bagging.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.
Machova, Kristina, Miroslav Puszta, Frantisek Barcak, and Peter Bednar. "A comparison of the bagging and the boosting methods using the decision trees classifiers." Computer Science and Information Systems 3, no. 2 (2006): 57–72. http://dx.doi.org/10.2298/csis0602057m.
Taser, Pelin Yildirim. "Application of Bagging and Boosting Approaches Using Decision Tree-Based Algorithms in Diabetes Risk Prediction." Proceedings 74, no. 1 (March 4, 2021): 6. http://dx.doi.org/10.3390/proceedings2021074006.
Li, Xiao Bo. "Contrast Research of Two Kinds of Integrated Sorting Algorithms." Advanced Materials Research 433-440 (January 2012): 4025–31. http://dx.doi.org/10.4028/www.scientific.net/amr.433-440.4025.
Ćwiklińska-Jurkowska, Małgorzata M. "Performance of Resampling Methods Based on Decision Trees, Parametric and Nonparametric Bayesian Classifiers for Three Medical Datasets." Studies in Logic, Grammar and Rhetoric 35, no. 1 (December 1, 2013): 71–86. http://dx.doi.org/10.2478/slgr-2013-0045.
Martínez-Muñoz, Gonzalo, and Alberto Suárez. "Using boosting to prune bagging ensembles." Pattern Recognition Letters 28, no. 1 (January 2007): 156–65. http://dx.doi.org/10.1016/j.patrec.2006.06.018.
Anctil, F., and N. Lauzon. "Generalisation for neural networks through data sampling and training procedures, with applications to streamflow predictions." Hydrology and Earth System Sciences 8, no. 5 (October 31, 2004): 940–58. http://dx.doi.org/10.5194/hess-8-940-2004.
Sadorsky, Perry. "Predicting Gold and Silver Price Direction Using Tree-Based Classifiers." Journal of Risk and Financial Management 14, no. 5 (April 29, 2021): 198. http://dx.doi.org/10.3390/jrfm14050198.
Akhand, M. A. H., Pintu Chandra Shill, and Kazuyuki Murase. "Hybrid Ensemble Construction with Selected Neural Networks." Journal of Advanced Computational Intelligence and Intelligent Informatics 15, no. 6 (August 20, 2011): 652–61. http://dx.doi.org/10.20965/jaciii.2011.p0652.
Arrahimi, Ahmad Rusadi, Muhammad Khairi Ihsan, Dwi Kartini, Mohammad Reza Faisal, and Fatma Indriani. "Teknik Bagging Dan Boosting Pada Algoritma CART Untuk Klasifikasi Masa Studi Mahasiswa" [Bagging and boosting techniques in the CART algorithm for classifying students' study duration]. Jurnal Sains dan Informatika 5, no. 1 (July 14, 2019): 21–30. http://dx.doi.org/10.34128/jsi.v5i1.171.
Islam, M. M., Xin Yao, S. M. Shahriar Nirjon, M. A. Islam, and K. Murase. "Bagging and Boosting Negatively Correlated Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 38, no. 3 (June 2008): 771–84. http://dx.doi.org/10.1109/tsmcb.2008.922055.
Xiao, Tong, Jingbo Zhu, and Tongran Liu. "Bagging and Boosting statistical machine translation systems." Artificial Intelligence 195 (February 2013): 496–527. http://dx.doi.org/10.1016/j.artint.2012.11.005.
Zhang, Chun-Xia, Jiang-She Zhang, and Gai-Ying Zhang. "Using Boosting to prune Double-Bagging ensembles." Computational Statistics & Data Analysis 53, no. 4 (February 2009): 1218–31. http://dx.doi.org/10.1016/j.csda.2008.10.040.
Borra, Simone, and Agostino Di Ciaccio. "Improving nonparametric regression methods by bagging and boosting." Computational Statistics & Data Analysis 38, no. 4 (February 2002): 407–20. http://dx.doi.org/10.1016/s0167-9473(01)00068-8.
Wang, Boyu, and Joelle Pineau. "Online Bagging and Boosting for Imbalanced Data Streams." IEEE Transactions on Knowledge and Data Engineering 28, no. 12 (December 1, 2016): 3353–66. http://dx.doi.org/10.1109/tkde.2016.2609424.
Lemmens, Aurélie, and Christophe Croux. "Bagging and Boosting Classification Trees to Predict Churn." Journal of Marketing Research 43, no. 2 (May 2006): 276–86. http://dx.doi.org/10.1509/jmkr.43.2.276.
Gweon, Hyukjun, Shu Li, and Rogemar Mamon. "An Effective Bias-Corrected Bagging Method for the Valuation of Large Variable Annuity Portfolios." ASTIN Bulletin 50, no. 3 (September 2020): 853–71. http://dx.doi.org/10.1017/asb.2020.28.
Opitz, D., and R. Maclin. "Popular Ensemble Methods: An Empirical Study." Journal of Artificial Intelligence Research 11 (August 1, 1999): 169–98. http://dx.doi.org/10.1613/jair.614.
Ghasemian, N., and M. Akhoondzadeh. "Fusion of Non-Thermal and Thermal Satellite Images by Boosted SVM Classifiers for Cloud Detection." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4/W4 (September 26, 2017): 83–89. http://dx.doi.org/10.5194/isprs-archives-xlii-4-w4-83-2017.
Subasi, Abdulhamit, Asalah Fllatah, Kholoud Alzobidi, Tayeb Brahimi, and Akila Sarirete. "Smartphone-Based Human Activity Recognition Using Bagging and Boosting." Procedia Computer Science 163 (2019): 54–61. http://dx.doi.org/10.1016/j.procs.2019.12.086.
Ridgeway, Greg. "Looking for lumps: boosting and bagging for density estimation." Computational Statistics & Data Analysis 38, no. 4 (February 2002): 379–92. http://dx.doi.org/10.1016/s0167-9473(01)00066-4.
Ditzler, Gregory, Joseph LaBarck, James Ritchie, Gail Rosen, and Robi Polikar. "Extensions to Online Feature Selection Using Bagging and Boosting." IEEE Transactions on Neural Networks and Learning Systems 29, no. 9 (September 2018): 4504–9. http://dx.doi.org/10.1109/tnnls.2017.2746107.
Kotsiantis, Sotiris. "Combining bagging, boosting, rotation forest and random subspace methods." Artificial Intelligence Review 35, no. 3 (December 21, 2010): 223–40. http://dx.doi.org/10.1007/s10462-010-9192-8.
Gupta, Surbhi, and Munish Kumar. "Forensic document examination system using boosting and bagging methodologies." Soft Computing 24, no. 7 (August 14, 2019): 5409–26. http://dx.doi.org/10.1007/s00500-019-04297-5.
Tuysuzoglu, Goksu, and Derya Birant. "Enhanced Bagging (eBagging): A Novel Approach for Ensemble Learning." International Arab Journal of Information Technology 17, no. 4 (July 1, 2020): 515–28. http://dx.doi.org/10.34028/iajit/17/4/10.
Yaman, Emine, and Abdulhamit Subasi. "Comparison of Bagging and Boosting Ensemble Machine Learning Methods for Automated EMG Signal Classification." BioMed Research International 2019 (October 31, 2019): 1–13. http://dx.doi.org/10.1155/2019/9152506.
Kotsiantis, Sotiris B. "Bagging and boosting variants for handling classifications problems: a survey." Knowledge Engineering Review 29, no. 1 (August 23, 2013): 78–100. http://dx.doi.org/10.1017/s0269888913000313.
Bryndin, Evgeniy, and Irina Bryndina. "Communicative-Associative Transition to Smart Artificial Intelligence by Criteria with Help of Ensembles of Diversified Agents." Budapest International Research in Exact Sciences (BirEx) Journal 2, no. 4 (October 9, 2020): 418–35. http://dx.doi.org/10.33258/birex.v2i4.1256.
Yılmaz Isıkhan, Selen, Erdem Karabulut, and Celal Reha Alpar. "Determining Cutoff Point of Ensemble Trees Based on Sample Size in Predicting Clinical Dose with DNA Microarray Data." Computational and Mathematical Methods in Medicine 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/6794916.
Ćwiklińska-Jurkowska, Małgorzata. "Boosting, Bagging and Fixed Fusion Methods Performance for Aiding Diagnosis." Biocybernetics and Biomedical Engineering 32, no. 2 (2012): 17–31. http://dx.doi.org/10.1016/s0208-5216(12)70034-7.
Sawarn, Aman, Ankit, and Monika Gupta. "Comparative Analysis of Bagging and Boosting Algorithms for Sentiment Analysis." Procedia Computer Science 173 (2020): 210–15. http://dx.doi.org/10.1016/j.procs.2020.06.025.
Khoshgoftaar, Taghi M., Jason Van Hulse, and Amri Napolitano. "Comparing Boosting and Bagging Techniques With Noisy and Imbalanced Data." IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans 41, no. 3 (May 2011): 552–68. http://dx.doi.org/10.1109/tsmca.2010.2084081.
Skurichina, Marina, and Robert P. W. Duin. "Bagging, Boosting and the Random Subspace Method for Linear Classifiers." Pattern Analysis & Applications 5, no. 2 (June 7, 2002): 121–35. http://dx.doi.org/10.1007/s100440200011.
Nguyen, Kieu Anh, Walter Chen, Bor-Shiun Lin, and Uma Seeboonruang. "Comparison of Ensemble Machine Learning Methods for Soil Erosion Pin Measurements." ISPRS International Journal of Geo-Information 10, no. 1 (January 19, 2021): 42. http://dx.doi.org/10.3390/ijgi10010042.
Hou, Shuai, Fuan Hua, Wu Lv, Zhaodong Wang, Yujia Liu, and Guodong Wang. "Hybrid Modeling of Flotation Height in Air Flotation Oven Based on Selective Bagging Ensemble Method." Mathematical Problems in Engineering 2013 (2013): 1–9. http://dx.doi.org/10.1155/2013/281523.
Schwenk, Holger, and Yoshua Bengio. "Boosting Neural Networks." Neural Computation 12, no. 8 (August 1, 2000): 1869–87. http://dx.doi.org/10.1162/089976600300015178.
Javidi, Mohammad Masoud. "Learning from Imbalanced Multi-label Data Sets by Using Ensemble Strategies." Computer Engineering and Applications Journal 4, no. 1 (February 18, 2015): 61–81. http://dx.doi.org/10.18495/comengapp.v4i1.109.
Ramaswamyreddy, A., Shiva Prasad S., K. V. Rangarao, and A. Saranya. "Efficient datamining model for prediction of chronic kidney disease using wrapper methods." International Journal of Informatics and Communication Technology (IJ-ICT) 8, no. 2 (April 20, 2019): 63. http://dx.doi.org/10.11591/ijict.v8i2.pp63-70.
Yun, Yeboon, and Hirotaka Nakayama. "1103 Effective Learning in Support Vector Machines using Bagging and Boosting." Proceedings of the Optimization Symposium 2014.11 (2014): 1103-1–1103-5. http://dx.doi.org/10.1299/jsmeopt.2014.11.0__1103-1_.
Abubacker, Nirase Fathima, Ibrahim Abaker Targio Hashem, and Lim Kun Hui. "Mammographic Classification Using Stacked Ensemble Learning with Bagging and Boosting Techniques." Journal of Medical and Biological Engineering 40, no. 6 (October 8, 2020): 908–16. http://dx.doi.org/10.1007/s40846-020-00567-y.
Baumgartner, Dustin, and Gursel Serpen. "Performance of global–local hybrid ensemble versus boosting and bagging ensembles." International Journal of Machine Learning and Cybernetics 4, no. 4 (April 25, 2012): 301–17. http://dx.doi.org/10.1007/s13042-012-0094-8.
Nai-Arun, Nongyao, and Punnee Sittidech. "Ensemble Learning Model for Diabetes Classification." Advanced Materials Research 931-932 (May 2014): 1427–31. http://dx.doi.org/10.4028/www.scientific.net/amr.931-932.1427.
Goudman, Lisa, Jean-Pierre Van Buyten, Ann De Smedt, Iris Smet, Marieke Devos, Ali Jerjir, and Maarten Moens. "Predicting the Response of High Frequency Spinal Cord Stimulation in Patients with Failed Back Surgery Syndrome: A Retrospective Study with Machine Learning Techniques." Journal of Clinical Medicine 9, no. 12 (December 21, 2020): 4131. http://dx.doi.org/10.3390/jcm9124131.
Pal, Raj Kumar, Jugal Chaturvedi, V. Sai Teja, and Leena Shibu. "Applying Ensemble Approach on U.S. Census Data Classification." International Journal of Computer Science and Mobile Computing 10, no. 9 (September 30, 2021): 1–11. http://dx.doi.org/10.47760/ijcsmc.2021.v10i09.001.
Ramraj, S., S. Saranya, and K. Yashwant. "Comparative study of bagging, boosting and convolutional neural network for text classification." Indian Journal of Public Health Research & Development 9, no. 9 (2018): 1041. http://dx.doi.org/10.5958/0976-5506.2018.01138.5.
Ng, Wing W. Y., Xiancheng Zhou, Xing Tian, Xizhao Wang, and Daniel S. Yeung. "Bagging–boosting-based semi-supervised multi-hashing with query-adaptive re-ranking." Neurocomputing 275 (January 2018): 916–23. http://dx.doi.org/10.1016/j.neucom.2017.09.042.
Kuncheva, L. I., M. Skurichina, and R. P. W. Duin. "An experimental study on diversity for bagging and boosting with linear classifiers." Information Fusion 3, no. 4 (December 2002): 245–58. http://dx.doi.org/10.1016/s1566-2535(02)00093-3.
Shrivastava, Santosh, P. Mary Jeyanthi, Sarbjit Singh, and David McMillan. "Failure prediction of Indian Banks using SMOTE, Lasso regression, bagging and boosting." Cogent Economics & Finance 8, no. 1 (January 1, 2020): 1729569. http://dx.doi.org/10.1080/23322039.2020.1729569.
Wang, Guan-Wei, Chun-Xia Zhang, and Gao Guo. "Investigating the Effect of Randomly Selected Feature Subsets on Bagging and Boosting." Communications in Statistics - Simulation and Computation 44, no. 3 (August 25, 2014): 636–46. http://dx.doi.org/10.1080/03610918.2013.788705.
Shrestha, D. L., and D. P. Solomatine. "Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression." Neural Computation 18, no. 7 (July 2006): 1678–710. http://dx.doi.org/10.1162/neco.2006.18.7.1678.
Jurek, Anna, Yaxin Bi, Shengli Wu, and Chris Nugent. "A survey of commonly used ensemble-based classification techniques." Knowledge Engineering Review 29, no. 5 (May 3, 2013): 551–81. http://dx.doi.org/10.1017/s0269888913000155.