To see other types of publications on this topic, follow this link: Cost-sensitive classification.

Journal articles on the topic "Cost-sensitive classification"


Consult the top 50 journal articles for your research on the topic "Cost-sensitive classification."


1

Wang, Jialei, Peilin Zhao, and Steven C. H. Hoi. "Cost-Sensitive Online Classification." IEEE Transactions on Knowledge and Data Engineering 26, no. 10 (October 2014): 2425–38. http://dx.doi.org/10.1109/tkde.2013.157.

2

Zhang, Shichao. "Cost-sensitive KNN classification." Neurocomputing 391 (May 2020): 234–42. http://dx.doi.org/10.1016/j.neucom.2018.11.101.

3

Zhao, Peilin, Yifan Zhang, Min Wu, Steven C. H. Hoi, Mingkui Tan, and Junzhou Huang. "Adaptive Cost-Sensitive Online Classification." IEEE Transactions on Knowledge and Data Engineering 31, no. 2 (February 1, 2019): 214–28. http://dx.doi.org/10.1109/tkde.2018.2826011.

4

Cebe, Mumin, and Cigdem Gunduz-Demir. "Qualitative test-cost sensitive classification." Pattern Recognition Letters 31, no. 13 (October 2010): 2043–51. http://dx.doi.org/10.1016/j.patrec.2010.05.028.

5

Zhang, Shichao. "Cost-sensitive classification with respect to waiting cost." Knowledge-Based Systems 23, no. 5 (July 2010): 369–78. http://dx.doi.org/10.1016/j.knosys.2010.01.008.

6

Pendharkar, Parag C. "Linear models for cost-sensitive classification." Expert Systems 32, no. 5 (June 5, 2015): 622–36. http://dx.doi.org/10.1111/exsy.12114.

7

Ji, Shihao, and Lawrence Carin. "Cost-sensitive feature acquisition and classification." Pattern Recognition 40, no. 5 (May 2007): 1474–85. http://dx.doi.org/10.1016/j.patcog.2006.11.008.

8

Yang, Yi, Yuxuan Guo, and Xiangyu Chang. "Angle-based cost-sensitive multicategory classification." Computational Statistics & Data Analysis 156 (April 2021): 107107. http://dx.doi.org/10.1016/j.csda.2020.107107.

9

Tapkan, Pınar, Lale Özbakır, Sinem Kulluk, and Adil Baykasoğlu. "A cost-sensitive classification algorithm: BEE-Miner." Knowledge-Based Systems 95 (March 2016): 99–113. http://dx.doi.org/10.1016/j.knosys.2015.12.010.

10

Wang, Tao, Zhenxing Qin, Shichao Zhang, and Chengqi Zhang. "Cost-sensitive classification with inadequate labeled data." Information Systems 37, no. 5 (July 2012): 508–16. http://dx.doi.org/10.1016/j.is.2011.10.009.

11

Shi, Yinghuan, Yang Gao, Ruili Wang, Ying Zhang, and Dong Wang. "Transductive cost-sensitive lung cancer image classification." Applied Intelligence 38, no. 1 (May 17, 2012): 16–28. http://dx.doi.org/10.1007/s10489-012-0354-z.

12

Zheng, Weijie, and Hong Zhao. "Cost-sensitive hierarchical classification for imbalance classes." Applied Intelligence 50, no. 8 (March 4, 2020): 2328–38. http://dx.doi.org/10.1007/s10489-019-01624-z.

13

Desai, Ankit, and Sanjay Chaudhary. "Distributed AdaBoost Extensions for Cost-sensitive Classification Problems." International Journal of Computer Applications 177, no. 12 (October 17, 2019): 1–8. http://dx.doi.org/10.5120/ijca2019919531.

14

Wang, Zhe, Xu Chu, Dongdong Li, Hai Yang, and Weichao Qu. "Cost-sensitive matrixized classification learning with information entropy." Applied Soft Computing 116 (February 2022): 108266. http://dx.doi.org/10.1016/j.asoc.2021.108266.

15

Miao, Yilin, Zhewei Liu, Xiangning Wu, and Jie Gao. "Cost-Sensitive Siamese Network for PCB Defect Classification." Computational Intelligence and Neuroscience 2021 (October 12, 2021): 1–13. http://dx.doi.org/10.1155/2021/7550670.

Abstract:
After the production of printed circuit boards (PCB), PCB manufacturers need to remove defective boards through rigorous testing, and manual inspection is time-consuming and laborious. Many PCB factories employ automatic optical inspection (AOI), but this pixel-based comparison method has a high false alarm rate, requiring intensive human inspection to determine whether the alarms it raises correspond to true or pseudo defects. In this paper, we propose a new cost-sensitive deep learning model, the cost-sensitive siamese network (CSS-Net), based on siamese networks, transfer learning, and threshold-moving methods, to distinguish between true and pseudo PCB defects as a cost-sensitive classification problem. We use optimization algorithms such as NSGA-II to determine the optimal cost-sensitive threshold. Results show that our model improves true-defect prediction accuracy to 97.60% while maintaining relatively high pseudo-defect prediction accuracy (61.24%) in a real-production scenario. Furthermore, our model also outperforms state-of-the-art competitor models on other comprehensive cost-sensitive metrics, with an average of 33.32% shorter training time.
16

Shin, Chang-Uk, Jinyoung Oh, and Jeong-Won Cha. "Dynamic Cost Sensitive Learning for Imbalanced Text Classification." KIISE Transactions on Computing Practices 26, no. 4 (April 30, 2020): 211–16. http://dx.doi.org/10.5626/ktcp.2020.26.4.211.

17

Pan, Shirui, Jia Wu, and Xingquan Zhu. "CogBoost: Boosting for Fast Cost-Sensitive Graph Classification." IEEE Transactions on Knowledge and Data Engineering 27, no. 11 (November 1, 2015): 2933–46. http://dx.doi.org/10.1109/tkde.2015.2391115.

18

Zhang, Minghui, Haiwei Pan, Niu Zhang, Xiaoqin Xie, Zhiqiang Zhang, and Xiaoning Feng. "Cost-sensitive ensemble classification algorithm for medical image." International Journal of Computational Science and Engineering 16, no. 3 (2018): 282. http://dx.doi.org/10.1504/ijcse.2018.091763.

19

Zhang, Zhiqiang, Xiaoning Feng, Xiaoqin Xie, Minghui Zhang, Haiwei Pan, and Niu Zhang. "Cost-sensitive ensemble classification algorithm for medical image." International Journal of Computational Science and Engineering 16, no. 3 (2018): 282. http://dx.doi.org/10.1504/ijcse.2018.10012835.

20

Scott, Clayton, and Mark Davenport. "Regression Level Set Estimation Via Cost-Sensitive Classification." IEEE Transactions on Signal Processing 55, no. 6 (June 2007): 2752–57. http://dx.doi.org/10.1109/tsp.2007.893758.

21

Sun, Yanmin, Mohamed S. Kamel, Andrew K. C. Wong, and Yang Wang. "Cost-sensitive boosting for classification of imbalanced data." Pattern Recognition 40, no. 12 (December 2007): 3358–78. http://dx.doi.org/10.1016/j.patcog.2007.04.009.

22

Huang, Kuan-Hao, and Hsuan-Tien Lin. "Cost-sensitive label embedding for multi-label classification." Machine Learning 106, no. 9-10 (August 2, 2017): 1725–46. http://dx.doi.org/10.1007/s10994-017-5659-z.

23

Zidelmal, Z., A. Amirou, D. Ould-Abdeslam, and J. Merckle. "ECG beat classification using a cost sensitive classifier." Computer Methods and Programs in Biomedicine 111, no. 3 (September 2013): 570–77. http://dx.doi.org/10.1016/j.cmpb.2013.05.011.

24

Liu, Fen, and Quan Qian. "Cost-Sensitive Variational Autoencoding Classifier for Imbalanced Data Classification." Algorithms 15, no. 5 (April 21, 2022): 139. http://dx.doi.org/10.3390/a15050139.

Abstract:
Classification is among the core tasks in machine learning. Existing classification algorithms are typically based on the assumption of at least roughly balanced data classes. When performing tasks involving imbalanced data, such classifiers ignore the minority data in consideration of the overall accuracy. The performance of traditional classification algorithms based on the assumption of balanced data distribution is insufficient because the minority-class samples are often more important than others, such as positive samples, in disease diagnosis. In this study, we propose a cost-sensitive variational autoencoding classifier that combines data-level and algorithm-level methods to solve the problem of imbalanced data classification. Cost-sensitive factors are introduced to assign a high cost to the misclassification of minority data, which biases the classifier toward minority data. We also designed misclassification costs closely related to tasks by embedding domain knowledge. Experimental results show that the proposed method performed the classification of bulk amorphous materials well.
25

Teisseyre, Paweł, Damien Zufferey, and Marta Słomka. "Cost-sensitive classifier chains: Selecting low-cost features in multi-label classification." Pattern Recognition 86 (February 2019): 290–319. http://dx.doi.org/10.1016/j.patcog.2018.09.012.

26

Huang, Feng, Yun Liang, Li Huang, Ji Ming Yao, and Wen Feng Tian. "Image Classifying Based on Cost-Sensitive Layered Cascade Learning." Applied Mechanics and Materials 701-702 (December 2014): 453–58. http://dx.doi.org/10.4028/www.scientific.net/amm.701-702.453.

Abstract:
Image classification is an important means of image processing. Traditional research on image classification is usually based on the following assumptions: the goal is overall classification accuracy, samples of different categories have the same importance in the data set, and all misclassifications incur the same cost. Unfortunately, class imbalance and cost sensitivity are ubiquitous in real-world classification: the sample size of a specific category may be much larger than that of others, and misclassification costs differ sharply between categories. High-dimensional feature vectors caused by the diverse content of images, and the large gap in difficulty between distinguishing different categories of images, are common problems in image classification; a single machine learning algorithm is therefore not sufficient for complex image classification with these characteristics. To address these problems, a layered cascade image classification method based on cost sensitivity and class imbalance is proposed: a set of cascaded learners is built, the inner patterns of images of a specific category are learned at different stages, and a cost function is introduced, so the method can effectively handle the cost-sensitive and class-imbalanced aspects of image classification. Moreover, the structure of the method is flexible, as the number of cascade layers and the algorithm at each stage can be adjusted to the business requirements of the image classification task. Results from an application to sensitive-image classification for a smart grid indicate that image classification based on cost-sensitive layered cascade learning obtains better classification performance than existing methods.
27

Qiang Yang, C. Ling, X. Chai, and Rong Pan. "Test-cost sensitive classification on data with missing values." IEEE Transactions on Knowledge and Data Engineering 18, no. 5 (May 2006): 626–38. http://dx.doi.org/10.1109/tkde.2006.84.

28

Duan, Weiwei, and Cheng Ding. "Non-linear Cost-sensitive Decision Tree for Multi-classification." International Journal of Software Engineering and Its Applications 10, no. 2 (February 28, 2016): 217–28. http://dx.doi.org/10.14257/ijseia.2016.10.2.18.

29

Bernard, Simon, Clément Chatelain, Sébastien Adam, and Robert Sabourin. "The Multiclass ROC Front method for cost-sensitive classification." Pattern Recognition 52 (April 2016): 46–60. http://dx.doi.org/10.1016/j.patcog.2015.10.010.

30

Wang, Junhui. "Boosting the Generalized Margin in Cost-Sensitive Multiclass Classification." Journal of Computational and Graphical Statistics 22, no. 1 (December 27, 2011): 178–92. http://dx.doi.org/10.1080/10618600.2011.643151.

31

Krawczyk, Bartosz, Michał Woźniak, and Gerald Schaefer. "Cost-sensitive decision tree ensembles for effective imbalanced classification." Applied Soft Computing 14 (January 2014): 554–62. http://dx.doi.org/10.1016/j.asoc.2013.08.014.

32

Zhang, Chong, Kay Chen Tan, Haizhou Li, and Geok Soon Hong. "A Cost-Sensitive Deep Belief Network for Imbalanced Classification." IEEE Transactions on Neural Networks and Learning Systems 30, no. 1 (January 2019): 109–22. http://dx.doi.org/10.1109/tnnls.2018.2832648.

33

Zhou, Jingjing, Weifeng Sun, Xiaomin Han, Ruqiang Lu, Yuanqi Zhang, and Shenwei Zhang. "The Research of One Novel Cost-Sensitive Classification Algorithm." Chinese Journal of Electronics 27, no. 5 (September 1, 2018): 1015–24. http://dx.doi.org/10.1049/cje.2018.01.002.

34

Huang, Yuwen. "Dynamic Cost-sensitive Naive Bayes Classification for Uncertain Data." International Journal of Database Theory and Application 8, no. 1 (February 28, 2015): 271–80. http://dx.doi.org/10.14257/ijdta.2015.8.1.26.

35

Loyola-Gonzalez, Octavio, Jose F. C. O. Martinez-Trinidad, Jesus Ariel Carrasco-Ochoa, and Milton Garcia-Borroto. "Cost-Sensitive Pattern-Based classification for Class Imbalance problems." IEEE Access 7 (2019): 60411–27. http://dx.doi.org/10.1109/access.2019.2913982.

36

Zhao, Huimin. "Instance weighting versus threshold adjusting for cost-sensitive classification." Knowledge and Information Systems 15, no. 3 (April 3, 2007): 321–34. http://dx.doi.org/10.1007/s10115-007-0079-1.

37

Frumosu, Flavia Dalia, Abdul Rauf Khan, Henrik Schiøler, Murat Kulahci, Mohamed Zaki, and Peter Westermann-Rasmussen. "Cost-sensitive learning classification strategy for predicting product failures." Expert Systems with Applications 161 (December 2020): 113653. http://dx.doi.org/10.1016/j.eswa.2020.113653.

38

Levering, Ryan, and Michal Cutler. "Cost-Sensitive Feature Extraction and Selection in Genre Classification." Journal for Language Technology and Computational Linguistics 24, no. 1 (July 1, 2009): 57–72. http://dx.doi.org/10.21248/jlcl.24.2009.113.

39

Nakashima, Tomoharu, Yasuyuki Yokota, Hisao Ishibuchi, Gerald Schaefer, Aleš Drastich, and Michal Závišek. "Constructing Cost-Sensitive Fuzzy-Rule-Based Systems for Pattern Classification Problems." Journal of Advanced Computational Intelligence and Intelligent Informatics 11, no. 6 (July 20, 2007): 546–53. http://dx.doi.org/10.20965/jaciii.2007.p0546.

Abstract:
We evaluate the performance of cost-sensitive fuzzy-rule-based systems for pattern classification problems. We assume that a misclassification cost is given a priori for each training pattern. The task of classification thus becomes to minimize both classification error and misclassification cost. We examine the performance of two types of fuzzy classification based on fuzzy if-then rules generated from training patterns. The difference is whether or not they consider misclassification costs in rule generation. In our computational experiments, we use several specifications of misclassification cost to evaluate the performance of the two classifiers. Experimental results show that both classification error and misclassification cost are reduced by considering the misclassification cost in fuzzy rule generation.
40

Wang, Zhe, Mingzhe Lu, Zengxin Niu, Xiangyang Xue, and Daqi Gao. "Cost-Sensitive Multi-View Learning Machine." International Journal of Pattern Recognition and Artificial Intelligence 28, no. 03 (May 2014): 1451004. http://dx.doi.org/10.1142/s0218001414510045.

Abstract:
Multi-view learning aims to effectively learn from data represented by multiple independent sets of attributes, where each set is taken as one view of the original data. In real-world application, each view should be acquired in unequal cost. Taking web-page classification for example, it is cheaper to get the words on itself (view one) than to get the words contained in anchor texts of inbound hyper-links (view two). However, almost all the existing multi-view learning does not consider the cost of acquiring the views or the cost of evaluating them. In this paper, we support that different views should adopt different representations and lead to different acquisition cost. Thus we develop a new view-dependent cost different from the existing both class-dependent cost and example-dependent cost. To this end, we generalize the framework of multi-view learning with the cost-sensitive technique and further propose a Cost-sensitive Multi-View Learning Machine named CMVLM for short. In implementation, we take into account and measure both the acquisition cost and the discriminant scatter of each view. Then through eliminating the useless views with a predefined threshold, we use the reserved views to train the final classifier. The experimental results on a broad range of data sets including the benchmark UCI, image, and bioinformatics data sets validate that the proposed algorithm can effectively reduce the total cost and have a competitive even better classification performance. The contributions of this paper are that: (1) first proposing a view-dependent cost; (2) establishing a cost-sensitive multi-view learning framework; (3) developing a wrapper technique that is universal to most multiple kernel based classifier.
41

Kim, Jungeun, Keunho Choi, Gunwoo Kim, and Yongmoo Suh. "Classification cost: An empirical comparison among traditional classifier, Cost-Sensitive Classifier, and MetaCost." Expert Systems with Applications 39, no. 4 (March 2012): 4013–19. http://dx.doi.org/10.1016/j.eswa.2011.09.071.

42

Sharifnia, Ensieh, and Reza Boostani. "Instance-Based Cost-Sensitive Boosting." International Journal of Pattern Recognition and Artificial Intelligence 34, no. 03 (July 22, 2019): 2050002. http://dx.doi.org/10.1142/s0218001420500020.

Abstract:
Many classification algorithms aim to minimize just their training error count; however, it is often desirable to minimize a more general cost metric, where distinct instances have different costs. In this paper, an instance-based cost-sensitive Bayesian consistent version of exponential loss function is proposed. Using the modified loss function, the derivation of instance-based cost-sensitive extensions of AdaBoost, RealBoost and GentleBoost are developed which are termed as ICSAdaBoost, ICSRealBoost and ICSGentleBoost, respectively. In this research, a new instance-based cost generation method is proposed instead of doing this expensive process by experts. Thus, each sample takes two cost values; a class cost and a sample cost. The first cost is equally assigned to all samples of each class while the second cost is generated according to the probability of each sample within its class probability density function. Experimental results of the proposed schemes imply 12% enhancement in terms of [Formula: see text]-measure and 13% on cost-per-sample over a variety of UCI datasets, compared to the state-of-the-art methods. The significant priority of the proposed method is supported by applying the pair of [Formula: see text]-tests to the results.
43

Xue, Aijun, and Xiaodan Wang. "Cost-sensitive design of error correcting output codes." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 232, no. 10 (May 12, 2017): 1871–81. http://dx.doi.org/10.1177/0954406217709303.

Abstract:
Many real world applications involve multiclass cost-sensitive learning problems. However, some well-worked binary cost-sensitive learning algorithms cannot be extended into multiclass cost-sensitive learning directly. It is meaningful to decompose the complex multiclass cost-sensitive classification problem into a series of binary cost-sensitive classification problems. So, in this paper we propose an alternative and efficient decomposition framework, using the original error correcting output codes. The main problem in our framework is how to evaluate the binary costs for each binary cost-sensitive base classifier. To solve this problem, we proposed to compute the expected misclassification costs starting from the given multiclass cost matrix. Furthermore, the general formulations to compute the binary costs are given. Experimental results on several synthetic and UCI datasets show that our method can obtain comparable performance in comparison with the state-of-the-art methods.
44

Bei, Honghan, Yajie Wang, Zhaonuo Ren, Shuo Jiang, Keran Li, and Wenyang Wang. "A Statistical Approach to Cost-Sensitive AdaBoost for Imbalanced Data Classification." Mathematical Problems in Engineering 2021 (October 23, 2021): 1–20. http://dx.doi.org/10.1155/2021/3165589.

Abstract:
To address the imbalanced data problem in classification, the studies of the combination of AdaBoost, short for “Adaptive Boosting,” and cost-sensitive learning have shown convincing results in the literature. The cost-sensitive AdaBoost algorithms are practical since the “boosting” property in AdaBoost can iteratively enhance the small class of the cost-sensitive learning to solve the imbalanced data issue. However, the most available cost-sensitive AdaBoost algorithms are heuristic approaches, which are improved from the standard AdaBoost algorithm by cost-sensitively adjusting the voting weight parameters of weak classifiers or the sample updating weight parameters without strict theoretic proof. The algorithms are appended the cost-sensitive factors to focus on the high-cost and small-class samples, but they have no procedures to show the best place to add the cost factors and the cost factor value set. To complete the cost-sensitive AdaBoost algorithms’ framework, the present article has two main contributions. First, we summarize the popular cost-sensitive boosting algorithms in the literature and propose a generally comprehensive form. We name our specific one, the “AdaImC algorithm,” which is typically appliable to solve the imbalanced data classification problem with theoretic proof. Second, a statistical approach to prove the AdaImC algorithm is proposed to verify the inner relationship between the cost parameters. We show that our proposed algorithm in the machine learning field is identical to the Product of Experts (PoE) model in the statistics field. Besides, a way to determine the cost parameter value by the statistical analysis is introduced. Several numeric studies are listed finally to support our proposed algorithm.
45

Xiong, Yueling, Mingquan Ye, and Changrong Wu. "Cancer Classification with a Cost-Sensitive Naive Bayes Stacking Ensemble." Computational and Mathematical Methods in Medicine 2021 (April 26, 2021): 1–12. http://dx.doi.org/10.1155/2021/5556992.

Abstract:
Ensemble learning combines multiple learners to perform combinatorial learning, which has advantages of good flexibility and higher generalization performance. To achieve higher quality cancer classification, in this study, the fast correlation-based feature selection (FCBF) method was used to preprocess the data to eliminate irrelevant and redundant features. Then, the classification was carried out in the stacking ensemble learner. A library for support vector machines (LIBSVM), K-nearest neighbor (KNN), decision tree C4.5 (C4.5), and random forest (RF) were used as the primary learners of the stacking ensemble. Given the imbalanced characteristics of cancer gene expression data, the embedding cost-sensitive naive Bayes was used as the metalearner of the stacking ensemble, which was represented as CSNB stacking. The proposed CSNB stacking method was applied to nine cancer datasets to further verify the classification performance of the model. Compared with other classification methods, such as single classifier algorithms and ensemble algorithms, the experimental results showed the effectiveness and robustness of the proposed method in processing different types of cancer data. This method may therefore help guide cancer diagnosis and research.
46

Lin, Hsuan-Tien, and Ling Li. "Reduction from Cost-Sensitive Ordinal Ranking to Weighted Binary Classification." Neural Computation 24, no. 5 (May 2012): 1329–67. http://dx.doi.org/10.1162/neco_a_00265.

Abstract:
We present a reduction framework from ordinal ranking to binary classification. The framework consists of three steps: extracting extended examples from the original examples, learning a binary classifier on the extended examples with any binary classification algorithm, and constructing a ranker from the binary classifier. Based on the framework, we show that a weighted 0/1 loss of the binary classifier upper-bounds the mislabeling cost of the ranker, both error-wise and regret-wise. Our framework allows not only the design of good ordinal ranking algorithms based on well-tuned binary classification approaches, but also the derivation of new generalization bounds for ordinal ranking from known bounds for binary classification. In addition, our framework unifies many existing ordinal ranking algorithms, such as perceptron ranking and support vector ordinal regression. When compared empirically on benchmark data sets, some of our newly designed algorithms enjoy advantages in terms of both training speed and generalization performance over existing algorithms. In addition, the newly designed algorithms lead to better cost-sensitive ordinal ranking performance, as well as improved listwise ranking performance.
47

Thakkar, Hiren Kumar, Ankit Desai, Subrata Ghosh, Priyanka Singh, and Gajendra Sharma. "Clairvoyant: AdaBoost with Cost-Enabled Cost-Sensitive Classifier for Customer Churn Prediction." Computational Intelligence and Neuroscience 2022 (January 22, 2022): 1–11. http://dx.doi.org/10.1155/2022/9028580.

Abstract:
Customer churn prediction is one of the challenging problems and paramount concerns for telecommunication industries. With the increasing number of mobile operators, users can switch from one mobile operator to another if they are unsatisfied with the service. Marketing literature states that it costs 5–10 times more to acquire a new customer than retain an existing one. Hence, effective customer churn management has become a crucial demand for mobile communication operators. Researchers have proposed several classifiers and boosting methods to control customer churn rate, including deep learning (DL) algorithms. However, conventional classification algorithms follow an error-based framework that focuses on improving the classifier’s accuracy over cost sensitization. Typical classification algorithms treat misclassification errors equally, which is not applicable in practice. On the contrary, DL algorithms are computationally expensive as well as time-consuming. In this paper, a novel class-dependent cost-sensitive boosting algorithm called AdaBoostWithCost is proposed to reduce the churn cost. This study demonstrates the empirical evaluation of the proposed AdaBoostWithCost algorithm, which consistently outperforms the discrete AdaBoost algorithm concerning telecom churn prediction. The key focus of the AdaBoostWithCost classifier is to reduce false-negative error and the misclassification cost more significantly than the AdaBoost.
48

Malhotra, R., and J. Jain. "Predicting defects in object-oriented software using cost-sensitive classification." IOP Conference Series: Materials Science and Engineering 1022 (January 19, 2021): 012112. http://dx.doi.org/10.1088/1757-899x/1022/1/012112.

49

Geng, Yue, and Xinyu Luo. "Cost-sensitive convolutional neural networks for imbalanced time series classification." Intelligent Data Analysis 23, no. 2 (April 4, 2019): 357–70. http://dx.doi.org/10.3233/ida-183831.

50

Ruan, Yu-Xun, Hsuan-Tien Lin, and Ming-Feng Tsai. "Improving ranking performance with cost-sensitive ordinal classification via regression." Information Retrieval 17, no. 1 (February 8, 2013): 1–20. http://dx.doi.org/10.1007/s10791-013-9219-2.

