Journal articles on the topic 'Cost-sensitive classification'

Consult the top 50 journal articles for your research on the topic 'Cost-sensitive classification.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1. Wang, Jialei, Peilin Zhao, and Steven C. H. Hoi. "Cost-Sensitive Online Classification." IEEE Transactions on Knowledge and Data Engineering 26, no. 10 (October 2014): 2425–38. http://dx.doi.org/10.1109/tkde.2013.157.

2. Zhang, Shichao. "Cost-sensitive KNN classification." Neurocomputing 391 (May 2020): 234–42. http://dx.doi.org/10.1016/j.neucom.2018.11.101.

3. Zhao, Peilin, Yifan Zhang, Min Wu, Steven C. H. Hoi, Mingkui Tan, and Junzhou Huang. "Adaptive Cost-Sensitive Online Classification." IEEE Transactions on Knowledge and Data Engineering 31, no. 2 (February 1, 2019): 214–28. http://dx.doi.org/10.1109/tkde.2018.2826011.

4. Cebe, Mumin, and Cigdem Gunduz-Demir. "Qualitative test-cost sensitive classification." Pattern Recognition Letters 31, no. 13 (October 2010): 2043–51. http://dx.doi.org/10.1016/j.patrec.2010.05.028.

5. Zhang, Shichao. "Cost-sensitive classification with respect to waiting cost." Knowledge-Based Systems 23, no. 5 (July 2010): 369–78. http://dx.doi.org/10.1016/j.knosys.2010.01.008.

6. Pendharkar, Parag C. "Linear models for cost-sensitive classification." Expert Systems 32, no. 5 (June 5, 2015): 622–36. http://dx.doi.org/10.1111/exsy.12114.

7. Ji, Shihao, and Lawrence Carin. "Cost-sensitive feature acquisition and classification." Pattern Recognition 40, no. 5 (May 2007): 1474–85. http://dx.doi.org/10.1016/j.patcog.2006.11.008.

8. Yang, Yi, Yuxuan Guo, and Xiangyu Chang. "Angle-based cost-sensitive multicategory classification." Computational Statistics & Data Analysis 156 (April 2021): 107107. http://dx.doi.org/10.1016/j.csda.2020.107107.

9. Tapkan, Pınar, Lale Özbakır, Sinem Kulluk, and Adil Baykasoğlu. "A cost-sensitive classification algorithm: BEE-Miner." Knowledge-Based Systems 95 (March 2016): 99–113. http://dx.doi.org/10.1016/j.knosys.2015.12.010.

10. Wang, Tao, Zhenxing Qin, Shichao Zhang, and Chengqi Zhang. "Cost-sensitive classification with inadequate labeled data." Information Systems 37, no. 5 (July 2012): 508–16. http://dx.doi.org/10.1016/j.is.2011.10.009.

11. Shi, Yinghuan, Yang Gao, Ruili Wang, Ying Zhang, and Dong Wang. "Transductive cost-sensitive lung cancer image classification." Applied Intelligence 38, no. 1 (May 17, 2012): 16–28. http://dx.doi.org/10.1007/s10489-012-0354-z.

12. Zheng, Weijie, and Hong Zhao. "Cost-sensitive hierarchical classification for imbalance classes." Applied Intelligence 50, no. 8 (March 4, 2020): 2328–38. http://dx.doi.org/10.1007/s10489-019-01624-z.

13. Desai, Ankit, and Sanjay Chaudhary. "Distributed AdaBoost Extensions for Cost-sensitive Classification Problems." International Journal of Computer Applications 177, no. 12 (October 17, 2019): 1–8. http://dx.doi.org/10.5120/ijca2019919531.

14. Wang, Zhe, Xu Chu, Dongdong Li, Hai Yang, and Weichao Qu. "Cost-sensitive matrixized classification learning with information entropy." Applied Soft Computing 116 (February 2022): 108266. http://dx.doi.org/10.1016/j.asoc.2021.108266.

15. Miao, Yilin, Zhewei Liu, Xiangning Wu, and Jie Gao. "Cost-Sensitive Siamese Network for PCB Defect Classification." Computational Intelligence and Neuroscience 2021 (October 12, 2021): 1–13. http://dx.doi.org/10.1155/2021/7550670.
Abstract:
After printed circuit boards (PCBs) are produced, manufacturers must remove defective boards through rigorous testing, and manual inspection is time-consuming and laborious. Many PCB factories employ automatic optical inspection (AOI), but this pixel-based comparison method has a high false alarm rate, requiring intensive human inspection to determine whether the alarms it raises correspond to true or pseudo defects. In this paper, we propose a new cost-sensitive deep learning model, the cost-sensitive Siamese network (CSS-Net), which combines a Siamese network, transfer learning, and threshold moving to distinguish between true and pseudo PCB defects as a cost-sensitive classification problem. We use optimization algorithms such as NSGA-II to determine the optimal cost-sensitive threshold. Results show that our model improves true-defect prediction accuracy to 97.60% while maintaining relatively high pseudo-defect prediction accuracy, 61.24%, in a real production scenario. Furthermore, our model outperforms state-of-the-art competitor models on other comprehensive cost-sensitive metrics, with an average of 33.32% shorter training time.
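
To make the threshold-moving idea in this abstract concrete, here is a minimal sketch of cost-sensitive threshold selection: it scans a grid of decision thresholds on validation scores and keeps the one minimizing total misclassification cost. The paper tunes its threshold with NSGA-II; the grid scan, function name, and cost values below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def min_cost_threshold(scores, labels, c_fn=10.0, c_fp=1.0):
    """Return the decision threshold minimizing total cost on a
    validation set; c_fn/c_fp are assumed per-error costs."""
    best_t, best_cost = 0.5, float("inf")
    for t in np.linspace(0.0, 1.0, 101):
        preds = (scores >= t).astype(int)
        cost = c_fn * np.sum((preds == 0) & (labels == 1)) \
             + c_fp * np.sum((preds == 1) & (labels == 0))
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# toy validation data: positives tend to score higher
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 500)
scores = np.clip(0.35 + 0.3 * labels + rng.normal(0, 0.15, 500), 0, 1)
print(min_cost_threshold(scores, labels))  # drops below 0.5 when c_fn >> c_fp
```

When false negatives are much costlier than false positives, the selected threshold shifts downward, trading false alarms for fewer misses.
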
16. Shin, Chang-Uk, Jinyoung Oh, and Jeong-Won Cha. "Dynamic Cost Sensitive Learning for Imbalanced Text Classification." KIISE Transactions on Computing Practices 26, no. 4 (April 30, 2020): 211–16. http://dx.doi.org/10.5626/ktcp.2020.26.4.211.

17. Pan, Shirui, Jia Wu, and Xingquan Zhu. "CogBoost: Boosting for Fast Cost-Sensitive Graph Classification." IEEE Transactions on Knowledge and Data Engineering 27, no. 11 (November 1, 2015): 2933–46. http://dx.doi.org/10.1109/tkde.2015.2391115.

18. Zhang, Minghui, Haiwei Pan, Niu Zhang, Xiaoqin Xie, Zhiqiang Zhang, and Xiaoning Feng. "Cost-sensitive ensemble classification algorithm for medical image." International Journal of Computational Science and Engineering 16, no. 3 (2018): 282. http://dx.doi.org/10.1504/ijcse.2018.091763.

19. Zhang, Zhiqiang, Xiaoning Feng, Xiaoqin Xie, Minghui Zhang, Haiwei Pan, and Niu Zhang. "Cost-sensitive ensemble classification algorithm for medical image." International Journal of Computational Science and Engineering 16, no. 3 (2018): 282. http://dx.doi.org/10.1504/ijcse.2018.10012835.

20. Scott, Clayton, and Mark Davenport. "Regression Level Set Estimation Via Cost-Sensitive Classification." IEEE Transactions on Signal Processing 55, no. 6 (June 2007): 2752–57. http://dx.doi.org/10.1109/tsp.2007.893758.

21. Sun, Yanmin, Mohamed S. Kamel, Andrew K. C. Wong, and Yang Wang. "Cost-sensitive boosting for classification of imbalanced data." Pattern Recognition 40, no. 12 (December 2007): 3358–78. http://dx.doi.org/10.1016/j.patcog.2007.04.009.

22. Huang, Kuan-Hao, and Hsuan-Tien Lin. "Cost-sensitive label embedding for multi-label classification." Machine Learning 106, no. 9-10 (August 2, 2017): 1725–46. http://dx.doi.org/10.1007/s10994-017-5659-z.

23. Zidelmal, Z., A. Amirou, D. Ould-Abdeslam, and J. Merckle. "ECG beat classification using a cost sensitive classifier." Computer Methods and Programs in Biomedicine 111, no. 3 (September 2013): 570–77. http://dx.doi.org/10.1016/j.cmpb.2013.05.011.

24. Liu, Fen, and Quan Qian. "Cost-Sensitive Variational Autoencoding Classifier for Imbalanced Data Classification." Algorithms 15, no. 5 (April 21, 2022): 139. http://dx.doi.org/10.3390/a15050139.
Abstract:
Classification is among the core tasks in machine learning. Existing classification algorithms are typically based on the assumption of at least roughly balanced classes; when faced with imbalanced data, such classifiers neglect the minority class in favor of overall accuracy. The performance of traditional classification algorithms built on this balance assumption is therefore insufficient, because minority-class samples are often the more important ones, such as positive samples in disease diagnosis. In this study, we propose a cost-sensitive variational autoencoding classifier that combines data-level and algorithm-level methods to solve the problem of imbalanced data classification. Cost-sensitive factors are introduced to assign a high cost to the misclassification of minority data, which biases the classifier toward the minority class. We also design misclassification costs closely related to the task at hand by embedding domain knowledge. Experimental results show that the proposed method performs well on the classification of bulk amorphous materials.
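
The "cost-sensitive factors" described here amount to scaling each sample's loss by the misclassification cost of its true class. A minimal sketch of such a cost-weighted cross-entropy (a stand-in for the paper's VAE-based classifier; the cost values are illustrative):

```python
import numpy as np

def cost_weighted_cross_entropy(probs, labels, class_cost):
    """Mean cross-entropy where each sample is scaled by the cost of
    misclassifying its true class (high for the minority class)."""
    eps = 1e-12
    true_probs = probs[np.arange(len(labels)), labels]
    return float(np.mean(-class_cost[labels] * np.log(true_probs + eps)))

class_cost = np.array([1.0, 10.0])                    # minority class 1 costs 10x
probs = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
labels = np.array([0, 1, 1])
print(cost_weighted_cross_entropy(probs, labels, class_cost))
```

Because minority-class errors dominate the averaged loss, gradient-based training is pushed toward fitting the minority class, which is the biasing effect the abstract describes.
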
25. Teisseyre, Paweł, Damien Zufferey, and Marta Słomka. "Cost-sensitive classifier chains: Selecting low-cost features in multi-label classification." Pattern Recognition 86 (February 2019): 290–319. http://dx.doi.org/10.1016/j.patcog.2018.09.012.

26. Huang, Feng, Yun Liang, Li Huang, Ji Ming Yao, and Wen Feng Tian. "Image Classifying Based on Cost-Sensitive Layered Cascade Learning." Applied Mechanics and Materials 701-702 (December 2014): 453–58. http://dx.doi.org/10.4028/www.scientific.net/amm.701-702.453.
Abstract:
Image classification is an important means of image processing. Traditional research on image classification usually rests on the following assumptions: the goal is overall classification accuracy, samples of different categories are equally important in the data set, and all misclassifications incur the same cost. Unfortunately, class imbalance and cost sensitivity are ubiquitous in real-world classification: the sample size of one category may far exceed that of others, and misclassification costs differ sharply between categories. High-dimensional feature vectors caused by the diverse content of images, and the large gap in complexity between distinguishing different categories of images, are common problems in image classification, so a single machine learning algorithm is insufficient for complex image classification with these characteristics. To address these problems, a layered cascade image classification method based on cost sensitivity and class imbalance is proposed: a set of cascaded learners is built, the inner patterns of images of a specific category are learned at different stages, and a cost function is introduced, so the method can effectively handle the cost-sensitive and class-imbalance aspects of image classification. Moreover, the structure of the method is flexible, as the number of cascade layers and the algorithm at each stage can be readjusted according to the business requirements of the classification task. Results from an application to sensitive-image classification for a smart grid indicate that this cost-sensitive layered cascade learning method obtains better image classification performance than existing methods.

27. Yang, Qiang, C. Ling, X. Chai, and Rong Pan. "Test-cost sensitive classification on data with missing values." IEEE Transactions on Knowledge and Data Engineering 18, no. 5 (May 2006): 626–38. http://dx.doi.org/10.1109/tkde.2006.84.

28. Duan, Weiwei, and Cheng Ding. "Non-linear Cost-sensitive Decision Tree for Multi-classification." International Journal of Software Engineering and Its Applications 10, no. 2 (February 28, 2016): 217–28. http://dx.doi.org/10.14257/ijseia.2016.10.2.18.

29. Bernard, Simon, Clément Chatelain, Sébastien Adam, and Robert Sabourin. "The Multiclass ROC Front method for cost-sensitive classification." Pattern Recognition 52 (April 2016): 46–60. http://dx.doi.org/10.1016/j.patcog.2015.10.010.

30. Wang, Junhui. "Boosting the Generalized Margin in Cost-Sensitive Multiclass Classification." Journal of Computational and Graphical Statistics 22, no. 1 (December 27, 2011): 178–92. http://dx.doi.org/10.1080/10618600.2011.643151.

31. Krawczyk, Bartosz, Michał Woźniak, and Gerald Schaefer. "Cost-sensitive decision tree ensembles for effective imbalanced classification." Applied Soft Computing 14 (January 2014): 554–62. http://dx.doi.org/10.1016/j.asoc.2013.08.014.

32. Zhang, Chong, Kay Chen Tan, Haizhou Li, and Geok Soon Hong. "A Cost-Sensitive Deep Belief Network for Imbalanced Classification." IEEE Transactions on Neural Networks and Learning Systems 30, no. 1 (January 2019): 109–22. http://dx.doi.org/10.1109/tnnls.2018.2832648.

33. Zhou, Jingjing, Weifeng Sun, Xiaomin Han, Ruqiang Lu, Yuanqi Zhang, and Shenwei Zhang. "The Research of One Novel Cost-Sensitive Classification Algorithm." Chinese Journal of Electronics 27, no. 5 (September 1, 2018): 1015–24. http://dx.doi.org/10.1049/cje.2018.01.002.

34. Huang, Yuwen. "Dynamic Cost-sensitive Naive Bayes Classification for Uncertain Data." International Journal of Database Theory and Application 8, no. 1 (February 28, 2015): 271–80. http://dx.doi.org/10.14257/ijdta.2015.8.1.26.

35. Loyola-Gonzalez, Octavio, Jose F. C. O. Martinez-Trinidad, Jesus Ariel Carrasco-Ochoa, and Milton Garcia-Borroto. "Cost-Sensitive Pattern-Based Classification for Class Imbalance Problems." IEEE Access 7 (2019): 60411–27. http://dx.doi.org/10.1109/access.2019.2913982.

36. Zhao, Huimin. "Instance weighting versus threshold adjusting for cost-sensitive classification." Knowledge and Information Systems 15, no. 3 (April 3, 2007): 321–34. http://dx.doi.org/10.1007/s10115-007-0079-1.

37. Frumosu, Flavia Dalia, Abdul Rauf Khan, Henrik Schiøler, Murat Kulahci, Mohamed Zaki, and Peter Westermann-Rasmussen. "Cost-sensitive learning classification strategy for predicting product failures." Expert Systems with Applications 161 (December 2020): 113653. http://dx.doi.org/10.1016/j.eswa.2020.113653.

38. Levering, Ryan, and Michal Cutler. "Cost-Sensitive Feature Extraction and Selection in Genre Classification." Journal for Language Technology and Computational Linguistics 24, no. 1 (July 1, 2009): 57–72. http://dx.doi.org/10.21248/jlcl.24.2009.113.

39. Nakashima, Tomoharu, Yasuyuki Yokota, Hisao Ishibuchi, Gerald Schaefer, Aleš Drastich, and Michal Závišek. "Constructing Cost-Sensitive Fuzzy-Rule-Based Systems for Pattern Classification Problems." Journal of Advanced Computational Intelligence and Intelligent Informatics 11, no. 6 (July 20, 2007): 546–53. http://dx.doi.org/10.20965/jaciii.2007.p0546.
Abstract:
We evaluate the performance of cost-sensitive fuzzy-rule-based systems for pattern classification problems. We assume that a misclassification cost is given a priori for each training pattern, so the task of classification becomes minimizing both classification error and misclassification cost. We examine the performance of two types of fuzzy classifiers based on fuzzy if-then rules generated from training patterns; they differ in whether they consider misclassification costs during rule generation. In our computational experiments, we use several specifications of misclassification cost to evaluate the performance of the two classifiers. Experimental results show that both classification error and misclassification cost are reduced by considering the misclassification cost in fuzzy rule generation.

40

WANG, ZHE, MINGZHE LU, ZENGXIN NIU, XIANGYANG XUE, and DAQI GAO. "COST-SENSITIVE MULTI-VIEW LEARNING MACHINE." International Journal of Pattern Recognition and Artificial Intelligence 28, no. 03 (May 2014): 1451004. http://dx.doi.org/10.1142/s0218001414510045.

Full text
Abstract:
Multi-view learning aims to learn effectively from data represented by multiple independent sets of attributes, where each set is taken as one view of the original data. In real-world applications, different views are acquired at unequal cost: taking web-page classification as an example, it is cheaper to get the words on the page itself (view one) than the words contained in the anchor texts of inbound hyperlinks (view two). However, almost all existing multi-view learning methods consider neither the cost of acquiring the views nor the cost of evaluating them. In this paper, we hold that different views adopt different representations and incur different acquisition costs, and we develop a new view-dependent cost, distinct from both the existing class-dependent and example-dependent costs. To this end, we generalize the multi-view learning framework with cost-sensitive techniques and propose a Cost-sensitive Multi-View Learning Machine, CMVLM for short. In implementation, we measure both the acquisition cost and the discriminant scatter of each view; after eliminating useless views with a predefined threshold, the reserved views are used to train the final classifier. Experimental results on a broad range of data sets, including benchmark UCI, image, and bioinformatics data sets, validate that the proposed algorithm effectively reduces the total cost and achieves competitive, or even better, classification performance. The contributions of this paper are: (1) proposing a view-dependent cost for the first time; (2) establishing a cost-sensitive multi-view learning framework; (3) developing a wrapper technique applicable to most multiple-kernel-based classifiers.
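
As a rough illustration of the view screening described above (scoring each view's discriminant scatter against its acquisition cost and discarding views below a threshold), here is a hypothetical sketch; the Fisher-style scoring rule and threshold are my assumptions, not CMVLM's actual criterion.

```python
import numpy as np

def screen_views(views, y, acq_cost, threshold=1.0):
    """Keep indices of views whose class separation, discounted by
    acquisition cost, clears the threshold (illustrative proxy)."""
    kept = []
    for idx, (Xv, c) in enumerate(zip(views, acq_cost)):
        mu_diff = Xv[y == 1].mean(0) - Xv[y == 0].mean(0)
        within = Xv[y == 1].var(0).sum() + Xv[y == 0].var(0).sum()
        scatter = float(np.sum(mu_diff ** 2) / (within + 1e-12))
        if scatter / c >= threshold:
            kept.append(idx)
    return kept

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 300)
cheap = rng.normal(y[:, None] * 1.5, 1.0, (300, 4))   # informative, cheap view
costly = rng.normal(0.0, 1.0, (300, 6))               # uninformative, costly view
print(screen_views([cheap, costly], y, acq_cost=[1.0, 5.0]))  # likely [0]
```
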
41. Kim, Jungeun, Keunho Choi, Gunwoo Kim, and Yongmoo Suh. "Classification cost: An empirical comparison among traditional classifier, Cost-Sensitive Classifier, and MetaCost." Expert Systems with Applications 39, no. 4 (March 2012): 4013–19. http://dx.doi.org/10.1016/j.eswa.2011.09.071.

42. Sharifnia, Ensieh, and Reza Boostani. "Instance-Based Cost-Sensitive Boosting." International Journal of Pattern Recognition and Artificial Intelligence 34, no. 3 (July 22, 2019): 2050002. http://dx.doi.org/10.1142/s0218001420500020.
Abstract:
Many classification algorithms aim to minimize just their training error count; however, it is often desirable to minimize a more general cost metric in which distinct instances have different costs. In this paper, an instance-based cost-sensitive, Bayesian-consistent version of the exponential loss function is proposed. Using the modified loss function, instance-based cost-sensitive extensions of AdaBoost, RealBoost and GentleBoost are derived, termed ICSAdaBoost, ICSRealBoost and ICSGentleBoost, respectively. A new instance-based cost generation method is also proposed instead of leaving this expensive process to experts. Each sample thus takes two cost values: a class cost and a sample cost. The first is assigned equally to all samples of each class, while the second is generated according to the probability of each sample under its class probability density function. Experimental results of the proposed schemes imply a 12% enhancement in terms of F-measure and 13% on cost-per-sample over a variety of UCI datasets, compared to state-of-the-art methods. The significant superiority of the proposed method is supported by applying paired t-tests to the results.
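
To make the idea concrete, here is a generic sketch of an instance-cost-sensitive AdaBoost loop, in the spirit of (but not identical to) the ICSAdaBoost derivation above: per-instance costs enter both the initial distribution and the exponential re-weighting. The cost-scaling scheme is an assumption for illustration, not the paper's exact Bayesian-consistent update.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def icost_adaboost(X, y, cost, rounds=25):
    """Boost decision stumps with per-instance costs; y in {-1, +1}."""
    w = cost / cost.sum()                        # cost-proportional start
    stumps, alphas = [], []
    for _ in range(rounds):
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = h.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        a = 0.5 * np.log((1 - err) / err)
        # misclassified high-cost instances are up-weighted the hardest
        w = w * np.exp(-a * y * pred * cost / cost.mean())
        w = w / w.sum()
        stumps.append(h)
        alphas.append(a)
    return stumps, np.array(alphas)

def boosted_predict(stumps, alphas, X):
    votes = sum(a * h.predict(X) for h, a in zip(stumps, alphas))
    return np.sign(votes)

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
cost = np.where(y == 1, 5.0, 1.0)                # missing a positive costs 5x
stumps, alphas = icost_adaboost(X, y, cost)
print((boosted_predict(stumps, alphas, X) == y).mean())
```
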
43. Xue, Aijun, and Xiaodan Wang. "Cost-sensitive design of error correcting output codes." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 232, no. 10 (May 12, 2017): 1871–81. http://dx.doi.org/10.1177/0954406217709303.
Abstract:
Many real-world applications involve multiclass cost-sensitive learning problems, yet some well-studied binary cost-sensitive learning algorithms cannot be extended to the multiclass case directly. It is therefore meaningful to decompose the complex multiclass cost-sensitive classification problem into a series of binary cost-sensitive classification problems. In this paper we propose an alternative and efficient decomposition framework that uses the original error correcting output codes. The main problem in our framework is how to evaluate the binary costs for each binary cost-sensitive base classifier. To solve it, we propose computing the expected misclassification costs starting from the given multiclass cost matrix, and we give general formulations for computing the binary costs. Experimental results on several synthetic and UCI datasets show that our method obtains performance comparable to state-of-the-art methods.
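
The key step the abstract describes, deriving binary costs for each ECOC column from the multiclass cost matrix, can be illustrated as follows. This sketch simply averages the relevant entries of the cost matrix; the paper's exact expected-cost formulation may differ, and the coding matrix here is a toy example.

```python
import numpy as np

def ecoc_binary_costs(C, coding):
    """For each ECOC column, derive (cost of predicting -1 when the truth
    is on the +1 side, and vice versa) by averaging the multiclass cost
    matrix C over the classes coded on each side."""
    costs = []
    for col in coding.T:
        pos, neg = np.where(col == 1)[0], np.where(col == -1)[0]
        c_fn = C[np.ix_(pos, neg)].mean()   # +1-side class called -1
        c_fp = C[np.ix_(neg, pos)].mean()   # -1-side class called +1
        costs.append((c_fn, c_fp))
    return costs

# rows: true class, columns: predicted class
C = np.array([[0.0, 1.0, 5.0],
              [2.0, 0.0, 1.0],
              [9.0, 3.0, 0.0]])
coding = np.array([[ 1,  1],     # class 0
                   [-1,  1],     # class 1
                   [-1, -1]])    # class 2
print(ecoc_binary_costs(C, coding))
```

Each (c_fn, c_fp) pair can then parameterize an off-the-shelf binary cost-sensitive learner for that column.
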
44. Bei, Honghan, Yajie Wang, Zhaonuo Ren, Shuo Jiang, Keran Li, and Wenyang Wang. "A Statistical Approach to Cost-Sensitive AdaBoost for Imbalanced Data Classification." Mathematical Problems in Engineering 2021 (October 23, 2021): 1–20. http://dx.doi.org/10.1155/2021/3165589.
Abstract:
To address the imbalanced data problem in classification, studies combining AdaBoost (short for "Adaptive Boosting") with cost-sensitive learning have shown convincing results in the literature. Cost-sensitive AdaBoost algorithms are practical because the "boosting" property of AdaBoost can iteratively enhance the small class in cost-sensitive learning to solve the imbalanced data issue. However, most available cost-sensitive AdaBoost algorithms are heuristic approaches, improved from the standard AdaBoost algorithm by cost-sensitively adjusting the voting-weight parameters of weak classifiers or the sample-updating weight parameters without strict theoretical proof. These algorithms append cost-sensitive factors to focus on high-cost, small-class samples, but they give no procedure for choosing where to add the cost factors or how to set their values. To complete the framework of cost-sensitive AdaBoost algorithms, the present article makes two main contributions. First, we summarize the popular cost-sensitive boosting algorithms in the literature and propose a generally comprehensive form; we name our specific variant the "AdaImC algorithm", which is typically applicable to imbalanced data classification problems, with theoretical proof. Second, a statistical approach to proving the AdaImC algorithm is proposed to verify the inner relationship between the cost parameters; we show that our proposed algorithm in the machine learning field is identical to the Product of Experts (PoE) model in statistics. In addition, a way to determine the cost parameter values by statistical analysis is introduced. Several numerical studies are presented to support the proposed algorithm.

45. Xiong, Yueling, Mingquan Ye, and Changrong Wu. "Cancer Classification with a Cost-Sensitive Naive Bayes Stacking Ensemble." Computational and Mathematical Methods in Medicine 2021 (April 26, 2021): 1–12. http://dx.doi.org/10.1155/2021/5556992.
Abstract:
Ensemble learning combines multiple learners to perform combinatorial learning, which offers good flexibility and higher generalization performance. To achieve higher-quality cancer classification, in this study the fast correlation-based feature selection (FCBF) method was used to preprocess the data and eliminate irrelevant and redundant features, and classification was then carried out with a stacking ensemble learner. A library for support vector machines (LIBSVM), K-nearest neighbor (KNN), the C4.5 decision tree (C4.5), and random forest (RF) were used as the primary learners of the stacking ensemble. Given the imbalanced character of cancer gene expression data, an embedded cost-sensitive naive Bayes was used as the metalearner, denoted CSNB stacking. The proposed CSNB stacking method was applied to nine cancer datasets to further verify the classification performance of the model. Compared with other classification methods, such as single-classifier algorithms and ensemble algorithms, the experimental results showed the effectiveness and robustness of the proposed method in processing different types of cancer data. This method may therefore help guide cancer diagnosis and research.
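
A rough scikit-learn sketch of the stacking architecture described above. The base learners match the abstract (SVM, KNN, a C4.5-style entropy tree, RF), but the cost-sensitive naive Bayes metalearner is approximated here by skewing GaussianNB class priors toward the minority class; that prior trick is my stand-in, not the paper's embedding, and the FCBF preprocessing step is omitted.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# imbalanced toy data standing in for gene-expression features
X, y = make_classification(n_samples=600, n_features=30,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

base = [("svm", SVC(probability=True)),
        ("knn", KNeighborsClassifier()),
        ("c45", DecisionTreeClassifier(criterion="entropy")),  # C4.5-like tree
        ("rf", RandomForestClassifier(random_state=0))]
# skewed priors act as a crude misclassification-cost ratio for the metalearner
meta = GaussianNB(priors=[0.3, 0.7])
clf = StackingClassifier(estimators=base, final_estimator=meta, cv=5)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```
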
46. Lin, Hsuan-Tien, and Ling Li. "Reduction from Cost-Sensitive Ordinal Ranking to Weighted Binary Classification." Neural Computation 24, no. 5 (May 2012): 1329–67. http://dx.doi.org/10.1162/neco_a_00265.
Abstract:
We present a reduction framework from ordinal ranking to binary classification. The framework consists of three steps: extracting extended examples from the original examples, learning a binary classifier on the extended examples with any binary classification algorithm, and constructing a ranker from the binary classifier. Based on the framework, we show that a weighted 0/1 loss of the binary classifier upper-bounds the mislabeling cost of the ranker, both error-wise and regret-wise. Our framework allows not only the design of good ordinal ranking algorithms based on well-tuned binary classification approaches, but also the derivation of new generalization bounds for ordinal ranking from known bounds for binary classification. In addition, our framework unifies many existing ordinal ranking algorithms, such as perceptron ranking and support vector ordinal regression. When compared empirically on benchmark data sets, some of our newly designed algorithms enjoy advantages in terms of both training speed and generalization performance over existing algorithms. In addition, the newly designed algorithms lead to better cost-sensitive ordinal ranking performance, as well as improved listwise ranking performance.
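
The extraction step of this reduction can be sketched directly: each ordinal example with K ranks becomes K-1 weighted binary examples asking "is the rank greater than k?". The indexing and the cost-gap weight rule below (suitable for V-shaped cost vectors) follow my reading of the framework and are illustrative rather than the paper's exact construction.

```python
import numpy as np

def extend(X, y, cost):
    """Turn ordinal examples (ranks 1..K, per-example cost vectors of
    shape (n, K)) into weighted binary examples, one per threshold k."""
    n, K = cost.shape
    Xe, ye, we = [], [], []
    for i in range(n):
        for k in range(1, K):                    # thresholds 1..K-1
            # append a one-hot threshold indicator to the features
            Xe.append(np.concatenate([X[i], np.eye(K - 1)[k - 1]]))
            ye.append(+1 if y[i] > k else -1)    # "does the rank exceed k?"
            we.append(abs(cost[i, k] - cost[i, k - 1]))
    return np.array(Xe), np.array(ye), np.array(we)

# toy data: 4 ranks, absolute-difference costs c[i, j] = |rank_j - y_i|
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = rng.integers(1, 5, size=5)
cost = np.abs(np.arange(1, 5)[None, :] - y[:, None]).astype(float)
Xe, ye, we = extend(X, y, cost)
print(Xe.shape, ye[:3], we[:3])                  # (15, 6) ...
```

Any binary classifier that accepts sample weights can then be trained on (Xe, ye, we), and a ranker is recovered by counting how many thresholds each test point exceeds.
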
47. Thakkar, Hiren Kumar, Ankit Desai, Subrata Ghosh, Priyanka Singh, and Gajendra Sharma. "Clairvoyant: AdaBoost with Cost-Enabled Cost-Sensitive Classifier for Customer Churn Prediction." Computational Intelligence and Neuroscience 2022 (January 22, 2022): 1–11. http://dx.doi.org/10.1155/2022/9028580.
Abstract:
Customer churn prediction is one of the challenging problems and a paramount concern for telecommunication industries. With the increasing number of mobile operators, users can switch from one operator to another if they are unsatisfied with the service. Marketing literature states that it costs 5–10 times more to acquire a new customer than to retain an existing one, so effective customer churn management has become a crucial demand for mobile communication operators. Researchers have proposed several classifiers and boosting methods to control the customer churn rate, including deep learning (DL) algorithms. However, conventional classification algorithms follow an error-based framework that focuses on improving the classifier's accuracy rather than cost sensitization, treating all misclassification errors equally, which is not applicable in practice; DL algorithms, for their part, are computationally expensive and time-consuming. In this paper, a novel class-dependent cost-sensitive boosting algorithm called AdaBoostWithCost is proposed to reduce the churn cost. This study demonstrates an empirical evaluation of the proposed AdaBoostWithCost algorithm, which consistently outperforms discrete AdaBoost on telecom churn prediction. The key focus of the AdaBoostWithCost classifier is to reduce false-negative errors and the misclassification cost more significantly than AdaBoost.

48. Malhotra, R., and J. Jain. "Predicting defects in object-oriented software using cost-sensitive classification." IOP Conference Series: Materials Science and Engineering 1022 (January 19, 2021): 012112. http://dx.doi.org/10.1088/1757-899x/1022/1/012112.

49. Geng, Yue, and Xinyu Luo. "Cost-sensitive convolutional neural networks for imbalanced time series classification." Intelligent Data Analysis 23, no. 2 (April 4, 2019): 357–70. http://dx.doi.org/10.3233/ida-183831.

50. Ruan, Yu-Xun, Hsuan-Tien Lin, and Ming-Feng Tsai. "Improving ranking performance with cost-sensitive ordinal classification via regression." Information Retrieval 17, no. 1 (February 8, 2013): 1–20. http://dx.doi.org/10.1007/s10791-013-9219-2.