
Journal articles on the topic 'Bayesian network classifier'


Consult the top 50 journal articles for your research on the topic 'Bayesian network classifier.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Pernkopf, Franz. "Bayesian network classifiers versus selective k-NN classifier." Pattern Recognition 38, no. 1 (2005): 1–10. http://dx.doi.org/10.1016/j.patcog.2004.05.012.

2

De Stefano, Claudio, Ciro D'Elia, Alessandra Scotto di Freca, and Angelo Marcelli. "Classifier Combination by Bayesian Networks for Handwriting Recognition." International Journal of Pattern Recognition and Artificial Intelligence 23, no. 05 (2009): 887–905. http://dx.doi.org/10.1142/s0218001409007387.

Abstract:
In the field of handwriting recognition, classifier combination received much more interest than the study of powerful individual classifiers. This is mainly due to the enormous variability among the patterns to be classified, that typically requires the definition of complex high dimensional feature spaces: as the overall complexity increases, the risk of inconsistency in the decision of the classifier increases as well. In this framework, we propose a new combining method based on the use of a Bayesian Network. In particular, we suggest to reformulate the classifier combination problem as a
3

Abhinaya, P. M., and V. Nivethitha. "Detection of novel attacks by anomaly intrusion detection system using classifiers." International Journal of Engineering & Technology 7, no. 1.7 (2018): 54. http://dx.doi.org/10.14419/ijet.v7i1.7.9571.

Abstract:
Nowadays, analyzing unsuspicious network traffic has become a necessity to protect organizations from intruders. It is a big challenge to accurately identify threats due to the high volume of network traffic. In the existing system, to detect whether network traffic is normal or abnormal, we need a lot of information about the network. When a lot of information is involved in the identification process, the relationship between different attributes and the important attributes considered for classification plays an important role in the accuracy. Information gain selection process is used to pr
4

Narmatha, V., and S. Ramesh. "Residual Neural Network for the Accurate Recognition of Human Action and Compared with Bayesian Regression." E3S Web of Conferences 399 (2023): 04024. http://dx.doi.org/10.1051/e3sconf/202339904024.

Abstract:
Aim: In this research article, the aim is to analyze and compare the performance of Residual Neural Network and Bayesian Regression for accurate recognition of human actions. Materials and Methods: The proposed machine learning classifier model uses 80% of the UCF101 dataset for training and the remaining 20% for testing. For the SPSS analysis, the results of two classifiers are grouped with 20 samples in each group. The sample size is determined using a pretest with G-power, with a sample size of 80%, a confidence interval of 95%, and a significance level of 0.014 (p<0.05). Result: The fin
5

Wang, Limin, Sikai Qi, Yang Liu, Hua Lou, and Xin Zuo. "Bagging k-dependence Bayesian network classifiers." Intelligent Data Analysis 25, no. 3 (2021): 641–67. http://dx.doi.org/10.3233/ida-205125.

Abstract:
Bagging has attracted much attention due to its simple implementation and the popularity of bootstrapping. By learning diverse classifiers from resampled datasets and averaging the outcomes, bagging investigates the possibility of achieving substantial classification performance of the base classifier. Diversity has been recognized as a very important characteristic in bagging. This paper presents an efficient and effective bagging approach, that learns a set of independent Bayesian network classifiers (BNCs) from disjoint data subspaces. The number of bits needed to describe the data is measu
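The abstract above concerns bagging applied to Bayesian network classifiers. As a generic illustration only (not the disjoint-subspace, description-length-based method of the cited paper), the sketch below bags a naive Bayes base learner with scikit-learn; the dataset and parameter values are placeholders.

```python
# Generic bagging sketch with a Bayesian (naive Bayes) base classifier.
# This shows ordinary bootstrap bagging only, not the paper's method of
# learning BNCs on disjoint data subspaces.
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Each base learner is fit on a bootstrap resample; predictions are combined by voting.
bag = BaggingClassifier(estimator=GaussianNB(),  # 'base_estimator=' in scikit-learn < 1.2
                        n_estimators=20, random_state=0)
bag.fit(X_tr, y_tr)
print("bagged naive Bayes accuracy:", bag.score(X_te, y_te))
```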
6

Yan, Ke-Sheng, Li-Li Rong, and Kai Yu. "Discriminating complex networks through supervised NDR and Bayesian classifier." International Journal of Modern Physics C 27, no. 05 (2016): 1650051. http://dx.doi.org/10.1142/s0129183116500510.

Abstract:
Discriminating complex networks is a particularly important task for the purpose of the systematic study of networks. In order to discriminate unknown networks exactly, a large set of network measurements are needed to be taken into account for comprehensively considering network properties. However, as we demonstrate in this paper, these measurements are nonlinear correlated with each other in general, resulting in a wide variety of redundant measurements which unintentionally explain the same aspects of network properties. To solve this problem, we adopt supervised nonlinear dimensionality r
7

Ruz, G. A., and D. T. Pham. "Building Bayesian network classifiers through a Bayesian complexity monitoring system." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 223, no. 3 (2008): 743–55. http://dx.doi.org/10.1243/09544062jmes1243.

Abstract:
Nowadays, the need for practical yet efficient machine learning techniques for engineering applications are highly in demand. A new learning method for building Bayesian network classifiers is presented in this article. The proposed method augments the naive Bayesian (NB) classifier by using the Chow and Liu tree construction method, but introducing a Bayesian approach to control the accuracy and complexity of the resulting network, which yields simple structures that are not necessarily a spanning tree. Experiments by using benchmark data sets show that the number of augmenting edges by using
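The abstract above refers to the Chow and Liu tree construction used to augment the naive Bayes structure. The sketch below shows only that standard construction (a maximum-weight spanning tree over pairwise mutual information between discrete attributes), assuming numpy, scipy, and scikit-learn; the paper's Bayesian complexity-monitoring step is not reproduced, and the random data is a placeholder.

```python
# Generic Chow-Liu sketch: maximum-weight spanning tree over pairwise mutual
# information between discrete attributes. Edge weights are negated so that
# scipy's minimum spanning tree maximises total mutual information.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from sklearn.metrics import mutual_info_score

def chow_liu_edges(X):
    """Return the undirected tree edges for discrete data X of shape (n_samples, n_features)."""
    n = X.shape[1]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi[i, j] = mutual_info_score(X[:, i], X[:, j])
    mst = minimum_spanning_tree(-mi).toarray()
    return [(i, j) for i in range(n) for j in range(n) if mst[i, j] != 0]

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(500, 5))   # placeholder categorical data
print(chow_liu_edges(X))
```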
8

Reiz, Beáta, and Lehel Csató. "Bayesian Network Classifier for Medical Data Analysis." International Journal of Computers Communications & Control 4, no. 1 (2009): 65. http://dx.doi.org/10.15837/ijccc.2009.1.2414.

Abstract:
Bayesian networks encode causal relations between variables using probability and graph theory. They can be used both for prediction of an outcome and interpretation of predictions based on the encoded causal relations. In this paper we analyse a tree-like Bayesian network learning algorithm optimised for classification of data and we give solutions to the interpretation and analysis of predictions. The classification of logical – i.e. binary – data arises specifically in the field of medical diagnosis, where we have to predict the survival chance based on different types of medical o
9

Erb, Randall J. "The Backpropagation Neural Network - A Bayesian Classifier." Clinical Pharmacokinetics 29, no. 2 (1995): 69–79. http://dx.doi.org/10.2165/00003088-199529020-00002.

10

Shi, Hong-bo, Ya-qin Liu, and Ai-jun Li. "Discriminative parameter learning of Bayesian network classifier." Journal of Computer Applications 31, no. 4 (2011): 1074–78. http://dx.doi.org/10.3724/sp.j.1087.2011.01074.

11

Wang, Limin, Yang Liu, Musa Mammadov, Minghui Sun, and Sikai Qi. "Discriminative Structure Learning of Bayesian Network Classifiers from Training Dataset and Testing Instance." Entropy 21, no. 5 (2019): 489. http://dx.doi.org/10.3390/e21050489.

Abstract:
Over recent decades, the rapid growth in data makes ever more urgent the quest for highly scalable Bayesian networks that have better classification performance and expressivity (that is, capacity to respectively describe dependence relationships between attributes in different situations). To reduce the search space of possible attribute orders, k-dependence Bayesian classifier (KDB) simply applies mutual information to sort attributes. This sorting strategy is very efficient but it neglects the conditional dependencies between attributes and is sub-optimal. In this paper, we propose a novel
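The abstract above notes that the k-dependence Bayesian classifier (KDB) sorts attributes by mutual information with the class. The snippet below illustrates only that ranking step, using scikit-learn's mutual_info_classif on a stock dataset; the paper's discriminative structure-learning procedure is not reproduced.

```python
# Sketch of the attribute-ordering step mentioned in the abstract: rank
# attributes by estimated mutual information with the class variable.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.feature_selection import mutual_info_classif

X, y = load_wine(return_X_y=True)
mi = mutual_info_classif(X, y, random_state=0)
order = np.argsort(mi)[::-1]   # most informative attribute first
print("attribute order by I(X_i; C):", order.tolist())
```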
12

Mizuta, H., K. Kawachi, H. Yoshida, et al. "Decision Support for Psychiatric Diagnosis Based on a Simple Questionnaire." Methods of Information in Medicine 36, no. 04/05 (1997): 349–51. http://dx.doi.org/10.1055/s-0038-1636858.

Abstract:
This paper compares two classifiers: Pseudo Bayesian and Neural Network for assisting in making diagnoses of psychiatric patients based on a simple yes/no questionnaire which is provided at the outpatient’s first visit to the hospital. The classifiers categorize patients into three most commonly seen ICD classes, i.e. schizophrenic, emotional and neurotic disorders. One hundred completed questionnaires were utilized for constructing and evaluating the classifiers. Average correct decision rates were 73.3% for the Pseudo Bayesian Classifier and 77.3% for the Neural Network classifier.
13

Liang, Faming. "An Effective Bayesian Neural Network Classifier with a Comparison Study to Support Vector Machine." Neural Computation 15, no. 8 (2003): 1959–89. http://dx.doi.org/10.1162/08997660360675107.

Abstract:
We propose a new Bayesian neural network classifier, different from that commonly used in several respects, including the likelihood function, prior specification, and network structure. Under regularity conditions, we show that the decision boundary determined by the new classifier will converge to the true one. We also propose a systematic implementation for the new classifier. In our implementation, the tune of connection weights, the selection of hidden units, and the selection of input variables are unified by sampling from the joint posterior distribution of the network structure and con
14

Guo, Dai Fei, Jian Jun Hu, Ai Fen Sui, Guan Zhou Lin, and Tao Guo. "The Abnormal Mobile Malware Analysis Based on Behavior Categorization." Advanced Materials Research 765-767 (September 2013): 994–97. http://dx.doi.org/10.4028/www.scientific.net/amr.765-767.994.

Abstract:
With the explosive growth of mobile malware in mobile internet, many polymorphic and metamorphic mobile malware appears and causes difficulty of detection. A mobile malware network behavior data mining method based on behavior categorization is proposed to detect the behavior of new or metamorphic mobile malware. The network behavior is divided into different categories after analyzing the behavior character of mobile malware and those different behavior data of known malware and normal action are used to train the Naïve Bayesian classifier respectively. Those Naïve Bayesian classifiers are
15

Wu, Hongjiao. "Feature-Weighted Naive Bayesian Classifier for Wireless Network Intrusion Detection." Security and Communication Networks 2024 (January 3, 2024): 1–13. http://dx.doi.org/10.1155/2024/7065482.

Abstract:
Objective. Wireless sensor networks, crucial for various applications, face growing security challenges due to the escalating complexity and diversity of attack behaviours. This paper presents an advanced intrusion detection algorithm, leveraging feature-weighted Naive Bayes (NB), to enhance network attack detection accuracy. Methodology. Initially, a feature weighting algorithm is introduced to assign context-based weights to different feature terms. Subsequently, the NB algorithm is enhanced by incorporating Jensen–Shannon (JS) divergence, feature weighting, and inverse category frequency (I
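The abstract above describes a feature-weighted naive Bayes decision rule. The sketch below shows the general idea (scaling each attribute's log-likelihood contribution by a weight) with placeholder weights and toy data; the JS-divergence and inverse-category-frequency weighting of the cited paper is not reproduced.

```python
# Minimal feature-weighted naive Bayes sketch for discrete attributes.
# The weights are placeholders; the cited paper derives them from
# Jensen-Shannon divergence and inverse category frequency.
import numpy as np

def fit_counts(X, y, n_classes, n_values):
    """Laplace-smoothed class prior and per-attribute conditional tables."""
    n_feat = X.shape[1]
    prior = np.bincount(y, minlength=n_classes) + 1.0
    prior /= prior.sum()
    # cond[c, i, v] = P(X_i = v | class c), with add-one smoothing
    cond = np.ones((n_classes, n_feat, n_values))
    for xi, yi in zip(X, y):
        for i, v in enumerate(xi):
            cond[yi, i, v] += 1
    cond /= cond.sum(axis=2, keepdims=True)
    return prior, cond

def predict(x, prior, cond, w):
    """Weighted decision rule: each attribute's log-likelihood is scaled by w[i]."""
    log_post = np.log(prior)
    for i, v in enumerate(x):
        log_post = log_post + w[i] * np.log(cond[:, i, v])
    return int(np.argmax(log_post))

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 4))          # toy categorical data
y = (X[:, 0] + X[:, 1] > 2).astype(int)        # toy labels
prior, cond = fit_counts(X, y, n_classes=2, n_values=3)
w = np.array([1.5, 1.5, 0.5, 0.5])             # placeholder attribute weights
print(predict(X[0], prior, cond, w), y[0])
```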
16

Wang, Li Min, Xiong Fei Li, and Xue Cheng Wang. "Towards Efficient Dimensionality Reduction for Evolving Bayesian Network Classifier." Advanced Materials Research 108-111 (May 2010): 240–43. http://dx.doi.org/10.4028/www.scientific.net/amr.108-111.240.

Abstract:
Dimensionality reduction is useful for improving the performance of Bayesian networks. In this paper we suggest an effective method of modeling categorical and numerical variables of the mixed data with different Bayesian classifiers. Such an approach reduces output sensitivity to input changes by applying feature extraction and selection, and empirical studies on UCI benchmarking data show that our approach has clear advantages with respect to the classification accuracy.
17

Li, Dawei, Xiaojian Hu, Cheng-jie Jin, and Jun Zhou. "Learning to Detect Traffic Incidents from Data Based on Tree Augmented Naive Bayesian Classifiers." Discrete Dynamics in Nature and Society 2017 (2017): 1–9. http://dx.doi.org/10.1155/2017/8523495.

Abstract:
This study develops a tree augmented naive Bayesian (TAN) classifier based incident detection algorithm. Compared with the Bayesian networks based detection algorithms developed in the previous studies, this algorithm has less dependency on experts’ knowledge. The structure of TAN classifier for incident detection is learned from data. The discretization of continuous attributes is processed using an entropy-based method automatically. A simulation dataset on the section of the Ayer Rajah Expressway (AYE) in Singapore is used to demonstrate the development of proposed algorithm, including wave
18

Fong, Li Wei, Pi Ching Lou, and Kung Ting Lin. "On-Line Bayesian Classifier Design for Measurement Fusion." Advanced Materials Research 461 (February 2012): 826–29. http://dx.doi.org/10.4028/www.scientific.net/amr.461.826.

Abstract:
A neural-network-based classifier design for adaptive Kalman filtering is introduced to fuse the measurements extracted from multiple sensors to improve tracking accuracy. The proposed method consists of a group of parallel Kalman filters and a classifier based on Radial Basis Function Network (RBFN). By incorporating Markov chain into Bayesian estimation scheme, a RBFN is used as a probabilistic neural network for classification. Based upon data compression technique and on-line classification algorithm, an adaptive estimator to measurement fusion is developed that can handle the switching pl
19

Lansner, Anders, and Anders Holst. "A Higher Order Bayesian Neural Network with Spiking Units." International Journal of Neural Systems 07, no. 02 (1996): 115–28. http://dx.doi.org/10.1142/s0129065796000816.

Abstract:
We treat a Bayesian confidence propagation neural network, primarily in a classifier context. The one-layer version of the network implements a naive Bayesian classifier, which requires the input attributes to be independent. This limitation is overcome by a higher order network. The higher order Bayesian neural network is evaluated on a real world task of diagnosing a telephone exchange computer. By introducing stochastic spiking units, and soft interval coding, it is also possible to handle uncertain as well as continuous valued inputs.
20

Junior, Geraldo Braz, Leonardo de Oliveira Martins, Aristófanes Correa Silva, and Anselmo Cardoso Paiva. "Comparison of Support Vector Machines and Bayesian Neural Networks Performance for Breast Tissues Using Geostatistical Functions in Mammographic Images." International Journal of Computational Intelligence and Applications 09, no. 04 (2010): 271–88. http://dx.doi.org/10.1142/s1469026810002914.

Abstract:
Female breast cancer is a major cause of deaths in occidental countries. Computer-aided Detection (CAD) systems can aid radiologists to increase diagnostic accuracy. In this work, we present a comparison between two classifiers applied to the separation of normal and abnormal breast tissues from mammograms. The purpose of the comparison is to select the best prediction technique to be part of a CAD system. Each region of interest is classified through a Support Vector Machine (SVM) and a Bayesian Neural Network (BNN) as normal or abnormal region. SVM is a machine-learning method, based on the
21

Chen, Xu. "Research of Coin Recognition Based on Bayesian Network Classifier." International Journal on Advances in Information Sciences and Service Sciences 4, no. 18 (2012): 395–402. http://dx.doi.org/10.4156/aiss.vol4.issue18.48.

22

Philip, Ninan Sajeeth, and K. Babu Joseph. "Boosting the differences: A fast Bayesian classifier neural network." Intelligent Data Analysis 4, no. 6 (2000): 463–73. http://dx.doi.org/10.3233/ida-2000-4602.

23

Ren, Hongjia, and Xianchang Wang. "Scalable Structure Learning of K-Dependence Bayesian Network Classifier." IEEE Access 8 (2020): 200005–20. http://dx.doi.org/10.1109/access.2020.3035175.

24

Serra-Ricart, M. "Faint Object Classification Using Artificial Neural Networks." Symposium - International Astronomical Union 161 (1994): 249–52. http://dx.doi.org/10.1017/s0074180900047409.

Abstract:
Artificial Neural Network techniques are applied to the classification of faint objects, detected in digital astronomical images, and a Bayesian classifier (the neural network classifier, NNC hereafter) is proposed. This classifier can be implemented using a feedforward multilayered neural network trained by the back-propagation procedure (Werbos 1974).
25

Luthfi, Emir, and Arie Wahyu Wijayanto. "Bayesian Network Model to Distinguish COVID-19 for Illness with Similar Symptoms." Proceedings of The International Conference on Data Science and Official Statistics 2021, no. 1 (2022): 66–76. http://dx.doi.org/10.34123/icdsos.v2021i1.36.

Abstract:
Numerous diseases and illnesses exhibit similar physical and medical symptoms, such as COVID-19 and its similar disguised illness (common cold, flu, and seasonal allergies). In this study, we construct a Bayesian Network model to distinguish such symptom variables in a classification task. The Bayesian Network model has been widely used as a classifier comparable to machine learning models. We develop the model with a scoring-based method and implement it using a hill-climbing algorithm with the Bayesian information criterion (BIC) score approach. Experimental evaluations using publicly availa
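The abstract above describes score-based structure learning with a hill-climbing search and the Bayesian information criterion (BIC). The sketch below assumes the pgmpy library and its HillClimbSearch and BicScore estimators; the binary symptom variables are hypothetical placeholders, not the study's data.

```python
# Score-based structure learning sketch: hill climbing guided by the BIC score,
# assuming the pgmpy library. Column names are hypothetical placeholders.
import numpy as np
import pandas as pd
from pgmpy.estimators import BicScore, HillClimbSearch

rng = np.random.default_rng(0)
n = 1000
data = pd.DataFrame({
    "fever":   rng.integers(0, 2, n),
    "cough":   rng.integers(0, 2, n),
    "anosmia": rng.integers(0, 2, n),
    "covid":   rng.integers(0, 2, n),
})

search = HillClimbSearch(data)
best_model = search.estimate(scoring_method=BicScore(data))
print(sorted(best_model.edges()))
```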
26

Liu, Yang, Limin Wang, and Minghui Sun. "Efficient Heuristics for Structure Learning of k-Dependence Bayesian Classifier." Entropy 20, no. 12 (2018): 897. http://dx.doi.org/10.3390/e20120897.

Abstract:
The rapid growth in data makes the quest for highly scalable learners a popular one. To achieve the trade-off between structure complexity and classification accuracy, the k-dependence Bayesian classifier (KDB) allows to represent different number of interdependencies for different data sizes. In this paper, we proposed two methods to improve the classification performance of KDB. Firstly, we use the minimal-redundancy-maximal-relevance analysis, which sorts the predictive features to identify redundant ones. Then, we propose an improved discriminative model selection to select an optimal sub-
27

Huang, Wen Bo, and Yun Ji Wang. "A New Method for Osteosarcoma Recognition Based on Bayesian Classifier." Applied Mechanics and Materials 543-547 (March 2014): 2901–4. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.2901.

Abstract:
In order to deal with the complexity and uncertainty in medical image diagnosis of osteosarcoma, we proposed a new method based on Bayesian network, and first applied it to recognize osteosarcoma. A new multidimensional feature vector composed of both biochemical indicators and the quantized image features is defined and used as input to the Bayesian network, so as to establish a more accurate and reliable osteosarcoma recognition probability model. Experimental results demonstrate the effectiveness of our method: there are 50 training samples and 30 testing samples, and the accuracy is up to 86.67
28

Kong, Yu-yan, Jin-tao Yao, Qiang Li, Sheng-lin Zhu, and Ming-wu Zhang. "Bayesian network classifier based on PSO with predatory escape behavior." Journal of Computer Applications 31, no. 2 (2011): 454–57. http://dx.doi.org/10.3724/sp.j.1087.2011.00454.

29

Du, Rei-Jie, Shuang-Cheng Wang, Han-Xing Wang, and Cui-Ping Leng. "Optimization of Dynamic Naive Bayesian Network Classifier with Continuous Attributes." Advanced Science Letters 11, no. 1 (2012): 676–79. http://dx.doi.org/10.1166/asl.2012.2965.

30

Villa, S., and F. Stella. "A continuous time Bayesian network classifier for intraday FX prediction." Quantitative Finance 14, no. 12 (2014): 2079–92. http://dx.doi.org/10.1080/14697688.2014.906811.

31

Liu, Yang, Limin Wang, and Musa Mammadov. "Learning semi-lazy Bayesian network classifier under the c.i.i.d assumption." Knowledge-Based Systems 208 (November 2020): 106422. http://dx.doi.org/10.1016/j.knosys.2020.106422.

32

Atoui, M. Amine, Achraf Cohen, Sylvain Verron, and Abdessamad Kobi. "A single Bayesian network classifier for monitoring with unknown classes." Engineering Applications of Artificial Intelligence 85 (October 2019): 681–90. http://dx.doi.org/10.1016/j.engappai.2019.07.016.

33

Alenazi, Fahad S. "Dynamic Detection and Debias of Bayesian Network Classifier (3D-BN)." Journal of Information Systems Engineering and Management 10, no. 35s (2025): 419–31. https://doi.org/10.52783/jisem.v10i35s.6020.

Abstract:
Fairness in machine learning is a complex and multifaceted concept, increasingly critical in automated decision-making systems. Numerous metrics and techniques have been developed to measure and mitigate bias effectively. However, tensions often arise between different fairness notions, such as individual versus group fairness, and even among various group fairness approaches. These conflicts are typically rooted in inadequate implementation of fairness measures rather than fundamental contradictions. Additionally, failing to account for interdependencies among attributes can lead to unintende
34

Aminu, Tukur Muhammad Abubakar. "A Comparison Between Twitter Based Naïve Bayes and Artificial Neural Network Comment Classification Algorithms." Journal of Science and Technology Research 5, no. 3 (2023): 61–70. https://doi.org/10.5281/zenodo.8283032.

Abstract:
Machine learning has a wide range of uses, and one of its key uses is classification. A new observation is classified to determine which category it belongs to. Classifiers are the common name for machine learning classifiers. A classifier's task is to use training data provided to it to determine the relationship between a given input variable and a specific group that has already been identified by the system. Perceptron, Naive Bayes, Decision Tree, Logistic Regression, K-Nearest Neighbor, Artificial Neural Networks/Deep Learning, and Support Vector Machine are a few of the technique
35

Lyu, Na, Jiaxin Zhou, Xuan Feng, Kefan Chen, and Wu Chen. "A Timeliness-Enhanced Traffic Identification Method in Airborne Network." Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University 38, no. 2 (2020): 341–50. http://dx.doi.org/10.1051/jnwpu/20203820341.

Abstract:
High dynamic topology and limited bandwidth of the airborne network make it difficult to provide reliable information interaction services for diverse combat mission of aviation swarm operations. Therefore, it is necessary to identify the elephant flows in the network in real time to optimize the process of traffic control and improve the performance of airborne network. Aiming at this problem, a timeliness-enhanced traffic identification method based on machine learning Bayesian network model is proposed. Firstly, the data flow training subset is obtained by preprocessing the original traffic
36

Mie, Mie Aung, Mon Ko Su, Myat Thuzar Win, and Pan Thaw Su. "Classification of Paddy Types using Naïve Bayesian Classifiers." International Journal of Trend in Scientific Research and Development 3, no. 5 (2019): 1355–59. https://doi.org/10.5281/zenodo.3590797.

Abstract:
Classification is a form of data analysis that can be used to extract models describing important data classes or to predict future data trends. Classification is the process of finding a set of models that describe and distinguish data classes or concepts, for the purpose of being able to use the model to predict the class of objects whose class label is unknown. In classification techniques, the Naïve Bayesian Classifier is one of the simplest probabilistic classifiers. This paper is to study the Naïve Bayesian Classifier and to classify class label of paddy type data usin
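The abstract above applies a naive Bayes classifier to categorical attributes. The sketch below shows training and prediction for categorical data with scikit-learn's CategoricalNB; the toy records and attribute names are hypothetical, not the paddy dataset used in the cited paper.

```python
# Minimal naive Bayes sketch for categorical attributes.
# The encoded toy records below are hypothetical placeholders.
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

# Hypothetical records: [grain length, colour, season] -> paddy type
X_raw = [["long", "white", "wet"],
         ["short", "brown", "dry"],
         ["long", "brown", "wet"],
         ["short", "white", "dry"]]
y = ["type_A", "type_B", "type_A", "type_B"]

enc = OrdinalEncoder()           # map category strings to integer codes
X = enc.fit_transform(X_raw)

clf = CategoricalNB().fit(X, y)
print(clf.predict(enc.transform([["long", "white", "dry"]])))
```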
37

Chinnasamy, Arunkumar, Wing-Kin Sung, and Ankush Mittal. "Protein Structure and Fold Prediction Using Tree-Augmented Naïve Bayesian Classifier." Journal of Bioinformatics and Computational Biology 03, no. 04 (2005): 803–19. http://dx.doi.org/10.1142/s0219720005001302.

Abstract:
Due to the large volume of protein sequence data, computational methods to determine the structure class and the fold class of a protein sequence have become essential. Several techniques based on sequence similarity, Neural Networks, Support Vector Machines (SVMs), etc. have been applied. Since most of these classifiers use binary classifiers for multi-classification, there may be NC2 (i.e., N choose 2) classifiers required. This paper presents a framework using the Tree-Augmented Bayesian Networks (TAN) which performs multi-classification based on the theory of learning Bayesian Networks and using improved fea
38

Kuzmanić Skelin, Ana, Lea Vojković, Dani Mohović, and Damir Zec. "Weight of Evidence Approach to Maritime Accident Risk Assessment Based on Bayesian Network Classifier." Transactions on Maritime Science 10, no. 2 (2021): 330–47. http://dx.doi.org/10.7225/toms.v10.n02.w07.

Abstract:
Probabilistic maritime accident models based on Bayesian Networks are typically built upon the data available in accident records and the data obtained from human experts knowledge on accident. The drawback of such models is that they do not take explicitly into the account the knowledge on non-accidents as would be required in the probabilistic modelling of rare events. Consequently, these models have difficulties with delivering interpretation of influence of risk factors and providing sufficient confidence in the risk assessment scores. In this work, modelling and risk score interpretation,
39

Dastjerdi, Fereshteh R., and Liming Cai. "Augmenting Naïve Bayes Classifiers with k-Tree Topology." Mathematics 13, no. 13 (2025): 2185. https://doi.org/10.3390/math13132185.

Abstract:
The Bayesian network is a directed, acyclic graphical model that can offer a structured description for probabilistic dependencies among random variables. As powerful tools for classification tasks, Bayesian classifiers often require computing joint probability distributions, which can be computationally intractable due to potential full dependencies among feature variables. On the other hand, Naïve Bayes, which presumes zero dependencies among features, trades accuracy for efficiency and often comes with underperformance. As a result, non-zero dependency structures, such as trees, are often u
40

Zhang, Hao, Kai Biao Lin, Wei Weng, Juan Wen, and Chin Ling Chen. "A Bayesian network correlation-based classifier chain algorithm for multilabel learning." International Journal of Computational Science and Engineering 25, no. 4 (2022): 437. http://dx.doi.org/10.1504/ijcse.2022.124551.

41

Chen, Chin Ling, Juan Wen, Wei Weng, Kai Biao Lin, and Hao Zhang. "A Bayesian network correlation-based classifier chain algorithm for multilabel learning." International Journal of Computational Science and Engineering 1, no. 1 (2021): 1. http://dx.doi.org/10.1504/ijcse.2021.10042448.

42

Pandey, Sushant Kumar, Ravi Bhushan Mishra, and Anil Kumar Triphathi. "Software Bug Prediction Prototype Using Bayesian Network Classifier: A Comprehensive Model." Procedia Computer Science 132 (2018): 1412–21. http://dx.doi.org/10.1016/j.procs.2018.05.071.

43

Cai, Zhiqiang, Shudong Sun, Shubin Si, and Bernard Yannou. "Identifying product failure rate based on a conditional Bayesian network classifier." Expert Systems with Applications 38, no. 5 (2011): 5036–43. http://dx.doi.org/10.1016/j.eswa.2010.09.146.

44

Tao, Jianbin, Qingquan Li, Changqing Zhu, and Jili Li. "A hierarchical naive Bayesian network classifier embedded GMM for textural image." International Journal of Applied Earth Observation and Geoinformation 14, no. 1 (2012): 139–48. http://dx.doi.org/10.1016/j.jag.2011.08.012.

45

Tucker, Allan, Veronica Vinciotti, Xiaohui Liu, and David Garway-Heath. "A spatio-temporal Bayesian network classifier for understanding visual field deterioration." Artificial Intelligence in Medicine 34, no. 2 (2005): 163–77. http://dx.doi.org/10.1016/j.artmed.2004.07.004.

46

Wang, Ran, Suhe Ye, Ke Li, and Sam Kwong. "Bayesian network based label correlation analysis for multi-label classifier chain." Information Sciences 554 (April 2021): 256–75. http://dx.doi.org/10.1016/j.ins.2020.12.010.

47

Gao, SiQi, Hua Lou, LiMin Wang, Yang Liu, and Tiehu Fan. "Universal Target Learning: An Efficient and Effective Technique for Semi-Naive Bayesian Learning." Entropy 21, no. 8 (2019): 729. http://dx.doi.org/10.3390/e21080729.

Abstract:
To mitigate the negative effect of classification bias caused by overfitting, semi-naive Bayesian techniques seek to mine the implicit dependency relationships in unlabeled testing instances. By redefining some criteria from information theory, Target Learning (TL) proposes to build for each unlabeled testing instance P the Bayesian Network Classifier BNC_P, which is independent and complementary to BNC_T learned from training data T. In this paper, we extend TL to Universal Target Learning (UTL) to identify redundant correlations between attribute values and maximize the bits encoded in the
48

Ou, Guiliang, Yulin He, Philippe Fournier-Viger, and Joshua Zhexue Huang. "A Novel Mixed-Attribute Fusion-Based Naive Bayesian Classifier." Applied Sciences 12, no. 20 (2022): 10443. http://dx.doi.org/10.3390/app122010443.

Abstract:
The Naive Bayesian classifier (NBC) is a well-known classification model that has a simple structure, low training complexity, excellent scalability, and good classification performances. However, the NBC has two key limitations: (1) it is built upon the strong assumption that condition attributes are independent, which often does not hold in real-life, and (2) the NBC does not handle continuous attributes well. To overcome these limitations, this paper presents a novel approach for NBC construction, called mixed-attribute fusion-based NBC (MAF-NBC). It alleviates the two aforementioned limita
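The abstract above highlights naive Bayes's difficulty with continuous attributes. The sketch below shows two common baselines with scikit-learn (Gaussian class-conditional densities, and discretisation followed by a categorical model); it is not the mixed-attribute fusion (MAF-NBC) method proposed in the cited paper, and the dataset is a placeholder.

```python
# Two common baselines for continuous attributes in naive Bayes:
# (1) Gaussian class-conditional densities, (2) discretise first, then
# model the resulting categories. Neither is the paper's MAF-NBC method.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB, GaussianNB
from sklearn.preprocessing import KBinsDiscretizer

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

print("Gaussian NB:", GaussianNB().fit(X_tr, y_tr).score(X_te, y_te))

disc = KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="quantile").fit(X_tr)
print("Discretised NB:",
      CategoricalNB().fit(disc.transform(X_tr), y_tr)
                     .score(disc.transform(X_te), y_te))
```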
49

Butter, Anja, Thorben Finke, Felicitas Keil, Michael Krämer, and Silvia Manconi. "Classification of Fermi-LAT blazars with Bayesian neural networks." Journal of Cosmology and Astroparticle Physics 2022, no. 04 (2022): 023. http://dx.doi.org/10.1088/1475-7516/2022/04/023.

Abstract:
The use of Bayesian neural networks is a novel approach for the classification of γ-ray sources. We focus on the classification of Fermi-LAT blazar candidates, which can be divided into BL Lacertae objects and Flat Spectrum Radio Quasars. In contrast to conventional dense networks, Bayesian neural networks provide a reliable estimate of the uncertainty of the network predictions. We explore the correspondence between conventional and Bayesian neural networks and the effect of data augmentation. We find that Bayesian neural networks provide a robust classifier with reliable uncertainty
50

Sugahara, Shouta, Koya Kato, and Maomi Ueno. "Learning Bayesian Network Classifiers to Minimize the Class Variable Parameters." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 18 (2024): 20540–49. http://dx.doi.org/10.1609/aaai.v38i18.30039.

Abstract:
This study proposes and evaluates a new Bayesian network classifier (BNC) having an I-map structure with the fewest class variable parameters among all structures for which the class variable has no parent. Moreover, a new learning algorithm to learn our proposed model is presented. The proposed method is guaranteed to obtain the true classification probability asymptotically. Moreover, the method has lower computational costs than those of exact learning BNC using marginal likelihood. Comparison experiments have demonstrated the superior performance of the proposed method.