
Journal articles on the topic 'Decision tree'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Decision tree.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

TOFAN, Cezarina Adina. "Optimization Techniques of Decision Making - Decision Tree." Advances in Social Sciences Research Journal 1, no. 5 (2014): 142–48. http://dx.doi.org/10.14738/assrj.15.437.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Babar, Kiran Nitin. "Performance Evaluation of Decision Trees with Machine Learning Algorithm." International Journal of Scientific Research in Engineering and Management 8, no. 5 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem34179.

Full text
Abstract:
Decision tree learning is a supervised learning approach used in statistics, data mining and machine learning. Decision trees are considered to be one of the most popular approaches for representing classifiers. Researchers from various disciplines such as statistics, machine learning, pattern recognition and data mining have dealt with the issue of growing a decision tree from available data. In machine learning, decision trees are used for classification problems, to categorize objects and gain an understanding of their shared features. Decision trees help in decision-making by representing co…
APA, Harvard, Vancouver, ISO, and other styles
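As a concrete illustration of the supervised classification task the abstract above describes, here is a minimal scikit-learn sketch; the Iris dataset, depth limit, and split ratio are illustrative choices, not taken from the paper:

```python
# Minimal decision tree classification sketch (not from the paper):
# fit a CART-style tree on the Iris dataset and report held-out accuracy.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)  # shallow tree stays interpretable
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```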
3

Sullivan, Colin, Mo Tiwari, and Sebastian Thrun. "MAPTree: Beating “Optimal” Decision Trees with Bayesian Decision Trees." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 8 (2024): 9019–26. http://dx.doi.org/10.1609/aaai.v38i8.28751.

Full text
Abstract:
Decision trees remain one of the most popular machine learning models today, largely due to their out-of-the-box performance and interpretability. In this work, we present a Bayesian approach to decision tree induction via maximum a posteriori inference of a posterior distribution over trees. We first demonstrate a connection between maximum a posteriori inference of decision trees and AND/OR search. Using this connection, we propose an AND/OR search algorithm, dubbed MAPTree, which is able to recover the maximum a posteriori tree. Lastly, we demonstrate the empirical performance of the maximu
APA, Harvard, Vancouver, ISO, and other styles
4

Guidotti, Riccardo, Anna Monreale, Mattia Setzu, and Giulia Volpi. "Generative Model for Decision Trees." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 19 (2024): 21116–24. http://dx.doi.org/10.1609/aaai.v38i19.30104.

Full text
Abstract:
Decision trees are among the most popular supervised models due to their interpretability and knowledge representation resembling human reasoning. Commonly-used decision tree induction algorithms are based on greedy top-down strategies. Although these approaches are known to be an efficient heuristic, the resulting trees are only locally optimal and tend to have overly complex structures. On the other hand, optimal decision tree algorithms attempt to create an entire decision tree at once to achieve global optimality. We place our proposal between these approaches by designing a generative mod
APA, Harvard, Vancouver, ISO, and other styles
5

Naylor, Mike. "Decision Tree." Mathematics Teacher: Learning and Teaching PK-12 113, no. 7 (2020): 612. http://dx.doi.org/10.5951/mtlt.2020.0081.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

BRESLOW, LEONARD A., and DAVID W. AHA. "Simplifying decision trees: A survey." Knowledge Engineering Review 12, no. 01 (1997): 1–40. http://dx.doi.org/10.1017/s0269888997000015.

Full text
Abstract:
Induced decision trees are an extensively-researched solution to classification tasks. For many practical tasks, the trees produced by tree-generation algorithms are not comprehensible to users due to their size and complexity. Although many tree induction algorithms have been shown to produce simpler, more comprehensible trees (or data structures derived from trees) with good classification accuracy, tree simplification has usually been of secondary concern relative to accuracy, and no attempt has been made to survey the literature from the perspective of simplification. We present a framewor
APA, Harvard, Vancouver, ISO, and other styles
7

ZANTEMA, HANS, and HANS L. BODLAENDER. "SIZES OF ORDERED DECISION TREES." International Journal of Foundations of Computer Science 13, no. 03 (2002): 445–58. http://dx.doi.org/10.1142/s0129054102001205.

Full text
Abstract:
Decision tables provide a natural framework for knowledge acquisition and representation in the area of knowledge based information systems. Decision trees provide a standard method for inductive inference in the area of machine learning. In this paper we show how decision tables can be considered as ordered decision trees: decision trees satisfying an ordering restriction on the nodes. Every decision tree can be represented by an equivalent ordered decision tree, but we show that doing so may exponentially blow up sizes, even if the choice of the order is left free. Our main result states tha
APA, Harvard, Vancouver, ISO, and other styles
8

Oo, Aung Nway, and Thin Naing. "Decision Tree Models for Medical Diagnosis." International Journal of Trend in Scientific Research and Development 3, no. 3 (2019): 1697–99. http://dx.doi.org/10.31142/ijtsrd23510.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Ostonov, Azimkhon, and Mikhail Moshkov. "Comparative Analysis of Deterministic and Nondeterministic Decision Trees for Decision Tables from Closed Classes." Entropy 26, no. 6 (2024): 519. http://dx.doi.org/10.3390/e26060519.

Full text
Abstract:
In this paper, we consider classes of decision tables with many-valued decisions closed under operations of the removal of columns, the changing of decisions, the permutation of columns, and the duplication of columns. We study relationships among three parameters of these tables: the complexity of a decision table (if we consider the depth of the decision trees, then the complexity of a decision table is the number of columns in it), the minimum complexity of a deterministic decision tree, and the minimum complexity of a nondeterministic decision tree. We consider the rough classification of
APA, Harvard, Vancouver, ISO, and other styles
10

ZANTEMA, HANS, and HANS L. BODLAENDER. "FINDING SMALL EQUIVALENT DECISION TREES IS HARD." International Journal of Foundations of Computer Science 11, no. 02 (2000): 343–54. http://dx.doi.org/10.1142/s0129054100000193.

Full text
Abstract:
Two decision trees are called decision equivalent if they represent the same function, i.e., they yield the same result for every possible input. We prove that given a decision tree and a number, to decide if there is a decision equivalent decision tree of size at most that number is NP-complete. As a consequence, finding a decision tree of minimal size that is decision equivalent to a given decision tree is an NP-hard problem. This result differs from the well-known result of NP-hardness of finding a decision tree of minimal size that is consistent with a given training set. Instead our resul
APA, Harvard, Vancouver, ISO, and other styles
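To make the notion of decision equivalence concrete: two trees are equivalent when they agree on every input, and the naive check below enumerates all 2^n assignments, which hints at why finding small equivalent trees is hard. This toy encoding is ours, not the paper's:

```python
# Toy illustration of decision equivalence (not the paper's algorithm):
# two decision trees over boolean inputs are equivalent iff they agree on
# every assignment. Brute force takes 2^n evaluations.
from itertools import product

# A tree is either a leaf ("leaf", value) or a node ("node", var, low, high).
def evaluate(tree, x):
    if tree[0] == "leaf":
        return tree[1]
    _, var, low, high = tree
    return evaluate(high if x[var] else low, x)

def equivalent(t1, t2, n_vars):
    return all(evaluate(t1, x) == evaluate(t2, x)
               for x in product([0, 1], repeat=n_vars))

# f(x0, x1) = x0 AND x1, written as two structurally different trees.
t1 = ("node", 0, ("leaf", 0), ("node", 1, ("leaf", 0), ("leaf", 1)))
t2 = ("node", 1, ("leaf", 0), ("node", 0, ("leaf", 0), ("leaf", 1)))
print(equivalent(t1, t2, n_vars=2))  # True: same function, different trees
```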
11

Cockett, J. R. B. "Decision Expression Optimization." Fundamenta Informaticae 10, no. 1 (1987): 93–114. http://dx.doi.org/10.3233/fi-1987-10107.

Full text
Abstract:
A basic concern when using decision trees for the solution of taxonomic or similar problems is their efficiency. Often the information that is required to completely optimize a tree is simply not available. This is especially the case when a criterion based on probabilities is used. It is shown how it is often possible, despite the absence of this information, to improve the design of the tree. The approach is based on algebraic methods for manipulating decision trees and the identification of some particularly desirable forms.
APA, Harvard, Vancouver, ISO, and other styles
12

Rautenberg, Tamlyn, Annette Gerritsen, and Martin Downes. "Health Economic Decision Tree Models of Diagnostics for Dummies: A Pictorial Primer." Diagnostics 10, no. 3 (2020): 158. http://dx.doi.org/10.3390/diagnostics10030158.

Full text
Abstract:
Health economics is a discipline of economics applied to health care. One method used in health economics is decision tree modelling, which extrapolates the cost and effectiveness of competing interventions over time. Such decision tree models are the basis of reimbursement decisions in countries using health technology assessment for decision making. In many instances, these competing interventions are diagnostic technologies. Despite a wealth of excellent resources describing the decision analysis of diagnostics, two critical errors persist: not including diagnostic test accuracy in the stru
APA, Harvard, Vancouver, ISO, and other styles
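A worked miniature of the point the primer makes: a diagnostic decision tree must branch on test accuracy (sensitivity and specificity), not just on disease status. All probabilities and costs below are hypothetical illustration values, not figures from the article:

```python
# Sketch of a diagnostic decision tree that includes test accuracy.
# Every number here is hypothetical, for illustration only.
prevalence  = 0.10   # P(disease)
sensitivity = 0.90   # P(test positive | disease)
specificity = 0.80   # P(test negative | no disease)

cost_test, cost_treat, cost_missed = 50.0, 500.0, 5000.0  # hypothetical costs

# Branch probabilities of the tree: test result crossed with disease state.
p_tp = prevalence * sensitivity
p_fn = prevalence * (1 - sensitivity)
p_fp = (1 - prevalence) * (1 - specificity)

# Expected cost: everyone is tested; positives (true and false) are treated;
# false negatives incur the cost of a missed diagnosis.
expected_cost = cost_test + (p_tp + p_fp) * cost_treat + p_fn * cost_missed
print(f"expected cost per patient: {expected_cost:.2f}")
```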
13

Li, Jiawei, Yiming Li, Xingchun Xiang, Shu-Tao Xia, Siyi Dong, and Yun Cai. "TNT: An Interpretable Tree-Network-Tree Learning Framework using Knowledge Distillation." Entropy 22, no. 11 (2020): 1203. http://dx.doi.org/10.3390/e22111203.

Full text
Abstract:
Deep Neural Networks (DNNs) usually work in an end-to-end manner. This makes the trained DNNs easy to use, but they remain an ambiguous decision process for every test case. Unfortunately, the interpretability of decisions is crucial in some scenarios, such as medical or financial data mining and decision-making. In this paper, we propose a Tree-Network-Tree (TNT) learning framework for explainable decision-making, where the knowledge is alternately transferred between the tree model and DNNs. Specifically, the proposed TNT learning framework exerts the advantages of different models at differ
APA, Harvard, Vancouver, ISO, and other styles
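The following is a generic distillation sketch in the same spirit: fit a small neural network, then train a decision tree to mimic its predictions. It is a simplification for illustration, not the paper's TNT framework, and the data and model sizes are arbitrary:

```python
# Generic knowledge-distillation sketch (NOT the paper's TNT framework):
# a decision tree "student" learns from a neural network "teacher's" labels.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                    random_state=0).fit(X, y)

# The student tree is fit to the network's predictions, not the ground truth.
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, net.predict(X))
print("agreement with teacher:", accuracy_score(net.predict(X), tree.predict(X)))
```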
14

Zhang, Hui. "The Analysis of English Sentence Components Based on Decision Tree Classification Algorithm." Highlights in Science, Engineering and Technology 23 (December 3, 2022): 317–20. http://dx.doi.org/10.54097/hset.v23i.3617.

Full text
Abstract:
The decision tree is an important classification method in data mining. It is a predictive analysis model expressed in the form of a tree structure (including binary trees and multiway trees). The decision tree method is a general method for approximating classification functions. It is an algorithm commonly used in predictive models to find potentially valuable information by purposefully classifying a large amount of data. In this article, the author analyzes English sentence components based on the decision tree classification algorithm. The author starts with the de…
APA, Harvard, Vancouver, ISO, and other styles
15

Yun, Jooyeol, Jun won Seo, and Taeseon Yoon. "Fuzzy Decision Tree." International Journal of Fuzzy Logic Systems 4, no. 3 (2014): 7–11. http://dx.doi.org/10.5121/ijfls.2014.4302.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Manwani, N., and P. S. Sastry. "Geometric Decision Tree." IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 42, no. 1 (2012): 181–92. http://dx.doi.org/10.1109/tsmcb.2011.2163392.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Zhou, Zhi-Hua, and Zhao-Qian Chen. "Hybrid decision tree." Knowledge-Based Systems 15, no. 8 (2002): 515–28. http://dx.doi.org/10.1016/s0950-7051(02)00038-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Koodiaroff, Sally. "Oncology Decision Tree." Collegian 7, no. 3 (2000): 34–36. http://dx.doi.org/10.1016/s1322-7696(08)60375-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Hayes, Karen W., and Becky Wojcik. "Decision Tree Structure." Physical Therapy 69, no. 12 (1989): 1120–22. http://dx.doi.org/10.1093/ptj/69.12.1120.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Cockett, J. R. B., and J. A. Herrera. "Decision tree reduction." Journal of the ACM 37, no. 4 (1990): 815–42. http://dx.doi.org/10.1145/96559.96576.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

López-Chau, Asdrúbal, Jair Cervantes, Lourdes López-García, and Farid García Lamont. "Fisher’s decision tree." Expert Systems with Applications 40, no. 16 (2013): 6283–91. http://dx.doi.org/10.1016/j.eswa.2013.05.044.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Maazouzi, Faiz, and Halima Bahi. "Using multi decision tree technique to improving decision tree classifier." International Journal of Business Intelligence and Data Mining 7, no. 4 (2012): 274. http://dx.doi.org/10.1504/ijbidm.2012.051712.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Vidal, Joseph, Spriha Jha, Zhenyuan Liang, Ethan Delgado, Bereket Siraw Deneke, and Dennis Shasha. "Dynamic Decision Trees." Knowledge 4, no. 4 (2024): 506–42. http://dx.doi.org/10.3390/knowledge4040027.

Full text
Abstract:
Knowledge comes in various forms: scientific, artistic, legal, and many others. For most non-computer scientists, it is far easier to express their knowledge in text than in programming code. The dynamic decision tree system is a system for supporting the authoring of expertise in text form and navigation via an interface that limits the cognitive load on the reader. Specifically, as the reader answers questions, relevant tree nodes appear and irrelevant ones disappear. Searching by a keyword can help to navigate the tree. Database calls bring in information from external datasets. Links bring
APA, Harvard, Vancouver, ISO, and other styles
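A toy version of the navigation idea, answering questions to walk down a tree of text nodes; this minimal dictionary-based sketch is ours and does not reflect the authors' system or its database and search features:

```python
# Toy question-answering tree navigation: only the nodes along the reader's
# answers are visited. A minimal sketch, not the authors' system.
TREE = {
    "question": "Is the document legal text?",
    "yes": {"answer": "Route to the legal-knowledge subtree."},
    "no": {
        "question": "Is it scientific text?",
        "yes": {"answer": "Route to the scientific subtree."},
        "no": {"answer": "Route to the general subtree."},
    },
}

def navigate(node, answers):
    """Follow pre-recorded yes/no answers down the tree."""
    for a in answers:
        if "answer" in node:
            break
        node = node[a]
    # Return the reached answer, or the next pending question.
    return node.get("answer", node.get("question"))

print(navigate(TREE, ["no", "yes"]))  # -> Route to the scientific subtree.
```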
24

Hajjej, Fahima, Manal Abdullah Alohali, Malek Badr, and Md Adnan Rahman. "A Comparison of Decision Tree Algorithms in the Assessment of Biomedical Data." BioMed Research International 2022 (July 7, 2022): 1–9. http://dx.doi.org/10.1155/2022/9449497.

Full text
Abstract:
By comparing the performance of various tree algorithms, we can determine which one is most useful for analyzing biomedical data. In artificial intelligence, decision trees are a classification model known for their visual aid in making decisions. WEKA software will evaluate biological data from real patients to see how well the decision tree classification algorithm performs. Another goal of this comparison is to assess whether or not decision trees can serve as an effective tool for medical diagnosis in general. In doing so, we will be able to see which algorithms are the most efficient and
APA, Harvard, Vancouver, ISO, and other styles
25

Sharma, Nirmla, and Sameera Iqbal Muhmmad Iqbal. "Applying Decision Tree Algorithm Classification and Regression Tree (CART) Algorithm to Gini Techniques Binary Splits." International Journal of Engineering and Advanced Technology 12, no. 5 (2023): 77–81. http://dx.doi.org/10.35940/ijeat.e4195.0612523.

Full text
Abstract:
Decision tree analysis is a predictive modelling tool used across many fields. A decision tree is constructed through an algorithmic technique that splits the dataset in different ways depending on various conditions. Decision trees are among the most powerful algorithms that fall under the umbrella of supervised learning. Although decision trees appear simple and natural, there is nothing simple about how the algorithm decides on splits and how tree pruning happens. The first thing to appreciate about decision trees is that they split the predictor space, i.e.…
APA, Harvard, Vancouver, ISO, and other styles
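For reference, the core CART computation the abstract discusses — Gini impurity and an exhaustive search over binary split thresholds — looks like this; a minimal sketch with made-up data, not code from the paper:

```python
# Gini impurity and an exhaustive binary-split search on one feature,
# the basic computation behind CART-style trees.
import numpy as np

def gini(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_binary_split(x, y):
    """Return (threshold, weighted Gini) minimizing impurity of the halves."""
    best = (None, float("inf"))
    for t in np.unique(x)[:-1]:          # candidate thresholds between values
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best[1]:
            best = (t, score)
    return best

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_binary_split(x, y))  # splits cleanly at 3.0 with weighted Gini 0.0
```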
26

Sharma, Nirmla, and Sameera Iqbal Muhmmad Iqbal. "Applying Decision Tree Algorithm Classification and Regression Tree (CART) Algorithm to Gini Techniques Binary Splits." International Journal of Engineering and Advanced Technology (IJEAT) 12, no. 5 (2023): 77–81. https://doi.org/10.35940/ijeat.E4195.0612523.

Full text
Abstract:
Decision tree analysis is a predictive modelling tool used across many fields. A decision tree is constructed through an algorithmic technique that splits the dataset in different ways depending on various conditions. Decision trees are among the most powerful algorithms that fall under the umbrella of supervised learning. Although decision trees appear simple and natural, there is nothing simple about how the algorithm decides on splits and how tree pruning happens. The first thing to appreciate about decision trees is that they spl…
APA, Harvard, Vancouver, ISO, and other styles
27

Tetteh, Evans Teiko, and Beata Zielosko. "Greedy Algorithm for Deriving Decision Rules from Decision Tree Ensembles." Entropy 27, no. 1 (2025): 35. https://doi.org/10.3390/e27010035.

Full text
Abstract:
This study introduces a greedy algorithm for deriving decision rules from decision tree ensembles, targeting enhanced interpretability and generalization in distributed data environments. Decision rules, known for their transparency, provide an accessible method for knowledge extraction from data, facilitating decision-making processes across diverse fields. Traditional decision tree algorithms, such as CART and ID3, are employed to induce decision trees from bootstrapped datasets, which represent distributed data sources. Subsequently, a greedy algorithm is applied to derive decision rules th
APA, Harvard, Vancouver, ISO, and other styles
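A rough sketch of this pipeline's starting point, using scikit-learn utilities rather than the paper's method: induce trees on bootstrapped data, then read each tree off as rules (the greedy rule-selection step described in the abstract would come after this). It assumes a recent scikit-learn where BaggingClassifier takes an `estimator` argument:

```python
# Extract human-readable rules from trees in a bagged ensemble.
# Illustrates the setting only; the rule-selection step is the paper's.
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
ensemble = BaggingClassifier(
    estimator=DecisionTreeClassifier(max_depth=2, random_state=0),
    n_estimators=3, random_state=0).fit(X, y)

# Each bootstrapped tree yields its own rule set; a greedy pass would then
# filter and merge these into a compact rule list.
for i, tree in enumerate(ensemble.estimators_):
    print(f"--- rules from tree {i} ---")
    print(export_text(tree))
```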
28

Aisyah, Siti. "LOAN STATUS PREDICTION USING DECISION TREE CLASSIFIER." Power Elektronik: Jurnal Orang Elektro 13, no. 1 (2024): 68–70. http://dx.doi.org/10.30591/polektro.v12i3.6591.

Full text
Abstract:
This paper investigates the effectiveness of the Decision Tree Classifier in predicting loan status, a critical task in the financial sector. The study utilizes a dataset containing various attributes of loan applicants such as income, credit score, employment status, and loan amount. The dataset is preprocessed to handle missing values and categorical variables. Feature importance is analyzed to understand the key factors influencing loan approval decisions. A Decision Tree Classifier model is trained and evaluated using performance metrics such as accuracy, precision, recall, and F1-score. T
APA, Harvard, Vancouver, ISO, and other styles
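A minimal sketch of the training-and-evaluation loop the abstract outlines, with synthetic, imbalanced data standing in for the loan dataset; the metrics and model match the description, everything else is illustrative:

```python
# Train a decision tree and report accuracy, precision, recall, and F1.
# Synthetic data is a stand-in for the paper's loan-applicant dataset.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=8, weights=[0.7, 0.3],
                           random_state=0)  # imbalanced, approval-like labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall   :", recall_score(y_te, pred))
print("F1       :", f1_score(y_te, pred))
```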
29

Yang, Bin-Bin, Song-Qing Shen, and Wei Gao. "Weighted Oblique Decision Trees." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5621–27. http://dx.doi.org/10.1609/aaai.v33i01.33015621.

Full text
Abstract:
Decision trees have attracted much attention during the past decades. Previous decision trees include axis-parallel and oblique decision trees; both of them try to find the best splits via exhaustive search or heuristic algorithms in each iteration. Oblique decision trees generally simplify the tree structure and achieve better performance, but they come with higher computational cost, as well as initialization with the best axis-parallel splits. This work presents the Weighted Oblique Decision Tree (WODT) based on continuous optimization with random initialization. We consider different we…
APA, Harvard, Vancouver, ISO, and other styles
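To see why oblique splits can simplify a tree: an axis-parallel test uses a single feature, while an oblique test thresholds a weighted combination w·x. The sketch below contrasts the two on a diagonal boundary; it is a conceptual illustration, not the WODT optimization:

```python
# Axis-parallel vs. oblique splits on a diagonal class boundary.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # the true boundary is diagonal

def split_error(pred):
    # Error of a binary split, allowing either side to take either label.
    err = np.mean(pred != y)
    return min(err, 1 - err)

axis_parallel = X[:, 0] > 0               # single-feature test, as in CART
oblique = X @ np.array([1.0, 1.0]) > 0    # hyperplane test: w.x > 0

print("axis-parallel split error:", split_error(axis_parallel))
print("oblique split error      :", split_error(oblique))  # 0.0 by construction
```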
30

Jiang, Daniel R., Lina Al-Kanj, and Warren B. Powell. "Optimistic Monte Carlo Tree Search with Sampled Information Relaxation Dual Bounds." Operations Research 68, no. 6 (2020): 1678–97. http://dx.doi.org/10.1287/opre.2019.1939.

Full text
Abstract:
In the paper, “Optimistic Monte Carlo Tree Search with Sampled Information Relaxation Dual Bounds,” the authors propose an extension to Monte Carlo tree search that uses the idea of “sampling the future” to produce noisy upper bounds on nodes in the decision tree. These upper bounds can help guide the tree expansion process and produce decision trees that are deeper rather than wider, in effect concentrating computation toward more useful parts of the state space. The algorithm’s effectiveness is illustrated in a ride-sharing setting, where a driver/vehicle needs to make dynamic decisions rega
APA, Harvard, Vancouver, ISO, and other styles
31

Aung, Nway Oo, and Naing Thin. "Decision Tree Models for Medical Diagnosis." International Journal of Trend in Scientific Research and Development 3, no. 3 (2019): 1697–99. https://doi.org/10.31142/ijtsrd23510.

Full text
Abstract:
Data mining techniques are rapidly being developed for many applications. In recent years, data mining in healthcare has become an emerging field for the research and development of intelligent medical diagnosis systems. Classification is a major research topic in data mining, and decision trees are popular methods for classification. In this paper, several decision tree classifiers are used for the diagnosis of medical datasets. AD Tree, J48, NB Tree, Random Tree and Random Forest algorithms are used for the analysis of medical datasets. Heart disease, diabetes and hepatitis disorder datasets are used to test the de…
APA, Harvard, Vancouver, ISO, and other styles
32

Parlindungan, and Hari Supriadi. "Implementation Decision Tree Algorithm for Ecommerce Website." International Journal of Psychosocial Rehabilitation 24, no. 02 (2020): 3611–14. http://dx.doi.org/10.37200/ijpr/v24i2/pr200682.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Naveen Kumar, Nallamothu. "Model of Decision Tree for Email Classification." International Journal of Science and Research (IJSR) 11, no. 7 (2022): 1502–5. http://dx.doi.org/10.21275/sr22722110223.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Muhammad Sani, Anas, Ahmad Salihu BenMusa, and Muhammad Haladu. "In-Depth Study of Decision Tree Model." International Journal of Science and Research (IJSR) 10, no. 11 (2021): 705–9. https://doi.org/10.21275/mr211102051237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Pathan, Shabana, and Sanjeev Kumar Sharma. "Design an Optimal Decision Tree based Algorithm to Improve Model Prediction Performance." International Journal on Recent and Innovation Trends in Computing and Communication 11, no. 6 (2023): 127–33. http://dx.doi.org/10.17762/ijritcc.v11i6.7295.

Full text
Abstract:
Performance of decision trees is assessed by prediction accuracy for unobserved occurrences. In order to generate optimised decision trees with high classification accuracy and smaller decision trees, this study will pre-process the data. In this study, some decision tree components are addressed and enhanced. The algorithms should produce precise and ideal decision trees in order to increase prediction performance. Additionally, it hopes to create a decision tree algorithm with a tiny global footprint and excellent forecast accuracy. The typical decision tree-based technique was created for c
APA, Harvard, Vancouver, ISO, and other styles
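One standard mechanism for the size-versus-accuracy trade-off the abstract targets is cost-complexity pruning, shown below with scikit-learn; this illustrates the general idea, not the algorithm proposed in the paper:

```python
# Trade tree size against accuracy with minimal cost-complexity pruning.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)
for alpha in path.ccp_alphas[::5]:        # sample a few pruning strengths
    clf = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)
    print(f"alpha={alpha:.4f}  leaves={clf.get_n_leaves()}  "
          f"test acc={clf.score(X_te, y_te):.3f}")
```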
36

Kang, Donggil, WenXing Yu, and HyungJun Cho. "Decision Tree for Mode Estimation." Korean Data Analysis Society 25, no. 3 (2023): 903–11. http://dx.doi.org/10.37727/jkdas.2023.25.3.903.

Full text
Abstract:
Decision trees are one of the data mining techniques that make predictions by recursively partitioning data structures based on split rules. Since the analysis results can be understood through the tree structure, it has the advantage of having high interpretation power as well as predictive power. In addition, it is used in many fields because it is able to identify nonlinear relationships between response and predictor variables. However, if the purpose of it is to predict the mode of the response variable, there is a limitation in that the previously proposed decision tree cannot be applied
APA, Harvard, Vancouver, ISO, and other styles
37

Bechler-Speicher, Maya, Amir Globerson, and Ran Gilad-Bachrach. "TREE-G: Decision Trees Contesting Graph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 10 (2024): 11032–42. http://dx.doi.org/10.1609/aaai.v38i10.28979.

Full text
Abstract:
When dealing with tabular data, models based on decision trees are a popular choice due to their high accuracy on these data types, their ease of application, and explainability properties. However, when it comes to graph-structured data, it is not clear how to apply them effectively, in a way that incorporates the topological information with the tabular data available on the vertices of the graph. To address this challenge, we introduce TREE-G. TREE-G modifies standard decision trees, by introducing a novel split function that is specialized for graph data. Not only does this split functio…
APA, Harvard, Vancouver, ISO, and other styles
38

Dwaraka Srihith, I., P. Vijaya Lakshmi, A. David Donald, T. Aditya Sai Srinivas, and G. Thippanna. "A Forest of Possibilities: Decision Trees and Beyond." Journal of Advancement in Parallel Computing 6, no. 3 (2023): 29–37. https://doi.org/10.5281/zenodo.8372196.

Full text
Abstract:
Decision trees are fundamental in machine learning due to their interpretability and versatility. They are hierarchical structures used for classification and regression tasks, making decisions by recursively splitting data based on features. This abstract explores decision tree algorithms, tree construction, pruning to prevent overfitting, and ensemble methods like Random Forests. Additionally, it covers handling categorical data, imbalanced datasets, missing values, and hyperparameter tuning. Decision trees are valuable for feature selection and model interpretability. However, they have…
APA, Harvard, Vancouver, ISO, and other styles
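The abstract's arc from single trees to ensembles can be reproduced in a few lines; the dataset and hyperparameters below are arbitrary illustrative choices:

```python
# A single depth-limited tree versus a Random Forest on the same data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0)

print("tree   CV accuracy:", cross_val_score(tree, X, y, cv=5).mean())
print("forest CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())
```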
39

Cai, Yuliang, Huaguang Zhang, Qiang He, and Shaoxin Sun. "New classification technique: fuzzy oblique decision tree." Transactions of the Institute of Measurement and Control 41, no. 8 (2018): 2185–95. http://dx.doi.org/10.1177/0142331218774614.

Full text
Abstract:
Based on axiomatic fuzzy set (AFS) theory and fuzzy information entropy, a novel fuzzy oblique decision tree (FODT) algorithm is proposed in this paper. Traditional axis-parallel decision trees only consider a single feature at each non-leaf node, while oblique decision trees partition the feature space with an oblique hyperplane. By contrast, the FODT takes dynamic mining fuzzy rules as a decision function. The main idea of the FODT is to use these fuzzy rules to construct leaf nodes for each class in each layer of the tree; the samples that cannot be covered by the fuzzy rules are then put i
APA, Harvard, Vancouver, ISO, and other styles
40

Tsehay, Admassu Assegie, Kumar Napa Komal, Thulasi Thiyagu, Kalyan Kumar Angati, Jeyanthiran Thiruvarasu Vasantha Priya Maran, and Dhamodaran Vigneswari. "Scalability and performance of decision tree for cardiovascular disease prediction." IAES International Journal of Artificial Intelligence (IJ-AI) 13, no. 3 (2024): 2540–45. https://doi.org/10.11591/ijai.v13.i3.pp2540-2545.

Full text
Abstract:
As one of the most common types of disease, cardiovascular disease is a serious health concern worldwide. Early detection is crucial for successful treatment and improved survival rates. The decision tree is a robust classifier for predicting the risk of cardiovascular disease and getting insights that would assist in making clinical decisions. However, selecting a better model for cardiovascular disease could be challenging due to scalability issues. Hence, this study examines the scalability and performance of decision trees for cardiovascular disease prediction. The study evaluated the perf
APA, Harvard, Vancouver, ISO, and other styles
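A simple way to probe the scalability question the abstract raises is to time tree induction as the sample size grows; the synthetic data below is a stand-in for the paper's cardiovascular dataset:

```python
# Crude scalability probe: training time of a decision tree vs. sample size.
import time
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

for n in (1_000, 10_000, 100_000):
    X, y = make_classification(n_samples=n, n_features=20, random_state=0)
    start = time.perf_counter()
    DecisionTreeClassifier(random_state=0).fit(X, y)
    print(f"n={n:>7,}  fit time: {time.perf_counter() - start:.3f}s")
```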
41

Hernández, Víctor Adrián Sosa, Raúl Monroy, Miguel Angel Medina-Pérez, Octavio Loyola-González, and Francisco Herrera. "A Practical Tutorial for Decision Tree Induction." ACM Computing Surveys 54, no. 1 (2021): 1–38. http://dx.doi.org/10.1145/3429739.

Full text
Abstract:
Experts from different domains have resorted to machine learning techniques to produce explainable models that support decision-making. Among existing techniques, decision trees have been useful in many application domains for classification. Decision trees can make decisions in a language that is closer to that of the experts. Many researchers have attempted to create better decision tree models by improving the components of the induction algorithm. One of the main components that have been studied and improved is the evaluation measure for candidate splits. In this article, we introduce a t
APA, Harvard, Vancouver, ISO, and other styles
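For reference, the two split-evaluation measures most induction algorithms build on (standard definitions from the literature, not notation taken from this tutorial):

```latex
% Node impurity for a node t with class proportions p_1, ..., p_K:
\[
\mathrm{Gini}(t) = 1 - \sum_{k=1}^{K} p_k^2,
\qquad
\mathrm{Entropy}(t) = -\sum_{k=1}^{K} p_k \log_2 p_k .
\]
% A candidate split s of node t (n samples) into children t_L (n_L samples)
% and t_R (n_R samples) is scored by the impurity decrease it achieves:
\[
\Delta i(s, t) = i(t) - \frac{n_L}{n}\, i(t_L) - \frac{n_R}{n}\, i(t_R).
\]
```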
42

Murphy, P. M., and M. J. Pazzani. "Exploring the Decision Forest: An Empirical Investigation of Occam's Razor in Decision Tree Induction." Journal of Artificial Intelligence Research 1 (March 1, 1994): 257–75. http://dx.doi.org/10.1613/jair.41.

Full text
Abstract:
We report on a series of experiments in which all decision trees consistent with the training data are constructed. These experiments were run to gain an understanding of the properties of the set of consistent decision trees and the factors that affect the accuracy of individual trees. In particular, we investigated the relationship between the size of a decision tree consistent with some training data and the accuracy of the tree on test data. The experiments were performed on a massively parallel Maspar computer. The results of the experiments on several artificial and two real world proble
APA, Harvard, Vancouver, ISO, and other styles
43

Chang, Namsik, and Olivia R. Liu Sheng. "Decision-Tree-Based Knowledge Discovery: Single- vs. Multi-Decision-Tree Induction." INFORMS Journal on Computing 20, no. 1 (2008): 46–54. http://dx.doi.org/10.1287/ijoc.1060.0215.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

McTavish, Hayden, Chudi Zhong, Reto Achermann, et al. "Fast Sparse Decision Tree Optimization via Reference Ensembles." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 9 (2022): 9604–13. http://dx.doi.org/10.1609/aaai.v36i9.21194.

Full text
Abstract:
Sparse decision tree optimization has been one of the most fundamental problems in AI since its inception and is a challenge at the core of interpretable machine learning. Sparse decision tree optimization is computationally hard, and despite steady effort since the 1960's, breakthroughs have been made on the problem only within the past few years, primarily on the problem of finding optimal sparse decision trees. However, current state-of-the-art algorithms often require impractical amounts of computation time and memory to find optimal or near-optimal trees for some real-world datasets, part
APA, Harvard, Vancouver, ISO, and other styles
45

Luna, José Marcio, Efstathios D. Gennatas, Lyle H. Ungar, et al. "Building more accurate decision trees with the additive tree." Proceedings of the National Academy of Sciences 116, no. 40 (2019): 19887–93. http://dx.doi.org/10.1073/pnas.1816748116.

Full text
Abstract:
The expansion of machine learning to high-stakes application domains such as medicine, finance, and criminal justice, where making informed decisions requires clear understanding of the model, has increased the interest in interpretable machine learning. The widely used Classification and Regression Trees (CART) have played a major role in health sciences, due to their simple and intuitive explanation of predictions. Ensemble methods like gradient boosting can improve the accuracy of decision trees, but at the expense of the interpretability of the generated model. Additive models, such as tho
APA, Harvard, Vancouver, ISO, and other styles
46

Kutikuppala, Saikiran. "Decision Tree Learning Based Feature Selection and Evaluation for Image Classification." International Journal for Research in Applied Science and Engineering Technology 11, no. 6 (2023): 2668–74. http://dx.doi.org/10.22214/ijraset.2023.54035.

Full text
Abstract:
The problem statement focuses on feature evaluation and selection for image classification using decision tree learning. The objective is to identify the most significant features in an image dataset and train a decision tree classifier using these selected features. The accuracy of an image classifier heavily relies on the quality and relevance of the features used to represent the images. Hence, it is crucial to identify the most important features and eliminate the irrelevant ones to enhance the classifier's accuracy. To implement this approach, we can utilize scikit-learn, a popu…
APA, Harvard, Vancouver, ISO, and other styles
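A compact sketch of the scikit-learn approach the abstract names: rank features by a tree's impurity-based importances, keep the strongest, and retrain. The digits dataset and mean-importance threshold are illustrative substitutions for the paper's image data:

```python
# Tree-based feature selection, then retraining on the reduced feature set.
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Keep features whose tree importance exceeds the mean importance.
selector = SelectFromModel(DecisionTreeClassifier(random_state=0),
                           threshold="mean").fit(X_tr, y_tr)
X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

clf = DecisionTreeClassifier(random_state=0).fit(X_tr_sel, y_tr)
print("features kept:", X_tr_sel.shape[1], "of", X_tr.shape[1])
print("test accuracy:", clf.score(X_te_sel, y_te))
```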
47

LAST, MARK, ODED MAIMON, and EINAT MINKOV. "IMPROVING STABILITY OF DECISION TREES." International Journal of Pattern Recognition and Artificial Intelligence 16, no. 02 (2002): 145–59. http://dx.doi.org/10.1142/s0218001402001599.

Full text
Abstract:
Decision-tree algorithms are known to be unstable: small variations in the training set can result in different trees and different predictions for the same validation examples. Both accuracy and stability can be improved by learning multiple models from bootstrap samples of training data, but the "meta-learner" approach makes the extracted knowledge hardly interpretable. In the following paper, we present the Info-Fuzzy Network (IFN), a novel information-theoretic method for building stable and comprehensible decision-tree models. The stability of the IFN algorithm is ensured by restricting t
APA, Harvard, Vancouver, ISO, and other styles
48

She, Wei, Hong Li, Guo Qing Yu, and Rui Deng. "Two-Stage Constructing Hyper-Plane for Each Test Node of Decision Tree." Applied Mechanics and Materials 26-28 (June 2010): 776–79. http://dx.doi.org/10.4028/www.scientific.net/amm.26-28.776.

Full text
Abstract:
How to construct an appropriate split hyperplane at each test node is the key to building decision trees. Unlike a univariate decision tree, a multivariate (oblique) decision tree can find a hyperplane that is not orthogonal to the feature axes. In this paper, we re-explain the process of building test nodes in terms of geometry. Based on this, we propose a method of learning the hyperplane in two stages. The tree (TSDT) induced in this way keeps the interpretability of univariate decision trees and the trait of multivariate decision trees of finding oblique hyperplanes. The te…
APA, Harvard, Vancouver, ISO, and other styles
49

Wang, Zijun, and Keke Gai. "Decision Tree-Based Federated Learning: A Survey." Blockchains 2, no. 1 (2024): 40–60. http://dx.doi.org/10.3390/blockchains2010003.

Full text
Abstract:
Federated learning (FL) has garnered significant attention as a novel machine learning technique that enables collaborative training among multiple parties without exposing raw local data. In comparison to traditional neural networks or linear models, decision tree models offer higher simplicity and interpretability. The integration of FL technology with decision tree models holds immense potential for performance enhancement and privacy improvement. One current challenge is to identify methods for training and prediction of decision tree models in the FL environment. This survey addresses thi
APA, Harvard, Vancouver, ISO, and other styles
50

Heshmatol Vaezin, S. M., J. L. Peyron, and F. Lecocq. "A simple generalization of the Faustmann formula to tree level." Canadian Journal of Forest Research 39, no. 4 (2009): 699–711. http://dx.doi.org/10.1139/x08-202.

Full text
Abstract:
The economic decision model serving as an objective function in forest economics was conceived originally by Faustmann at the stand level. Nevertheless, the tree level seems to be an appropriate scale for analysis, especially for harvesting decisions and certain estimations both at tree and stand levels. However, the Faustmann formula cannot be directly applied to the tree level. The present research has provided certain tree-level formulations of the Faustmann formula, including, in particular, tree expectation value (TEV) and land expectation value (LEV). TEV and tree-level LEV formulas were
APA, Harvard, Vancouver, ISO, and other styles
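For context, the classical stand-level formula that this article generalizes to the tree level (the standard textbook form from forest economics, not the paper's TEV and tree-level LEV derivations):

```latex
% Stand-level Faustmann land expectation value (LEV): net present value of an
% infinite series of identical rotations of length T, with harvest revenue
% R(T), regeneration cost c at the start of each rotation, and continuous
% discount rate r.
\[
\mathrm{LEV} = \frac{R(T)\, e^{-rT} - c}{1 - e^{-rT}}
\]
```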