Ready-made bibliography on the topic "Decision tree"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other citation styles


See lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Decision tree".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a scholarly publication in .pdf format and read its abstract online, whenever the relevant details are available in the metadata.

Journal articles on the topic "Decision tree"

1

Tofan, Cezarina Adina. "Optimization Techniques of Decision Making - Decision Tree." Advances in Social Sciences Research Journal 1, no. 5 (2014): 142–48. http://dx.doi.org/10.14738/assrj.15.437.

2

Babar, Kiran Nitin. "Performance Evaluation of Decision Trees with Machine Learning Algorithm." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 05 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem34179.

Abstract:
Decision tree learning is a supervised learning approach used in statistics, data mining, and machine learning. Decision trees are considered one of the most popular approaches for representing classifiers, and researchers from disciplines such as statistics, machine learning, pattern recognition, and data mining have dealt with the issue of growing a decision tree from available data. In machine learning, decision trees are used for classification problems, categorizing objects to gain an understanding of shared features. Decision trees help in decision-making by representing…
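The abstract above describes decision trees categorizing objects for classification. As a toy illustration (hypothetical data, not from the cited paper), the simplest possible decision tree is a depth-1 "decision stump" that picks the single threshold minimizing misclassifications:

```python
# Toy sketch, not from the cited work: fit a depth-1 decision tree
# ("decision stump") on one feature by exhaustive threshold search.

def stump_error(xs, ys, threshold):
    """Number of misclassifications when predicting class 1 for x > threshold."""
    return sum((x > threshold) != y for x, y in zip(xs, ys))

def fit_stump(xs, ys):
    """Try each observed value as a threshold; keep the one with fewest errors."""
    candidates = sorted(set(xs))
    return min(candidates, key=lambda t: stump_error(xs, ys, t))

xs = [1.0, 2.0, 3.0, 8.0, 9.0, 10.0]
ys = [0, 0, 0, 1, 1, 1]
t = fit_stump(xs, ys)
print(t, stump_error(xs, ys, t))  # → 3.0 0 (a perfect split on this toy data)
```

Full tree induction applies this kind of search recursively, splitting each resulting subset again until a stopping criterion is met.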
3

Sullivan, Colin, Mo Tiwari, and Sebastian Thrun. "MAPTree: Beating “Optimal” Decision Trees with Bayesian Decision Trees." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 8 (2024): 9019–26. http://dx.doi.org/10.1609/aaai.v38i8.28751.

Abstract:
Decision trees remain one of the most popular machine learning models today, largely due to their out-of-the-box performance and interpretability. In this work, we present a Bayesian approach to decision tree induction via maximum a posteriori inference of a posterior distribution over trees. We first demonstrate a connection between maximum a posteriori inference of decision trees and AND/OR search. Using this connection, we propose an AND/OR search algorithm, dubbed MAPTree, which is able to recover the maximum a posteriori tree. Lastly, we demonstrate the empirical performance of the maximum…
4

Guidotti, Riccardo, Anna Monreale, Mattia Setzu, and Giulia Volpi. "Generative Model for Decision Trees." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 19 (2024): 21116–24. http://dx.doi.org/10.1609/aaai.v38i19.30104.

Abstract:
Decision trees are among the most popular supervised models due to their interpretability and knowledge representation resembling human reasoning. Commonly used decision tree induction algorithms are based on greedy top-down strategies. Although these approaches are known to be an efficient heuristic, the resulting trees are only locally optimal and tend to have overly complex structures. On the other hand, optimal decision tree algorithms attempt to create an entire decision tree at once to achieve global optimality. We place our proposal between these approaches by designing a generative model…
5

Naylor, Mike. "Decision Tree." Mathematics Teacher: Learning and Teaching PK-12 113, no. 7 (2020): 612. http://dx.doi.org/10.5951/mtlt.2020.0081.

6

Breslow, Leonard A., and David W. Aha. "Simplifying decision trees: A survey." Knowledge Engineering Review 12, no. 01 (1997): 1–40. http://dx.doi.org/10.1017/s0269888997000015.

Abstract:
Induced decision trees are an extensively researched solution to classification tasks. For many practical tasks, the trees produced by tree-generation algorithms are not comprehensible to users due to their size and complexity. Although many tree induction algorithms have been shown to produce simpler, more comprehensible trees (or data structures derived from trees) with good classification accuracy, tree simplification has usually been of secondary concern relative to accuracy, and no attempt has been made to survey the literature from the perspective of simplification. We present a framework…
7

Zantema, Hans, and Hans L. Bodlaender. "Sizes of Ordered Decision Trees." International Journal of Foundations of Computer Science 13, no. 03 (2002): 445–58. http://dx.doi.org/10.1142/s0129054102001205.

Abstract:
Decision tables provide a natural framework for knowledge acquisition and representation in the area of knowledge-based information systems. Decision trees provide a standard method for inductive inference in the area of machine learning. In this paper we show how decision tables can be considered as ordered decision trees: decision trees satisfying an ordering restriction on the nodes. Every decision tree can be represented by an equivalent ordered decision tree, but we show that doing so may exponentially blow up sizes, even if the choice of the order is left free. Our main result states that…
8

Oo, Aung Nway, and Thin Naing. "Decision Tree Models for Medical Diagnosis." International Journal of Trend in Scientific Research and Development 3, no. 3 (2019): 1697–99. http://dx.doi.org/10.31142/ijtsrd23510.

9

Ostonov, Azimkhon, and Mikhail Moshkov. "Comparative Analysis of Deterministic and Nondeterministic Decision Trees for Decision Tables from Closed Classes." Entropy 26, no. 6 (2024): 519. http://dx.doi.org/10.3390/e26060519.

Abstract:
In this paper, we consider classes of decision tables with many-valued decisions closed under operations of the removal of columns, the changing of decisions, the permutation of columns, and the duplication of columns. We study relationships among three parameters of these tables: the complexity of a decision table (if we consider the depth of the decision trees, then the complexity of a decision table is the number of columns in it), the minimum complexity of a deterministic decision tree, and the minimum complexity of a nondeterministic decision tree. We consider the rough classification of…
10

Zantema, Hans, and Hans L. Bodlaender. "Finding Small Equivalent Decision Trees Is Hard." International Journal of Foundations of Computer Science 11, no. 02 (2000): 343–54. http://dx.doi.org/10.1142/s0129054100000193.

Abstract:
Two decision trees are called decision equivalent if they represent the same function, i.e., they yield the same result for every possible input. We prove that, given a decision tree and a number, deciding whether there is a decision-equivalent decision tree of size at most that number is NP-complete. As a consequence, finding a decision tree of minimal size that is decision equivalent to a given decision tree is an NP-hard problem. This result differs from the well-known result of NP-hardness of finding a decision tree of minimal size that is consistent with a given training set. Instead, our result…

Doctoral dissertations on the topic "Decision tree"

1

Yu, Peng. "Improving Decision Tree Learning." Electronic Thesis or Diss., Institut polytechnique de Paris, 2024. http://www.theses.fr/2024IPPAT037.

Abstract:
Decision tree models are valued for their efficiency and readability, particularly on structured data. This thesis tackles two major challenges: the interpretability of deep trees and the handling of categorical variables. We present the Linear TreeShap algorithm, which helps explain the decision process by assigning importance scores to each node and variable. In parallel, we propose a methodological framework for handling categorical variables directly, improving both the accuracy and the robustness of the model. Our approach…
2

Shi, Haijian. "Best-first Decision Tree Learning." The University of Waikato, 2007. http://hdl.handle.net/10289/2317.

Abstract:
In best-first top-down induction of decision trees, the best split is added in each step (e.g. the split that maximally reduces the Gini index). This is in contrast to the standard depth-first traversal of a tree. The resulting tree will be the same; only the order in which it is built differs. The objective of this project is to investigate whether it is possible to determine an appropriate tree size on practical datasets by combining best-first decision tree growth with cross-validation-based selection of the number of expansions that are performed. Pre-pruning, post-pruning, and CART pruning can be performed…
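The split criterion this abstract mentions, the reduction in Gini index, can be sketched as follows (the labels are made-up toy data; in best-first growth, this score would be computed for the best split of every frontier leaf, and the leaf with the largest reduction expanded first):

```python
# Illustrative sketch of the Gini-based split score (toy data, not from the thesis).

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def gini_reduction(parent, left, right):
    """Impurity of the parent minus the size-weighted impurity of the children."""
    n = len(parent)
    weighted = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
    return gini(parent) - weighted

parent = [0, 0, 1, 1]
print(gini(parent))                            # 0.5
print(gini_reduction(parent, [0, 0], [1, 1]))  # 0.5: a perfectly pure split
```

Depth-first and best-first induction rank candidate splits with the same score; they differ only in which leaf gets expanded next.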
3

Vella, Alan. "Hyper-heuristic decision tree induction." Thesis, Heriot-Watt University, 2012. http://hdl.handle.net/10399/2540.

Abstract:
A hyper-heuristic is any algorithm that searches or operates in the space of heuristics, as opposed to the space of solutions. Hyper-heuristics are increasingly used in function and combinatorial optimization. Rather than attempt to solve a problem using a fixed heuristic, a hyper-heuristic approach attempts to find a combination of heuristics that solve a problem (and in turn may be directly suitable for a class of problem instances). Hyper-heuristics have been little explored in data mining. This work presents novel hyper-heuristic approaches to data mining, by searching a space of attribute…
4

Vukobratović, Bogdan. "Hardware Acceleration of Nonincremental Algorithms for the Induction of Decision Trees and Decision Tree Ensembles." PhD thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2017. https://www.cris.uns.ac.rs/record.jsf?recordId=102520&source=NDLTD&language=en.

Abstract:
The thesis proposes novel full decision tree and decision tree ensemble induction algorithms, EFTI and EEFTI, and explores various possibilities for their implementation. The experiments show that the proposed EFTI algorithm is able to infer much smaller DTs on average, without significant loss in accuracy, when compared to top-down incremental DT inducers. On the other hand, when compared to other full tree induction algorithms, it was able to produce more accurate DTs, with similar sizes, in shorter times. Also, the hardware architectures for acceleration of these algorithms (EFTIP and E…
5

Qureshi, Taimur. "Contributions to decision tree based learning." Thesis, Lyon 2, 2010. http://www.theses.fr/2010LYO20051/document.

Abstract:
Advances in data collection methods, storage and processing technology are providing a unique challenge and opportunity for automated data learning techniques which aim at producing high-level information, or models, from data. A typical knowledge discovery process consists of data selection, data preparation, data transformation, data mining and interpretation/validation of the results. Thus, we develop automatic learning techniques which contribute to the data preparation, transformation and mining tasks of knowledge discovery. In doing so, we try to improve the prediction accuracy of the overall…
6

Ardeshir, G. "Decision tree simplification for classifier ensembles." Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/843022/.

Abstract:
Design of ensemble classifiers involves three factors: 1) a learning algorithm to produce a classifier (base classifier), 2) an ensemble method to generate diverse classifiers, and 3) a combining method to combine decisions made by base classifiers. With regard to the first factor, a good choice for constructing a classifier is a decision tree learning algorithm. However, a possible problem with this learning algorithm is its complexity, which has only been addressed previously in the context of pruning methods for individual trees. Furthermore, the ensemble method may require the learning algorithm…
7

Ahmad, Amir. "Data Transformation for Decision Tree Ensembles." Thesis, University of Manchester, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.508528.

8

Cai, Jingfeng. "Decision Tree Pruning Using Expert Knowledge." University of Akron / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=akron1158279616.

9

Wu, Shuning. "Optimal instance selection for improved decision tree." [Ames, Iowa : Iowa State University], 2007.

10

Sinnamon, Roslyn M. "Binary decision diagrams for fault tree analysis." Thesis, Loughborough University, 1996. https://dspace.lboro.ac.uk/2134/7424.

Abstract:
This thesis develops a new approach to fault tree analysis, namely the Binary Decision Diagram (BDD) method. Conventional qualitative fault tree analysis techniques such as the "top-down" or "bottom-up" approaches are now so well developed that further refinement is unlikely to result in vast improvements in terms of their computational capability. The BDD method has exhibited potential gains to be made in terms of speed and efficiency in determining the minimal cut sets. Further, the nature of the binary decision diagram is such that it is more suited to Boolean manipulation. The BDD method has…
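For a sense of what this thesis computes, here is an illustrative brute-force search for the minimal cut sets of a tiny hypothetical fault tree, TOP = A OR (B AND C). This is deliberately the naive enumeration; the cited work's point is that BDDs find the same sets far more efficiently:

```python
from itertools import combinations

# Toy example, not from the thesis: minimal cut sets of TOP = A OR (B AND C)
# found by testing component subsets, smallest first.

COMPONENTS = ["A", "B", "C"]

def top_event(failed):
    """Fault tree logic: the top event occurs if A fails, or both B and C fail."""
    return "A" in failed or ("B" in failed and "C" in failed)

def minimal_cut_sets():
    cuts = []
    # Enumerating smallest subsets first guarantees minimality: a subset is
    # kept only if no already-found (hence smaller) cut set lies inside it.
    for r in range(1, len(COMPONENTS) + 1):
        for subset in combinations(COMPONENTS, r):
            s = set(subset)
            if top_event(s) and not any(c < s for c in cuts):
                cuts.append(s)
    return cuts

print(minimal_cut_sets())
```

For this toy tree the minimal cut sets are {A} and {B, C}; the brute force is exponential in the number of components, which is exactly the cost the BDD representation avoids.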

Books on the topic "Decision tree"

1

Gladwin, Christina. Ethnographic decision tree modeling. Sage, 1989.

2

Gladwin, Christina H. Ethnographic decision tree modeling. Sage, 1989.

3

Friedman, Ken. The decision tree: A novel. Heart Pub., 1996.

4

Euler, Bryan L. EDDT: Emotional Disturbance Decision Tree. Psychological Assessment Resources, 2007.

5

Grąbczewski, Krzysztof. Meta-Learning in Decision Tree Induction. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-00960-5.

6

American Bankers Association. Analyzing financial statements: A decision tree approach. American Bankers Association, 2013.

7

Barros, Rodrigo C., André C. P. L. F. de Carvalho, and Alex A. Freitas. Automatic Design of Decision-Tree Induction Algorithms. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14231-9.

8

Landgrebe, David, and United States National Aeronautics and Space Administration, eds. A survey of decision tree classifier methodology. School of Electrical Engineering, Purdue University, 1990.

9

National Flood Proofing Committee (U.S.), ed. Flood proofing: How to evaluate your options: decision tree. US Army Corps of Engineers, National Flood Proofing Committee, 1995.


Book chapters on the topic "Decision tree"

1

Ayyadevara, V. Kishore. "Decision Tree." In Pro Machine Learning Algorithms. Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3564-5_4.

2

Nahler, Gerhard. "decision tree." In Dictionary of Pharmaceutical Medicine. Springer Vienna, 2009. http://dx.doi.org/10.1007/978-3-211-89836-9_366.

3

Webb, Geoffrey I., Johannes Fürnkranz, et al. "Decision Tree." In Encyclopedia of Machine Learning. Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_204.

4

Berrar, Daniel, and Werner Dubitzky. "Decision Tree." In Encyclopedia of Systems Biology. Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4419-9863-7_611.

5

Li, Hang. "Decision Tree." In Machine Learning Methods. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3917-6_5.

6

Panda, Rajendra Mohan, and B. S. Daya Sagar. "Decision Tree." In Encyclopedia of Mathematical Geosciences. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-030-85040-1_81.

7

Huang, Xiaowei, Gaojie Jin, and Wenjie Ruan. "Decision Tree." In Artificial Intelligence: Foundations, Theory, and Algorithms. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-6814-3_5.

8

Bandyopadhyay, Susmita. "Decision Tree." In Decision Support System. CRC Press, 2023. http://dx.doi.org/10.1201/9781003307655-2.

9

Larsen, Henrik Gert. "Decision Tree." In Eight Domains of Phenomenology and Research Methods. Routledge, 2023. http://dx.doi.org/10.4324/9781003270058-12.

10

Sarang, Poornachandra. "Decision Tree." In Thinking Data Science. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-02363-7_4.


Conference abstracts on the topic "Decision tree"

1

Chen, Wenkai, Hui Zhang, Chunming Yang, Bo Li, Xujian Zhao, and Yi Yang. "Fair Causal Decision Tree." In 2025 28th International Conference on Computer Supported Cooperative Work in Design (CSCWD). IEEE, 2025. https://doi.org/10.1109/cscwd64889.2025.11033273.

2

Murata, Yoriaki, Kousuke Omyo, and Hitoshi Sakano. "Subspace-Based Decision Tree Retraining." In 2025 1st International Conference on Consumer Technology (ICCT-Pacific). IEEE, 2025. https://doi.org/10.1109/icct-pacific63901.2025.11012846.

3

Lynh, Duong Huyen, and Tran Hai Nam. "Empowering Die Selection in V-Bending: Insights from Decision Tree Algorithms." In 2024 International Conference on Machining, Materials and Mechanical Technologies. Trans Tech Publications Ltd, 2025. https://doi.org/10.4028/p-epv4bq.

Abstract:
This study presents the application of tree-based algorithms to predict springback in the V-bending process of sheet metals, particularly for SUS304 material. V-bending, a critical process in metal forming, often faces challenges due to springback, which affects dimensional accuracy and product quality. Using virtual experiments conducted via ANSYS software, the study evaluates the influence of variables such as die angle, die radius, material thickness, and punch displacement on springback. Four tree-based algorithms: Decision Trees, Random Forest, Gradient Boosting Machines (GBM), and Extra Trees…
4

Agrawal, Kartikay, Ayon Borthakur, Ayush Kumar Singh, Perambuduri Srikaran, Digjoy Nandi, and Omkaradithya Pujari. "Neural Decision Tree for Bio-TinyML." In 2024 IEEE Biomedical Circuits and Systems Conference (BioCAS). IEEE, 2024. https://doi.org/10.1109/biocas61083.2024.10798396.

5

Jiang, Yongzi, Zhixiang Li, Yongjie Huang, and Zechao Xu. "Production Decision Analysis Based on Decision Tree and Bayesian Optimisation." In 2025 IEEE International Conference on Electronics, Energy Systems and Power Engineering (EESPE). IEEE, 2025. https://doi.org/10.1109/eespe63401.2025.10987118.

6

Vos, Daniël, and Sicco Verwer. "Optimal Decision Tree Policies for Markov Decision Processes." In Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}. International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/606.

Abstract:
Interpretability of reinforcement learning policies is essential for many real-world tasks, but learning such interpretable policies is a hard problem. In particular, rule-based policies such as decision trees and rule lists are difficult to optimize due to their non-differentiability. While existing techniques can learn verifiable decision tree policies, there is no guarantee that the learners generate a policy that performs optimally. In this work, we study the optimization of size-limited decision trees for Markov Decision Processes (MDPs) and propose OMDTs: Optimal MDP Decision Trees. Given…
7

Yawata, Koichiro, Yoshihiro Osakabe, Takuya Okuyama, and Akinori Asahara. "QUBO Decision Tree: Annealing Machine Extends Decision Tree Splitting." In 2022 IEEE International Conference on Knowledge Graph (ICKG). IEEE, 2022. http://dx.doi.org/10.1109/ickg55886.2022.00052.

8

Desai, Ankit, and Sanjay Chaudhary. "Distributed Decision Tree." In ACM COMPUTE '16: Ninth Annual ACM India Conference. ACM, 2016. http://dx.doi.org/10.1145/2998476.2998478.

9

Nowozin, Sebastian, Carsten Rother, Shai Bagon, Toby Sharp, Bangpeng Yao, and Pushmeet Kohli. "Decision tree fields." In 2011 IEEE International Conference on Computer Vision (ICCV). IEEE, 2011. http://dx.doi.org/10.1109/iccv.2011.6126429.

10

Gavankar, Sachin S., and Sudhirkumar D. Sawarkar. "Eager decision tree." In 2017 2nd International Conference for Convergence in Technology (I2CT). IEEE, 2017. http://dx.doi.org/10.1109/i2ct.2017.8226246.


Organizational reports on the topic "Decision tree"

1

Hamilton, Jill, and Tuan Nguyen. Asbestos Inspection/Reinspection Decision Tree. Defense Technical Information Center, 1999. http://dx.doi.org/10.21236/ada370454.

2

Narlikar, Girija J. A Parallel, Multithreaded Decision Tree Builder. Defense Technical Information Center, 1998. http://dx.doi.org/10.21236/ada363531.

3

Quiller, Ryan. Decision Tree Technique for Particle Identification. Office of Scientific and Technical Information (OSTI), 2003. http://dx.doi.org/10.2172/815649.

4

Mughal, Mohamed. Biological Weapons Response Template and Decision Tree. Defense Technical Information Center, 2001. http://dx.doi.org/10.21236/ada385897.

5

Dakin, Gordon, and Sankar Virdhagriswaran. Misleading Information Detection Through Probabilistic Decision Tree Classifiers. Defense Technical Information Center, 2002. http://dx.doi.org/10.21236/ada406823.

6

Kwon, Theresa Hyunjin, Erin Cho, and Youn-Kyung Kim. Identifying Sustainable Style Consumers with Decision Tree Predictive Model. Iowa State University, Digital Repository, 2016. http://dx.doi.org/10.31274/itaa_proceedings-180814-1366.

7

Eccleston, C. H. The decision-identification tree: A new EIS scoping tool. Office of Scientific and Technical Information (OSTI), 1997. http://dx.doi.org/10.2172/16876.

8

Mikulski, Dariusz G. Rough Set Based Splitting Criterion for Binary Decision Tree Classifiers. Defense Technical Information Center, 2006. http://dx.doi.org/10.21236/ada489077.

9

Song, So Young, Erin Cho, Youn-Kyung Kim, and Theresa Hyunjin Kwon. Clothing Communication via Social Media: A Decision Tree Predictive Model. Iowa State University, Digital Repository, 2015. http://dx.doi.org/10.31274/itaa_proceedings-180814-102.

10

Zaman, Md Mostafa, Theresa Hyunjin Kwon, Katrina Laemmerhirt, and Youn-Kyung Kim. Profiling Second-hand Clothing Shoppers with Decision Tree Predictive Model. Iowa State University, Digital Repository, 2017. http://dx.doi.org/10.31274/itaa_proceedings-180814-407.
