Academic literature on the topic 'Decision tree classifier'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Decision tree classifier.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Decision tree classifier"

1

Lu, Songfeng, and Samuel L. Braunstein. "Quantum decision tree classifier." Quantum Information Processing 13, no. 3 (2013): 757–70. http://dx.doi.org/10.1007/s11128-013-0687-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Aisyah, Siti. "Loan Status Prediction Using Decision Tree Classifier." Power Elektronik: Jurnal Orang Elektro 13, no. 1 (2024): 68–70. http://dx.doi.org/10.30591/polektro.v12i3.6591.

Full text
Abstract:
This paper investigates the effectiveness of the Decision Tree Classifier in predicting loan status, a critical task in the financial sector. The study utilizes a dataset containing various attributes of loan applicants such as income, credit score, employment status, and loan amount. The dataset is preprocessed to handle missing values and categorical variables. Feature importance is analyzed to understand the key factors influencing loan approval decisions. A Decision Tree Classifier model is trained and evaluated using performance metrics such as accuracy, precision, recall, and F1-score. T
APA, Harvard, Vancouver, ISO, and other styles
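
The workflow summarized in the abstract above can be illustrated with a minimal scikit-learn sketch; the file name, target column, and preprocessing choices below are placeholders for illustration, not details taken from the paper.

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("loan_applicants.csv")            # hypothetical file
X = df.drop(columns=["loan_status"])               # hypothetical target column
y = df["loan_status"]

categorical = X.select_dtypes(include="object").columns
numeric = X.select_dtypes(exclude="object").columns
preprocess = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
    ("num", SimpleImputer(strategy="median"), numeric),   # handle missing numeric values
])
model = Pipeline([
    ("prep", preprocess),
    ("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))   # accuracy, precision, recall, F1
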
3

Kutikuppala, Saikiran. "Decision Tree Learning Based Feature Selection and Evaluation for Image Classification." International Journal for Research in Applied Science and Engineering Technology 11, no. 6 (2023): 2668–74. http://dx.doi.org/10.22214/ijraset.2023.54035.

Full text
Abstract:
The problem statement focuses on feature evaluation and selection for image classification using decision tree learning. The objective is to identify the most significant features in an image dataset and train a decision tree classifier using these selected features. The accuracy of an image classifier heavily relies on the quality and relevance of the features used to represent the images. Hence, it is crucial to identify the most important features and eliminate the irrelevant ones to enhance the classifier's accuracy. To implement this approach, we can utilize scikit-learn, a popu
APA, Harvard, Vancouver, ISO, and other styles
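
As a rough illustration of the feature-selection idea described above (not the paper's exact pipeline), a decision tree's impurity-based importances can be used to keep only the most informative pixels of a small built-in image dataset:

from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectFromModel
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)                # stand-in image dataset (8x8 digits)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Rank features by the tree's feature_importances_ and keep those above the median.
selector = SelectFromModel(DecisionTreeClassifier(random_state=0), threshold="median")
X_tr_sel = selector.fit_transform(X_tr, y_tr)
X_te_sel = selector.transform(X_te)

clf = DecisionTreeClassifier(random_state=0).fit(X_tr_sel, y_tr)
print("features kept:", X_tr_sel.shape[1], "of", X_tr.shape[1])
print("accuracy:", accuracy_score(y_te, clf.predict(X_te_sel)))
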
4

Kotsiantis, Sotiris. "A hybrid decision tree classifier." Journal of Intelligent & Fuzzy Systems 26, no. 1 (2014): 327–36. http://dx.doi.org/10.3233/ifs-120741.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bavani, B., S. Nirmala Sugirtha Rajini, M. S. Josephine, and V. Prasannakumari. "Heart Disease Prediction System based on Decision Tree Classifier." Journal of Advanced Research in Dynamical and Control Systems 11, no. 10-SPECIAL ISSUE (2019): 1232–37. http://dx.doi.org/10.5373/jardcs/v11sp10/20192968.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Maazouzi, Faiz, and Halima Bahi. "Using multi decision tree technique to improving decision tree classifier." International Journal of Business Intelligence and Data Mining 7, no. 4 (2012): 274. http://dx.doi.org/10.1504/ijbidm.2012.051712.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Bagui, Sikha S., Dustin Mink, Subhash C. Bagui, et al. "Introducing the UWF-ZeekDataFall22 Dataset to Classify Attack Tactics from Zeek Conn Logs Using Spark’s Machine Learning in a Big Data Framework." Electronics 12, no. 24 (2023): 5039. http://dx.doi.org/10.3390/electronics12245039.

Full text
Abstract:
This study introduces UWF-ZeekDataFall22, a newly created dataset labeled using the MITRE ATT&CK framework. Although the focus of this research is on classifying the never-before classified resource development tactic, the reconnaissance and discovery tactics were also classified. The results were also compared to a similarly created dataset, UWF-ZeekData22, created in 2022. Both of these datasets, UWF-ZeekDataFall22 and UWF-ZeekData22, created using Zeek Conn logs, were stored in a Big Data Framework, Hadoop. For machine learning classification, Apache Spark was used in the Big Data Frame
APA, Harvard, Vancouver, ISO, and other styles
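
For readers unfamiliar with Spark's MLlib, a heavily simplified sketch of decision-tree classification in that framework is given below; the input path, feature columns, and label column are placeholders and do not reflect the actual UWF-ZeekDataFall22 schema.

from pyspark.ml import Pipeline
from pyspark.ml.classification import DecisionTreeClassifier
from pyspark.ml.feature import StringIndexer, VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("zeek-tactic-classification").getOrCreate()
df = spark.read.parquet("hdfs:///data/zeek_conn_logs.parquet")        # placeholder path

indexer = StringIndexer(inputCol="tactic", outputCol="label")         # placeholder label column
assembler = VectorAssembler(inputCols=["duration", "orig_bytes", "resp_bytes"],  # placeholder features
                            outputCol="features")
tree = DecisionTreeClassifier(labelCol="label", featuresCol="features", maxDepth=5)

train, test = df.randomSplit([0.8, 0.2], seed=42)
model = Pipeline(stages=[indexer, assembler, tree]).fit(train)
model.transform(test).select("tactic", "prediction").show(5)
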
8

Tri, Okta Priasni, and Oswari Teddy. "Comparative study of standalone classifier and ensemble classifier." TELKOMNIKA (Telecommunication, Computing, Electronics and Control) 19, no. 5 (2021): 1747–54. https://doi.org/10.12928/telkomnika.v19i5.19508.

Full text
Abstract:
Ensemble learning is a machine learning approach that can address performance problems. Standalone classifiers often show poor performance, which is why combining them with ensemble methods can improve their performance scores. Ensemble learning comprises several methods; in this study, three ensemble methods are compared with the standalone classifiers support vector machine, Naïve Bayes, and decision tree. Bagging, AdaBoost, and voting are the ensemble methods that are applied and then compared to the standalone classifiers. From a dataset of 1670 Twitter mentions about tou
APA, Harvard, Vancouver, ISO, and other styles
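
A compact sketch of the kind of comparison described above, using generic synthetic data in place of the Twitter dataset, might look as follows:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1670, n_features=20, random_state=0)  # stand-in data

standalone = {
    "svm": SVC(probability=True, random_state=0),
    "naive_bayes": GaussianNB(),
    "decision_tree": DecisionTreeClassifier(random_state=0),
}
ensembles = {
    "bagging": BaggingClassifier(DecisionTreeClassifier(), random_state=0),
    "adaboost": AdaBoostClassifier(random_state=0),
    "voting": VotingClassifier(list(standalone.items()), voting="soft"),
}

for name, clf in {**standalone, **ensembles}.items():
    print(f"{name:13s} {cross_val_score(clf, X, y, cv=5).mean():.3f}")
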
9

Windeatt, T., and G. Ardeshir. "Decision Tree Simplification for Classifier Ensembles." International Journal of Pattern Recognition and Artificial Intelligence 18, no. 05 (2004): 749–76. http://dx.doi.org/10.1142/s021800140400340x.

Full text
Abstract:
The goal of designing an ensemble of simple classifiers is to improve the accuracy of a recognition system. However, the performance of ensemble methods is problem-dependent and the classifier learning algorithm has an important influence on ensemble performance. In particular, base classifiers that are too complex may result in overfitting. In this paper, the performance of Bagging, Boosting and Error-Correcting Output Code (ECOC) is compared for five decision tree pruning methods. A description is given for each of the pruning methods and the ensemble techniques. AdaBoost.OC which is a combi
APA, Harvard, Vancouver, ISO, and other styles
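
scikit-learn does not implement the five pruning methods compared in this article, but the general idea of simplifying base trees inside Bagging and Boosting ensembles can be sketched with cost-complexity pruning (ccp_alpha); the dataset and parameter values below are illustrative only.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

for ccp_alpha in (0.0, 0.01):                        # 0.0 = unpruned base trees
    base = DecisionTreeClassifier(ccp_alpha=ccp_alpha, random_state=0)
    for name, ensemble in (
        ("bagging", BaggingClassifier(base, n_estimators=50, random_state=0)),
        ("boosting", AdaBoostClassifier(base, n_estimators=50, random_state=0)),
    ):
        score = cross_val_score(ensemble, X, y, cv=5).mean()
        print(f"{name:8s} ccp_alpha={ccp_alpha}: {score:.3f}")
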
10

Kumar, Dharmender, and N. A. Priyanka. "Decision tree classifier: a detailed survey." International Journal of Information and Decision Sciences 12, no. 3 (2020): 246. http://dx.doi.org/10.1504/ijids.2020.10029122.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Decision tree classifier"

1

Ardeshir, G. "Decision tree simplification for classifier ensembles." Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/843022/.

Full text
Abstract:
Design of ensemble classifiers involves three factors: 1) a learning algorithm to produce a classifier (base classifier), 2) an ensemble method to generate diverse classifiers, and 3) a combining method to combine decisions made by base classifiers. With regard to the first factor, a good choice for constructing a classifier is a decision tree learning algorithm. However, a possible problem with this learning algorithm is its complexity which has only been addressed previously in the context of pruning methods for individual trees. Furthermore, the ensemble method may require the learning algo
APA, Harvard, Vancouver, ISO, and other styles
2

Rosten, David Paul 1967. "Automatic design of a decision tree classifier employing neural networks." Thesis, The University of Arizona, 1991. http://hdl.handle.net/10150/277881.

Full text
Abstract:
Pattern recognition problems involve two main issues: feature formulation and classifier design. This thesis is concerned with the latter. Numerous algorithms for the design of pattern recognition systems have been published, and the algorithm detailed herein is a new approach--specific to the design of decision tree classifiers. It involves a top-down strategy, optimizing the root node decision and then subsequently its children. To assess various pattern space partitions, the Tie statistical distance measure quantified the separability of potential cluster groupings. Additionally, a separate
APA, Harvard, Vancouver, ISO, and other styles
3

Thomas, Clifford S. "From 'tree' based Bayesian networks to mutual information classifiers : deriving a singly connected network classifier using an information theory based technique." Thesis, University of Stirling, 2005. http://hdl.handle.net/1893/2623.

Full text
Abstract:
For reasoning under uncertainty the Bayesian network has become the representation of choice. However, except where models are considered 'simple' the task of construction and inference are provably NP-hard. For modelling larger 'real' world problems this computational complexity has been addressed by methods that approximate the model. The Naive Bayes classifier, which has strong assumptions of independence among features, is a common approach, whilst the class of trees is another less extreme example. In this thesis we propose the use of an information theory based technique as a mechanism f
APA, Harvard, Vancouver, ISO, and other styles
4

Steffenino, Sara. "Urban land cover mapping using medium spatial resolution satellite imageries: effectiveness of Decision Tree Classifier." Doctoral thesis, Politecnico di Torino, 2013. http://hdl.handle.net/11583/2507348.

Full text
Abstract:
The study is set in the framework of information extraction from satellite imagery to support rapid mapping activities, where information needs to be extracted quickly and where eliminating manual digitization procedures, even partially, can be considered a great breakthrough. The main aim of this study was therefore to develop algorithms for extracting the urban layer by processing medium-spatial-resolution Landsat data; the Decision Tree classifier was investigated as the classification technique, since it allows the extraction of rules that can later be applied to different
APA, Harvard, Vancouver, ISO, and other styles
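
The rule-extraction property mentioned in the abstract can be demonstrated in a few lines; the band names and the toy data below are placeholders rather than the thesis's actual Landsat processing chain.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.random((500, 4))                             # stand-in for four spectral bands
y = (X[:, 1] - X[:, 0] < 0.1).astype(int)            # toy "urban vs. non-urban" label

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["red", "nir", "swir1", "swir2"]))  # human-readable rules
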
5

Arellano-Neri, Olimpia. "An Improved Methodology for Land-Cover Classification Using Artificial Neural Networks and a Decision Tree Classifier." University of Cincinnati / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1085589862.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Boujari, Tahereh. "Instance-based ontology alignment using decision trees." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-84918.

Full text
Abstract:
Using ontologies is a key technology in the semantic web. The semantic web helps people store their data on the web, build vocabularies, and write rules for handling these data, and it also helps search engines distinguish more easily the information users want to access on the web. In order to use multiple ontologies created by different experts, we need matchers that find the similar concepts in them so that these ontologies can be merged. Text-based searches use string similarity functions to find the equivalent concepts inside ontologies based on their names. This is the method that is
APA, Harvard, Vancouver, ISO, and other styles
7

Zhong, Mingyu. "An Analysis of Misclassification Rates for Decision Trees." Doctoral diss., University of Central Florida, 2007. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2496.

Full text
Abstract:
The decision tree is a well-known methodology for classification and regression. In this dissertation, we focus on the minimization of the misclassification rate for decision tree classifiers. We derive the necessary equations that provide the optimal tree prediction, the estimated risk of the tree's prediction, and the reliability of the tree's risk estimation. We carry out an extensive analysis of the application of Lidstone's law of succession for the estimation of the class probabilities. In contrast to existing research, we not only compute the expected values of the risks but also calcul
APA, Harvard, Vancouver, ISO, and other styles
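
Lidstone's law of succession, central to the class-probability estimates discussed in this dissertation, smooths a leaf's class frequencies as p_i = (n_i + lam) / (N + lam * C); a toy illustration (not the dissertation's code):

def lidstone_probabilities(class_counts, lam=1.0):
    """Smoothed class probabilities for one leaf: (n_i + lam) / (N + lam * C)."""
    total = sum(class_counts)
    num_classes = len(class_counts)
    return [(n + lam) / (total + lam * num_classes) for n in class_counts]

# A leaf with 8 samples of class A and none of class B never predicts a hard 0 or 1:
print(lidstone_probabilities([8, 0], lam=1.0))       # [0.9, 0.1]
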
8

Jia, Xiuping. "Classification techniques for hyperspectral remote sensing image data." Thesis, University of New South Wales - Australian Defence Force Academy, School of Electrical Engineering, 1996. http://handle.unsw.edu.au/1959.4/38713.

Full text
Abstract:
Hyperspectral remote sensing image data, such as that recorded by AVIRIS with 224 spectral bands, provides rich information on ground cover types. However, it presents new problems in machine assisted interpretation, mainly in long processing times and the difficulties of class training due to the low ratio of number of training samples to the number of bands. This thesis investigates feasible and efficient feature reduction and image classification techniques which are appropriate for hyperspectral image data. The study is reported in three parts. The first concerns a deterministic approach
APA, Harvard, Vancouver, ISO, and other styles
9

Benda, Ondřej. "Návrh rozhodovacích stromů na základě evolučních algoritmů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2012. http://www.nusl.cz/ntk/nusl-219457.

Full text
Abstract:
This master's thesis deals with two algorithms for mining data streams - Very Fast Decision Tree (VFDT) and Concept-adapting Very Fast Decision Tree (CVFDT). The principle of classification with a decision tree is explained, and the basic idea behind the construction of the Hoeffding Tree, which is the foundation of the VFDT and CVFDT algorithms, is described. These algorithms are then analyzed in more detail. The thesis further deals with the design of a Genetic Programming (GP) algorithm, which is used to build a classifier of image data. The resulting classifier is used as an alternative way of classifying objects in an image within the fra
APA, Harvard, Vancouver, ISO, and other styles
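
The Hoeffding bound that gives the Hoeffding Tree (and hence VFDT/CVFDT) its name states that, with probability 1 - delta, the observed mean of a variable with range R lies within epsilon = sqrt(R^2 ln(1/delta) / (2n)) of the true mean after n observations; a small sketch under these standard definitions:

import math

def hoeffding_bound(value_range, delta, n):
    """Epsilon such that the observed mean is within epsilon of the true mean w.p. 1 - delta."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

# VFDT splits a node once the gap between the two best attributes' information
# gains exceeds epsilon (R = log2(number of classes) for information gain).
print(hoeffding_bound(value_range=1.0, delta=1e-7, n=5000))
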
10

Федоров, Д. П. "Comparison of classifiers based on the decision tree." Thesis, ХНУРЕ, 2021. https://openarchive.nure.ua/handle/document/16430.

Full text
Abstract:
The main purpose of this work is to compare classifiers. Random Forest and XGBoost are two popular machine learning algorithms. In this paper, we look at how they work, compare their features, and evaluate the accuracy of their results.
APA, Harvard, Vancouver, ISO, and other styles
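
A minimal version of the comparison described in this thesis, assuming the xgboost package is installed alongside scikit-learn and using a synthetic dataset in place of the author's data:

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

classifiers = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "xgboost": XGBClassifier(n_estimators=200, eval_metric="logloss", random_state=0),
}
for name, clf in classifiers.items():
    print(f"{name:13s} {cross_val_score(clf, X, y, cv=5).mean():.3f}")
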
More sources

Books on the topic "Decision tree classifier"

1

Landgrebe, David, and United States National Aeronautics and Space Administration, eds. A survey of decision tree classifier methodology. School of Electrical Engineering, Purdue University, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bondarenko, Natal'ya. Pattern recognition. The initial course of theory. INFRA-M Academic Publishing LLC., 2024. http://dx.doi.org/10.12737/2111834.

Full text
Abstract:
This tutorial discusses the tasks of pattern recognition, discriminant analysis, taxonomy, comparison with a reference, classification of features, and selection of a feature space. The main groups of features calculated from images and used for their recognition have been studied. The methods of classification based on comparison with a reference, the Bayesian classifier, and decision trees are highlighted. Meets the requirements of the federal state educational standards of higher education of the latest generation. For students studying in the field of information technology, applied mathem
APA, Harvard, Vancouver, ISO, and other styles
3

Mooney, Raymond J. Machine Learning. Edited by Ruslan Mitkov. Oxford University Press, 2012. http://dx.doi.org/10.1093/oxfordhb/9780199276349.013.0020.

Full text
Abstract:
This article introduces the type of symbolic machine learning in which decision trees, rules, or case-based classifiers are induced from supervised training examples. It describes the representation of knowledge assumed by each of these approaches and reviews basic algorithms for inducing such representations from annotated training examples and using the acquired knowledge to classify future instances. Machine learning is the study of computational systems that improve performance on some task with experience. Most machine learning methods concern the task of categorizing examples described b
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Decision tree classifier"

1

Barbareschi, Mario, Salvatore Del Prete, Francesco Gargiulo, Antonino Mazzeo, and Carlo Sansone. "Decision Tree-Based Multiple Classifier Systems: An FPGA Perspective." In Multiple Classifier Systems. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20248-8_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Morin, A. M. "Decision Tree Classifier for Speech Recognition." In Compstat. Physica-Verlag HD, 1988. http://dx.doi.org/10.1007/978-3-642-46900-8_34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Monemi, Alireza, Roozbeh Zarei, Muhammad Nadzir Marsono, and Mohamed Khalil-Hani. "Parameterizable Decision Tree Classifier on NetFPGA." In Advances in Intelligent Systems and Computing. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-32063-7_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Abdel Hady, Mohamed Farouk, and Friedhelm Schwenker. "Decision Templates Based RBF Network for Tree-Structured Multiple Classifier Fusion." In Multiple Classifier Systems. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-02326-2_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Alam, Fahim Irfan, Fateha Khanam Bappee, Md Reza Rabbani, and Md Mohaiminul Islam. "An Optimized Formulation of Decision Tree Classifier." In Communications in Computer and Information Science. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-36321-4_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bujnowski, Paweł, Eulalia Szmidt, and Janusz Kacprzyk. "Intuitionistic Fuzzy Decision Tree: A New Classifier." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-11313-5_68.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Barbareschi, Mario, Cristina Papa, and Carlo Sansone. "Approximate Decision Tree-Based Multiple Classifier Systems." In Springer Proceedings in Mathematics & Statistics. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-67308-0_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Salem, Abdel-Badeeh M., and Abeer M. Mahmoud. "A Hybrid Genetic Algorithm — Decision Tree Classifier." In Intelligent Information Processing and Web Mining. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-36562-4_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Li, Chunyan, Wei Lin, and Yongtian Yang. "Packet Filtering Using a Decision Tree Classifier." In Intelligent Data Engineering and Automated Learning. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-45080-1_109.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Georgiev, George, Iren Valova, and Natacha Gueorguieva. "Binary Tree Classifier Based on Kolmogorov-Smirnov Test." In Advances in Intelligent Decision Technologies. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-14616-9_55.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Decision tree classifier"

1

Kumar, Shaina Suresh, Rahul Dev, Brajesh Kumar Singh, Nishi Agarwal, and Sandeep Kumar. "FPGA-based Automatic Pill Dispenser using Decision Tree Classifier." In 2024 International Conference on Electrical Electronics and Computing Technologies (ICEECT). IEEE, 2024. http://dx.doi.org/10.1109/iceect61758.2024.10739010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Pandey, Aparna, and Arvind Kumar Tiwari. "Smart Security: Unmasking Face Spoofers with Advanced Decision Tree Classifier." In 2024 15th International Conference on Computing Communication and Networking Technologies (ICCCNT). IEEE, 2024. http://dx.doi.org/10.1109/icccnt61001.2024.10726069.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Barbareschi, Mario, Salvatore Barone, Antonio Emmanuele, and Nicola Mazzocca. "Exploiting Functional Approximation on Decision-Tree Based Multiple Classifier Systems." In 2024 IFIP/IEEE 32nd International Conference on Very Large Scale Integration (VLSI-SoC). IEEE, 2024. https://doi.org/10.1109/vlsi-soc62099.2024.10767841.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Prakash, Goutam Agrawal, Malaya Dutta Borah, Vishwajeet Kumar, Purani Dharmesh Kamleshbhai, and Pawan Kumar. "A Comparative Analysis of Decision Tree Classifier Performance in the Medical Data Analysis." In 2024 First International Conference on Electronics, Communication and Signal Processing (ICECSP). IEEE, 2024. http://dx.doi.org/10.1109/icecsp61809.2024.10698424.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

A, Parveen Akhther, Gopavaram Sreelekha, and Prasanth V. S. "AI Based Decision Tree Classifier for Detecting Anomalies in IoT Wireless Sensor Networks." In 2025 6th International Conference on Mobile Computing and Sustainable Informatics (ICMCSI). IEEE, 2025. https://doi.org/10.1109/icmcsi64620.2025.10883363.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Weiguo, Yi, Lu Mingyu, and Duan Jing. "VPRS based decision tree classifier." In 2011 International Conference of Soft Computing and Pattern Recognition (SoCPaR). IEEE, 2011. http://dx.doi.org/10.1109/socpar.2011.6089093.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Yu, Yao, Fu Zhong-liang, Zhao Xiang-hui, and Cheng Wen-fang. "Combining Classifier Based on Decision Tree." In 2009 WASE International Conference on Information Engineering (ICIE). IEEE, 2009. http://dx.doi.org/10.1109/icie.2009.12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Hassan, Muhammad, and Amine Bermak. "Gas classification using binary decision tree classifier." In 2014 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE, 2014. http://dx.doi.org/10.1109/iscas.2014.6865700.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Levashenko, Vitaly, Elena Zaitseva, and Seppo Puuronen. "Fuzzy Classifier Based on Fuzzy Decision Tree." In EUROCON 2007 - The International Conference on "Computer as a Tool". IEEE, 2007. http://dx.doi.org/10.1109/eurcon.2007.4400614.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kadappa, Vijayakumar, Shankru Guggari, and Atul Negi. "Decision Tree classifier using theme based partitioning." In 2015 International Conference on Computing and Network Communications (CoCoNet). IEEE, 2015. http://dx.doi.org/10.1109/coconet.2015.7411240.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Decision tree classifier"

1

Liu, Hongrui, and Rahul Ramachandra Shetty. Analytical Models for Traffic Congestion and Accident Analysis. Mineta Transportation Institute, 2021. http://dx.doi.org/10.31979/mti.2021.2102.

Full text
Abstract:
In the US, over 38,000 people die in road crashes each year, and 2.35 million are injured or disabled, according to the statistics report from the Association for Safe International Road Travel (ASIRT) in 2020. In addition, traffic congestion keeping Americans stuck on the road wastes millions of hours and billions of dollars each year. Using statistical techniques and machine learning algorithms, this research developed accurate predictive models for traffic congestion and road accidents to increase understanding of the complex causes of these challenging issues. The research used US Accident
APA, Harvard, Vancouver, ISO, and other styles
2

Dakin, Gordon, and Sankar Virdhagriswaran. Misleading Information Detection Through Probabilistic Decision Tree Classifiers. Defense Technical Information Center, 2002. http://dx.doi.org/10.21236/ada406823.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Mikulski, Dariusz G. Rough Set Based Splitting Criterion for Binary Decision Tree Classifiers. Defense Technical Information Center, 2006. http://dx.doi.org/10.21236/ada489077.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sanders, Suzanne, and Jessica Kirschbaum. Forest health monitoring at Mississippi National River and Recreation Area: 2022 field season. National Park Service, 2023. http://dx.doi.org/10.36967/2301407.

Full text
Abstract:
The Mississippi National River and Recreation Area (MISS), situated along a 116 km stretch of the Mississippi River through the Minneapolis and St. Paul urban corridor, encompasses ~21,800 ha of public and private land. In 2022, the Great Lakes Inventory and Monitoring Network (GLKN) resampled permanent forest monitoring sites in the park, marking the second assessment of these sites, which were established and initially sampled in 2011. The goal of this long-term monitoring project is to provide managers with routine updates on which to base management decisions; these data can also be used
APA, Harvard, Vancouver, ISO, and other styles
5

Ley, Matt, Tom Baldvins, David Jones, Hanna Pilkington, and Kelly Anderson. Vegetation classification and mapping: Gulf Islands National Seashore. National Park Service, 2023. http://dx.doi.org/10.36967/2299028.

Full text
Abstract:
The Gulf Islands National Seashore (GUIS) vegetation inventory project classified and mapped vegetation on park-owned lands within the administrative boundary and estimated thematic map accuracy quantitatively. The project began in June 2016. National Park Service (NPS) Vegetation Mapping Inventory Program provided technical guidance. The overall process included initial planning and scoping, imagery procurement, field data collection, data analysis, imagery interpretation/classification, accuracy assessment (AA), and report writing and database development. Initial planning and scoping meetin
APA, Harvard, Vancouver, ISO, and other styles
6

Ley, Matt, Tom Baldvins, Hannah Pilkington, David Jones, and Kelly Anderson. Vegetation classification and mapping project: Big Thicket National Preserve. National Park Service, 2024. http://dx.doi.org/10.36967/2299254.

Full text
Abstract:
The Big Thicket National Preserve (BITH) vegetation inventory project classified and mapped vegetation within the administrative boundary and estimated thematic map accuracy quantitatively. National Park Service (NPS) Vegetation Mapping Inventory Program provided technical guidance. The overall process included initial planning and scoping, imagery procurement, vegetation classification field data collection, data analysis, imagery interpretation/classification, accuracy assessment (AA), and report writing and database development. Initial planning and scoping meetings took place during May, 2
APA, Harvard, Vancouver, ISO, and other styles