Dissertations / Theses on the topic 'Tree approach'

Consult the top 50 dissertations / theses for your research on the topic 'Tree approach.'

1

Simeone, Daniel. "Network connectivity: a tree decomposition approach." Thesis, McGill University, 2008. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=18797.

Abstract:
We show that the gap between the cost of the least costly 3-edge-connected metric graph and that of the least costly 3-vertex-connected metric graph is at most 3. The approach relies upon tree decompositions and a degree-limiting theorem of Bienstock et al. We also explore the tree decomposition approach for general k-edge- and k-vertex-connected graphs, and present much of the required background theory.
APA, Harvard, Vancouver, ISO, and other styles
2

Hossain, Mohammad Forhad. "Spanning Tree Approach On The Snow Cleaning Problem." Thesis, Högskolan Dalarna, Datateknik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:du-4847.

Abstract:
Snow cleaning is one of the most important winter tasks in Sweden, and every year the government spends a large amount of money on it. In this thesis we generate a shortest road network of the city using a minimum spanning tree algorithm, and place depots at different points in the city using a greedy heuristic. When snow falls, vehicles start from the depots and clear the entire road network of the city. We develop two models: an economic model and an efficient model. The economic model provides a good economical solution and uses fewer vehicles; the efficient model provides a good efficient solution and takes less time to clean the entire road network.
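The two-step pipeline this abstract describes (a minimum spanning tree over the road network, then greedy depot placement) can be sketched roughly as follows. This is a minimal illustration, not the thesis's implementation: the union-find construction is standard Kruskal, and the worst-case-distance coverage criterion for depots is an assumption on our part.

```python
def kruskal_mst(nodes, edges):
    """Kruskal's algorithm. edges: (weight, u, v) tuples; returns MST edges."""
    parent = {n: n for n in nodes}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):          # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components
            parent[ru] = rv
            mst.append((w, u, v))
    return mst


def greedy_depots(nodes, dist, k):
    """Greedily pick k depots, each time choosing the node that most
    reduces the worst-case distance from any node to its nearest depot."""
    depots = []
    for _ in range(k):
        best = min(
            (n for n in nodes if n not in depots),
            key=lambda c: max(
                min(dist[(m, d)] for d in depots + [c]) for m in nodes
            ),
        )
        depots.append(best)
    return depots
```

On a toy road graph this keeps the cheapest cycle-free edges and places the first depot at the most central node; `dist` here is a precomputed node-to-node distance table.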
3

Zhao, Boning. "A THEORETIC APPROACH FOR BINARY GAME TREE EVALUATION." Case Western Reserve University School of Graduate Studies / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=case1586520140611046.

4

Gramsci, Shantanu Khan. "A scalable video streaming approach using distributed b-tree." Thesis, University of British Columbia, 2011. http://hdl.handle.net/2429/33848.

Abstract:
Streaming video comprises most of today's Internet traffic, and it is predicted to increase further. Millions of users now watch video over the Internet, and video sharing sites receive more than a billion hits per day. Serving this massive user base has always been challenging. Over time a number of approaches have been proposed, mainly in two categories: client-server and peer-to-peer streaming. Despite the potential scalability benefits of peer-to-peer systems, most popular video sharing sites today use the client-server model, leveraging the caching benefits of Content Delivery Networks. In this scenario, video files are replicated among a group of edge servers, and clients' requests are directed to an edge server instead of being served by the original video source server. The main bottleneck of this approach is that each server has a capacity limit beyond which it cannot serve properly. Instead of the traditional file-based streaming approach, in this thesis we propose to use a distributed data structure as the underlying storage for streaming video. We developed a distributed B-tree, stored video files in the B-tree running over a cluster of computers, and served them from there. We show that system throughput increases almost linearly as more computers are added to the system.
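The storage idea in this abstract can be shown in miniature: video chunks are keyed by (video_id, chunk_index) in an ordered index, so sequential playback becomes a range scan over adjacent keys. The thesis distributes a real B-tree over a cluster; the single-process `bisect` index below is only a hedged sketch of that key layout, with all names invented for illustration.

```python
import bisect

class ChunkIndex:
    """Ordered chunk store: playback of one video is a contiguous key range."""

    def __init__(self):
        self.keys = []    # sorted list of (video_id, chunk_index)
        self.blobs = {}   # key -> chunk bytes

    def put(self, video_id, chunk_index, data):
        key = (video_id, chunk_index)
        if key not in self.blobs:
            bisect.insort(self.keys, key)   # keep keys sorted on insert
        self.blobs[key] = data

    def stream(self, video_id, start=0):
        """Yield the chunks of one video in order, as a range scan."""
        i = bisect.bisect_left(self.keys, (video_id, start))
        while i < len(self.keys) and self.keys[i][0] == video_id:
            yield self.blobs[self.keys[i]]
            i += 1
```

Because keys sort first by video and then by chunk number, scaling out in the real system amounts to splitting this key range across machines, which is where the near-linear throughput growth comes from.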
5

Nigh, Gordon Donald. "A process oriented approach to modelling forest tree mortality /." Toronto, 1994. http://opac.nebis.ch/cgi-bin/showAbstract.pl?u20=0315927828.

6

Tomek, Michal. "A stochastic tree approach to pricing multidimensional American options." Thesis, Imperial College London, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.429210.

7

Liu, Lu. "Pricing energy path-dependent option using tree based approach." Thesis, Imperial College London, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.512006.

8

SOBRAL, ANA PAULA BARBOSA. "HOURLY LOAD FORECASTING A NEW APPROACH THROUGH DECISION TREE." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2003. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=3710@1.

Abstract:
The importance of load forecasting for the short term (up to one week ahead) has been growing steadily in recent years. Load forecasts are the basis for forecasting energy prices, and privatisation and the introduction of competition in the Brazilian electricity sector have made price forecasting an extremely important task. As a consequence of these structural changes, the variability and non-stationarity of electrical loads have tended to increase because of the dynamics of energy prices, and new forecasting methods are needed for the new scenarios. The tools available for load forecasting in the international market require a large amount of online information, especially weather data. Since this information is not yet readily available in Brazil, this thesis proposes a short-term load forecaster that takes into account restrictions on the acquisition of temperature data. A short-term (one-day-ahead) forecaster of hourly loads is proposed that combines load data and weather data (temperature) by means of decision tree models. Decision trees were chosen because, besides being easy to interpret, they have rarely been used for load forecasting.
9

Alnatsheh, Rami H. "Frequent Itemset Hiding Algorithm Using Frequent Pattern Tree Approach." NSUWorks, 2012. http://nsuworks.nova.edu/gscis_etd/76.

Abstract:
A problem that has been the focus of much recent research in privacy-preserving data mining is the frequent itemset hiding (FIH) problem. Identifying itemsets that appear together frequently in customer transactions is a common task in association rule mining. Organizations that share data with business partners may consider some frequent itemsets sensitive and aim to hide them by removing items from certain transactions. Since such modifications adversely affect the utility of the database for data mining applications, the goal is to remove as few items as possible. Because the frequent itemset hiding problem is NP-hard and practical instances are too large to be solved optimally, heuristic methods that provide good solutions are needed. This dissertation developed a new method called Min_Items_Removed, using the Frequent Pattern Tree (FP-Tree), that outperforms extant methods for the FIH problem. The FP-Tree compresses large databases into significantly smaller data structures, so a search may be performed with increased speed and efficiency. To evaluate the effectiveness and performance of the Min_Items_Removed algorithm, eight experiments were conducted. The results showed that the Min_Items_Removed algorithm yields better-quality solutions than extant methods in terms of minimizing the number of removed items. In addition, the results showed that the newly introduced metric (normalized number of leaves) is a very good indicator of the size or difficulty of a problem instance, independent of the number of sensitive itemsets.
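The FP-Tree compression this abstract relies on can be sketched as follows: items in each transaction are reordered by descending global frequency so that transactions share prefix paths, and each path node carries a count. This is only the standard FP-Tree construction; the Min_Items_Removed hiding heuristic itself is not reproduced here, and the transaction data in any example is invented.

```python
from collections import Counter

class FPNode:
    def __init__(self, item, parent):
        self.item, self.parent = item, parent
        self.count = 0
        self.children = {}

def build_fp_tree(transactions, min_support):
    # 1. Count item frequencies and keep only frequent items.
    freq = Counter(item for t in transactions for item in t)
    frequent = {i for i, c in freq.items() if c >= min_support}
    root = FPNode(None, None)
    for t in transactions:
        # 2. Order each transaction by descending global frequency
        #    (ties broken alphabetically) so prefixes are shared.
        items = sorted((i for i in t if i in frequent),
                       key=lambda i: (-freq[i], i))
        node = root
        # 3. Insert along a shared prefix path, incrementing counts.
        for item in items:
            node = node.children.setdefault(item, FPNode(item, node))
            node.count += 1
    return root
```

The compression is visible in the node counts: many transactions traverse the same prefix node, so the tree is far smaller than the raw database when transactions overlap.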
10

Villori, Narasiman C. "Distributed degree-constrained application-level multicast tree a partitioning approach /." Cincinnati, Ohio : University of Cincinnati, 2008. http://rave.ohiolink.edu/etdc/view.cgi?acc_num=ucin1205964934.

Abstract:
Thesis (M.S.)--University of Cincinnati, 2008.
Advisor: Fred Annexstein. Title from electronic thesis title page (viewed Oct. 23, 2008). Includes abstract. Keywords: degree-constrained; multicast; spanning tree; partitioning. Includes bibliographical references.
11

Ramineni, Narahari. "Tree Restructuring Approach to Mapping Problem in Cellular Architecture FPGAS." PDXScholar, 1995. https://pdxscholar.library.pdx.edu/open_access_etds/4914.

Abstract:
This thesis presents a new technique for mapping combinational circuits to fine-grain cellular-architecture FPGAs. We represent the netlist as a binary tree with decision variables associated with each node of the tree; the functionality of the tree nodes is chosen based on the target FPGA architecture. The proposed tree restructuring algorithms preserve local connectivity and allow direct mapping of the trees to the cellular array, eliminating the traditional routing phase. Predictability of signal delays is another important advantage of the developed approach. The developed bus-assignment algorithm efficiently utilizes the medium-distance routing resources (buses). The method is general and can be used for any fine-grain CA-type FPGA. To demonstrate our techniques, the ATMEL 6000 series FPGA was used as the target architecture. An area and delay comparison between our methods and commercial tools is presented using a set of MCNC benchmarks, and final layouts of the implemented designs are included. Results show that the proposed techniques outperform the available commercial tools for ATMEL 6000 FPGAs in both area and delay optimization.
12

Villora, Narasiman C. "Distributed degree-constrained application-level multicast tree: A partitioning approach." University of Cincinnati / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1205964934.

13

Battaglia, Michael J. "A multi-methods approach to determining appropriate locations for tree planting in two of Baltimore's tree-poor neighborhoods." Ohio : Ohio University, 2010. http://www.ohiolink.edu/etd/view.cgi?ohiou1275679254.

14

Wang, Jingjing. "A PARALLEL APPROACH TO MULTIPLE SEQUENCES ALIGNMENT AND PHYLOGENETIC TREE LABELING." OpenSIUC, 2010. https://opensiuc.lib.siu.edu/theses/246.

Abstract:
An evolutionary tree represents the relationships among a group of species or DNA or protein sequences, and plays a fundamental role in biological lineage research. High-quality tree construction relies heavily on optimal multiple sequence alignment (MSA), which aligns three or more sequences simultaneously to derive their similarity. On the other hand, a good tree can also be used to guide the MSA process. Because of the high computational cost of both MSA and tree construction, parallel approaches are exploited to utilize the enormous computing power and memory of a supercomputer or Linux cluster. In this thesis, first, a divide-and-conquer-based parallel algorithm is designed and implemented to perform optimal three-sequence alignment with reduced memory cost. Second, all internal nodes of a phylogenetic tree produced by parallel maximum-likelihood inference software are labeled using the parallel MSA. This tree node labeling process is carried out top-down and is also parallelized to fully utilize the numerous cores and nodes of a high-performance computing facility.
15

Hancock, Wayne Mitchell. "Towards a farming systems approach to tree nut research in Malawi." Thesis, University of Western Sydney, Hawkesbury, Faculty of Science, Technology and Agriculture, 1992. http://handle.uws.edu.au:8081/1959.7/413.

Abstract:
This thesis covers years of field work in Malawi, Africa, by the author as a Research Agronomist (Tree Nuts) for the Government of Malawi. It is an action research thesis, with core and thesis projects that are closely linked. The client group is the large estate managers who control the tree nut industries in Malawi. The political, economic and historical perspectives are different from those commonly faced by Australian agronomists, and the isolated location of the work makes this a unique study. The thesis includes sections on plantation or estate agriculture, farming systems approaches to research and problem solving, systems concepts in agricultural settings, and action research concepts. These provide a framework for the study within the constraints of the government research system and industry expectations. The body of the thesis is a review paper presented to estate managers and co-researchers after one year's work. Relevant outcomes of the study are presented, and the discussion draws the outcomes together through reflection on the process and methods used. Advantages and disadvantages are considered, and risks, such as the dangers to the researcher in this type of study, are highlighted.
Master of Science (Hons)
16

Hancock, Wayne Mitchell. "Towards a farming systems approach to tree nut research in Malawi /." View thesis, 1992. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030616.121740/index.html.

17

Needham, Donald Michael. "A formal approach to hazard decomposition in Software Fault Tree Analysis." Thesis, Monterey, California: Naval Postgraduate School, 1990. http://hdl.handle.net/10945/28230.

Abstract:
As digital control systems are used in life-critical applications, assessment of their safety becomes increasingly important. One means of formally performing this assessment is fault tree analysis. Software Fault Tree Analysis (SFTA) starts with a system-level hazard that must be decomposed, in a largely human-intensive manner, until specific modules of the software system are indicated. These modules can then be formally analyzed using statement templates. The focus of this thesis is to approach the decomposition of a system-level hazard from a formalized standpoint. Decomposition proceeds primarily along two distinct but interdependent dimensions: specificity of event and subsystem size. The specificity-of-event dimension breaks abstract or combined events into the specific system events that must be analyzed by the fault tree. The subsystem-size dimension deals with the scope of the hazard and itemizes the subsystems where localized events may lead to the hazard. Decomposition templates are developed in this thesis to provide a framework for decomposing a system-level hazard to the point at which line-by-line code analysis can be conducted with existing statement templates. These templates serve as guides for conducting the decomposition and ensure that as many of the applicable decomposition aspects as possible are evaluated.
18

Picoco, Claudia. "Integrated Framework for Representing Recoveries Using the Dynamic Event Tree Approach." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu155550242815033.

19

Johansson, Henrik. "Video Flow Classification : Feature Based Classification Using the Tree-based Approach." Thesis, Karlstads universitet, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-43012.

Abstract:
This dissertation describes a study that aims to classify video flows in Internet network traffic. Classification is done based on the characteristics of the flow, such as payload sizes and inter-arrival times, as an alternative to classifying flows based on the contents of their payload packets. Because of the increase in encrypted flows within Internet network traffic, such an alternative is a necessity. Data with known classes is fed to a machine learning classifier so that a model can be created; this model can then be used to classify new, unknown data. Two classifiers are used, namely decision trees and random forests, and several tests are completed to attain the best possible models. The results of this dissertation show that classification based on flow characteristics is possible, and the random forest classifier in particular achieves good accuracy. However, the accuracy of classifying encrypted flows could not be tested within this project.
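The feature-based idea can be made concrete with a small sketch: a flow is summarized by statistics of its payload sizes and inter-arrival times, and only this summary (never the payload contents) is handed to a decision tree or random forest. The particular feature set and the packet tuple format below are assumptions for illustration, not the dissertation's exact features.

```python
from statistics import mean, pstdev

def flow_features(packets):
    """Summarize a flow. packets: list of (timestamp_s, payload_bytes)."""
    times = [t for t, _ in packets]
    sizes = [s for _, s in packets]
    gaps = [b - a for a, b in zip(times, times[1:])]  # inter-arrival times
    return {
        "packets": len(packets),
        "mean_size": mean(sizes),
        "std_size": pstdev(sizes),
        "mean_gap": mean(gaps),
        "std_gap": pstdev(gaps),
    }
```

A trained tree then splits on thresholds of these fields (for instance, large steady payloads arriving at regular intervals suggest a video flow), which is what makes the approach applicable to encrypted traffic.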
20

Agarwal, Khushbu. "A partition based approach to approximate tree mining a memory hierarchy perspective /." Columbus, Ohio : Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1196284256.

21

Li, XuQin. "Data mining using intelligent systems : an optimized weighted fuzzy decision tree approach." Thesis, University of Warwick, 2010. http://wrap.warwick.ac.uk/38159/.

Abstract:
Data mining aims to analyze observational datasets to find relationships and to present the data in ways that are both understandable and useful. In this thesis, existing intelligent systems techniques such as the Self-Organizing Map, Fuzzy C-Means and decision trees are used to analyze several datasets, providing flexible information processing capability for handling real-life situations. The thesis is concerned with the design, implementation, testing and application of these techniques to those datasets. It also introduces a hybrid intelligent systems technique, the Optimized Weighted Fuzzy Decision Tree (OWFDT), with the aim of improving fuzzy decision trees (FDT) and solving practical problems. The thesis first proposes an optimized weighted fuzzy decision tree, incorporating Fuzzy C-Means to fuzzify the input instances while keeping the expected labels crisp. This leads to a different output layer activation function and weight connection in the neural network (NN) structure obtained by mapping the FDT to the NN. A momentum term was also introduced into the learning process to train the weight connections and avoid oscillation or divergence. A new reasoning mechanism is proposed to combine the constructed tree with the weights optimized in the learning process. The thesis also compares the OWFDT with two benchmark algorithms, Fuzzy ID3 and the weighted FDT. Six datasets ranging from material science to medical and civil engineering were introduced as case study applications: classification of composite material failure mechanisms, classification of electrocorticography (ECoG)/electroencephalogram (EEG) signals, eye bacteria prediction and wave overtopping prediction.
Different intelligent systems techniques were used to cluster the patterns and predict the classes, although OWFDT was used to design classifiers for all the datasets. For the material dataset, the Self-Organizing Map and Fuzzy C-Means were used to cluster the acoustic event signals and assign them to different failure mechanisms; after this clustering, OWFDT was used to design a classifier for the acoustic event signals. For the eye bacteria dataset, bagging was used to improve the classification accuracy of multilayer perceptrons and decision trees. Applying bootstrap aggregating (bagging) to decision trees also helped select the most important sensors (features), so that the dimensionality of the data could be reduced. The most important features were then used to grow the OWFDT, addressing the curse-of-dimensionality problem. The last dataset, concerned with wave overtopping, was used to benchmark OWFDT against other intelligent systems techniques, such as the Adaptive Neuro-Fuzzy Inference System (ANFIS), the Evolving Fuzzy Neural Network (EFuNN), the Genetic Neural Mathematical Method (GNMM) and Fuzzy ARTMAP. The analysis of these datasets shows that patterns and classes can be found or classified by combining these techniques. OWFDT has also demonstrated its efficiency and effectiveness compared with a conventional fuzzy decision tree and a weighted fuzzy decision tree.
22

Desai, Ishani. "Designing structures with tree forks: mechanical characterization and generalized computational design approach." Thesis (M. Eng.), Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/127284.

Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Civil and Environmental Engineering, May, 2020
Cataloged from the official PDF of thesis.
Includes bibliographical references (pages 80-83).
Timber structures have seen a resurgence in structural design in recent years due to a desire to reduce embodied carbon in the built environment. While many of these structures use standardized or regular elements, the recent revolution in digital fabrication has resulted in a variety of more complex and irregular timber forms, usually achieved through milling or other machine-driven production processes. However, the organic nature of wood has also inspired architects and engineers to harness naturally occurring formal variation, for example, in the geometries of tree forks and branches, to produce designs that are more directly responsive to their constitutive materials. Compared to conventional fabrication processes for timber, in which the material is often processed several times to achieve characteristics that are present in the original material, this approach embodies little waste in material and effort.
Naturally occurring branching tree forks seem to exhibit outstanding strength and material efficiency as a natural moment connection, which underpins previous research investigating their use in design. This thesis advances the use of tree forks as a natural connection in structures through two specific contributions. First, it establishes a flexible matching-based methodology for designing structures with a pre-existing library of tree fork nodes (based on actual available materials from salvaged trees, for example), balancing an initial target design, node matching quality, and structural performance. The methodology uses a combination of Iterative Closest Point and Hungarian Algorithms as a real-time computational approach for matching nodes in the library to nodes in the design.
The thesis presents results that systematically test this methodology by studying how matching quality varies depending on the number and species of tree forks available in the library and relates this back to the mechanical properties of tree branches found through physical testing. Second, mechanical laboratory testing of tree fork nodes of various tree species (available locally in the area) is presented to quantify the structural capacity of these connections and observe the behavior under tree fork load transfers. A structural score is developed to characterize the tolerance of tree fork nodes to imperfect matches in terms of structural capacity; these resulting geometries are compared to the previous matching-based scoring system. The resulting approach is projected forward as a framework for a more general computational approach for designing with existing material systems and geometries that can also be expanded beyond tree forks.
23

Zhang, Yinghua. "Stock Market Network Topology Analysis Based on a Minimum Spanning Tree Approach." Bowling Green State University / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1245347181.

24

Agarwal, Khushbu. "A partition based approach to approximate tree mining : a memory hierarchy perspective." The Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=osu1196284256.

25

COOK, EDWARD ROGER. "A TIME SERIES ANALYSIS APPROACH TO TREE RING STANDARDIZATION (DENDROCHRONOLOGY, FORESTRY, DENDROCLIMATOLOGY, AUTOREGRESSIVE PROCESS)." Diss., The University of Arizona, 1985. http://hdl.handle.net/10150/188110.

Abstract:
The problem of standardizing closed-canopy forest ringwidth series is investigated. A biological model for the tree-ring standardization problem indicated that one class of non-climatic variance frequently responsible for standardization problems could be objectively minimized in theory. This is the variance caused by endogenous stand disturbances which create fluctuations in ringwidth series that are non-synchronous or out-of-phase when viewed across trees in a stand. A time series method based on the autoregressive process is developed which minimizes the timewise influence of endogenous disturbances in detrended ringwidth series. Signal-to-noise ratio (SNR) properties of this method are derived which indicate that autoregressive modelling and prewhitening of detrended ringwidth indices will result in a higher SNR when endogenous disturbances are present in the series. This enables the verification of the SNR theory and the error variance reduction property of the standardization method.
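The prewhitening step at the heart of this approach can be reduced to a small sketch: fit an autoregressive model to a detrended ring-width index series and keep the residuals. An AR(1) fit via the lag-1 Yule-Walker estimate is a deliberate simplification of the general AR(p) modelling the dissertation develops; the example series is invented.

```python
from statistics import mean

def prewhiten_ar1(series):
    """Fit AR(1) to a detrended series and return (phi, residuals)."""
    m = mean(series)
    x = [v - m for v in series]
    # Yule-Walker estimate of the AR(1) coefficient:
    # lag-1 autocovariance divided by the variance.
    num = sum(a * b for a, b in zip(x, x[1:]))
    den = sum(a * a for a in x)
    phi = num / den
    # Residuals e_t = x_t - phi * x_{t-1} carry the "whitened" signal,
    # with the persistence shared across years removed.
    residuals = [b - phi * a for a, b in zip(x, x[1:])]
    return phi, residuals
```

Averaging such prewhitened indices across trees is what raises the signal-to-noise ratio when non-synchronous disturbance effects are present, which is the property the dissertation derives.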
26

Jones, Todd R. "SigTree: An Automated Meta-Analytic Approach to Find Significant Branches in a Phylogenetic Tree." DigitalCommons@USU, 2012. https://digitalcommons.usu.edu/etd/1314.

Abstract:
An experiment compared a whole-wheat diet and a refined-wheat diet in two treatment groups of mice. Of interest were the differences by treatment in the levels of hundreds of bacteria in the guts of the mice. It was desired to determine the statistical significance not only of the individual bacteria, but also of families of bacteria. These family relationships are represented in a phylogenetic tree, and it was found helpful to color the branches based on the significance of their corresponding families. Calculating these p-values and coloring the branches by hand would be slow, and an automated method greatly increases the efficiency of these calculations. To handle this problem, SigTree, an R package, was written. The p-values for individual bacteria (tips) are combined up the tree using meta-analysis methods, and significance is visualized on a color scale in a revised phylogenetic tree plot. SigTree can handle not only the motivating mouse diet experiment, but any experiment that falls into the general framework of having significance tests (and resulting p-values) on each tip of a phylogenetic tree.
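The combining step can be sketched as follows: under each internal node, the tip p-values are pooled with Fisher's method, whose statistic -2*sum(ln p) follows a chi-square distribution with 2m degrees of freedom. Fisher's method is one standard meta-analysis choice of the kind the abstract mentions (not necessarily SigTree's only option), and the dict-of-children tree encoding below is an assumption, not SigTree's R interface.

```python
import math

def fisher_combine(pvalues):
    """Fisher's method: X = -2 * sum(ln p) ~ chi-square with df = 2m."""
    x = -2.0 * sum(math.log(p) for p in pvalues)
    m = len(pvalues)
    # The chi-square survival function with even df 2m has a closed
    # form: exp(-x/2) * sum_{i=0}^{m-1} (x/2)^i / i!
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, m):
        term *= half / i
        total += term
    return math.exp(-half) * total

def branch_pvalues(tree, tip_p):
    """tree: dict node -> children; returns a combined p per internal node."""
    def tips_under(node):
        kids = tree.get(node, [])
        if not kids:
            return [node]
        return [t for k in kids for t in tips_under(k)]
    return {n: fisher_combine([tip_p[t] for t in tips_under(n)])
            for n in tree}
```

Each branch of the plotted tree can then be colored by the combined p-value of the internal node it leads to, which is the visualization the package automates.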
27

Howells, Michael C. "A cluster-proof approach to yield enhancement of large area binary tree architectures /." Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=66194.

28

Chang, Namsik. "Knowledge discovery in databases with joint decision outcomes: A decision-tree induction approach." Diss., The University of Arizona, 1995. http://hdl.handle.net/10150/187227.

Abstract:
Inductive symbolic learning algorithms have been used successfully over the years to build knowledge-based systems. One of these, the decision-tree induction algorithm, has formed the central component of several commercial packages because of its efficiency, simplicity, and popularity. However, the decision-tree induction algorithms developed thus far are limited to domains where each decision instance's outcome belongs to only a single decision outcome class: their goal is merely to specify the properties necessary to distinguish instances pertaining to different outcome classes. These algorithms are not readily applicable to many challenging new applications in which decision instances have outcomes belonging to more than one decision outcome class (i.e., joint decision outcomes). Furthermore, when applied to domains with a single decision outcome, these algorithms become less efficient as the number of pre-defined outcome classes increases. The objective of this dissertation is to modify previous decision-tree induction techniques so that they apply to applications with joint decision outcomes. We propose a new decision-tree induction approach called Multi-Decision-Tree Induction (MDTI). Data were collected for a patient image retrieval application in which more than one prior radiological examination would be retrieved based on characteristics of the current examination and patient status. We present empirical comparisons of the MDTI approach with the backpropagation network algorithm and the traditional knowledge-engineer-driven knowledge acquisition approach, using the same set of cases. These comparisons are made in terms of recall rate, precision rate, average number of prior examinations suggested, and understandability of the acquired knowledge.
The results show that the MDTI approach outperforms the backpropagation network algorithm and is comparable to the traditional approach in all performance measures considered, while requiring much less learning time than either approach. To gain analytical and empirical insight into MDTI, we have compared this approach with the two best-known symbolic learning algorithms (ID3 and AQ) using data domains with a single decision outcome. Analysis shows that rules generated by the MDTI approach are more general and supported by more instances in the training set, and four empirical experiments have supported these findings.
APA, Harvard, Vancouver, ISO, and other styles
29

Looney, Jerry Wayne. "The Arkansas approach to competency to stand trial : "nailing jelly to a tree" /." abstract and full text PDF (UNR users only), 2008. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1461540.

Full text
Abstract:
Thesis (M.J.S.)--University of Nevada, Reno, 2008.
"December, 2008." Includes bibliographical references (leaves 53-57). Library also has microfilm. Ann Arbor, Mich. : ProQuest Information and Learning Company, [2009]. 1 microfilm reel ; 35 mm. Online version available on the World Wide Web.
APA, Harvard, Vancouver, ISO, and other styles
30

Au, Peter King Pong. "Hierarchical tree approach to group key management using the group Diffie-Hellman protocol." Thesis, University of British Columbia, 2007. http://hdl.handle.net/2429/31736.

Full text
Abstract:
As a result of the increasing popularity of group-oriented peer-to-peer applications, there is an increasing demand for security services in this kind of environment. Key management plays an important foundation role in security services. Quite a few key management solutions have been proposed for peer groups. However, they usually have limited scalability. This paper considers the scalability of a group-oriented environment and proposes a group hierarchy solution to resolve the problem.
Science, Faculty of
Computer Science, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
31

Anchukaitis, Kevin John. "A Stable Isotope Approach to Neotropical Cloud Forest Paleoclimatology." Diss., The University of Arizona, 2007. http://hdl.handle.net/10150/195637.

Full text
Abstract:
Many tropical trees do not form reliable annual growth rings, making it a challenge to develop tree-ring width chronologies for application to paleoclimatology in these regions. Here, I seek to establish high-resolution proxy climate records from trees without rings from the Monteverde Cloud Forest in Costa Rica using stable isotope dendroclimatology. Neotropical cloud forest ecosystems are associated with a relatively narrow range of geographic and hydroclimatic conditions, and are potentially sensitive to climate variability and change at time scales from annual to centennial and longer. My approach takes advantage of seasonal changes in the δ18O of water sources used by trees over a year, a signature that is imparted to the radial growth and provides the necessary chronological control. A rapid wood extraction technique is evaluated and found to produce cellulose with δ18O values indistinguishable from conventional approaches, although its application to radiocarbon requires a statistical correction. Analyses of plantation-grown Ocotea tenera reveal coherent annual δ18O cycles of up to 9 permil. The width of these cycles corresponds to observed basal growth increments. Interannual variability in δ18O at this site is correlated with wet season precipitation anomalies. At higher elevations within the orographic cloud bank, year-to-year changes in the amplitude of oxygen isotope cycles show a relationship with dry season climate. Longer δ18O chronologies from mature Pouteria (Sapotaceae) reveal that dry season hydroclimatology is controlled at interannual time scales by variability in the eastern equatorial Pacific (ENSO) and the Western Hemisphere Warm Pool (WHWP), which are correlated with trade wind strength and local air temperature. A change in the late 1960s toward enhanced annual δ18O amplitude may reflect low-frequency changes in the Atlantic and Pacific ocean-atmosphere system. 
This study establishes the basis for cloud forest isotope dendroclimatology and demonstrates that the local climate of neotropical cloud forests is sensitive to interannual and, perhaps, multidecadal changes in important large-scale modes of climate variability.
APA, Harvard, Vancouver, ISO, and other styles
32

Iliskovic, Sinisa A. "Data mining in databases, an extended decision tree approach and methodology in database environment." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0010/NQ59976.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Honarvar, Pauline. "A spatial approach to mineral potential modelling using decision tree and logistic regression analysis /." Internet access available to MUN users only, 2001. http://collections.mun.ca/u?/theses,51228.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Wan, Mohd Jaafar Wan Shafrina Binti. "Individual tree detection and modelling above-ground biomass and forest parameters using discrete return airborne LiDAR data." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/31143.

Full text
Abstract:
Individual tree detection and modelling forest parameters using Airborne Laser Scanner (Light Detection and Ranging, LiDAR) data is becoming increasingly important for the monitoring and sustainable management of forests. Remote sensing has been a useful tool for individual tree analysis in the past decade, although inadequate spatial resolution from satellites means that only airborne systems have sufficient spatial resolution to conduct individual tree analysis. Moreover, recent advances in airborne LiDAR now provide high horizontal resolution as well as information in the vertical dimension. However, it is challenging to fully exploit and utilize small-footprint LiDAR data for detailed tree analysis. Procedures for forest biomass quantification and forest attributes measurement using LiDAR data have improved at a rapid pace as more robust and sophisticated modelling is used to improve these studies. This thesis contains an evaluation of three approaches to utilizing LiDAR data for individual tree forest measurement. The first explores the relationship between LiDAR metrics and field reference to assess the correlation between LiDAR and field data at the individual-tree level. The intention was not to detect trees automatically, but to develop a LiDAR-AGB model based on trees that were mapped in the field so as to evaluate the relationships between LiDAR-type metrics, under controlled conditions for the study sites, and field-derived AGB. A non-linear AGB model based on field data and LiDAR data was developed, and LiDAR height percentile h80 and crown width measurement (CW) were found to best fit the data, as evidenced by an Adj-R2 value of 0.63, the root mean squared error of the model of 14.8%, and analysis of the residuals. This paper provides the foundation for a predictive LiDAR-AGB model at tree level over two study sites, Pasoh Forest Reserve and FRIM Forest Reserve. 
The second part of the thesis then takes this AGB-LiDAR relationship and combines it with individual tree crown delineation. This chapter shows the contribution of performing an automatic individual tree crown delineation over the wider forest areas. The individual tree crown delineation is composed of a five-step framework, which is unique in its automated determination of dominant crown sizes in a forest area and its adaptation of the LiDAR-AGB model developed for the purpose of validating the method. This framework correctly delineated 84% and 88% of the tree crowns in the two forest study areas, which are mostly dominated by lowland dipterocarp trees. Thirdly, parametric and non-parametric modelling approaches are proposed for modelling forest structural attributes. Selected modelling methods are compared for predicting 4 forest attributes, volume (V), basal area (BA), height (Ht) and aboveground biomass (AGB), at the species level. The AGB modelling in this paper is extracted using the LiDAR-derived variables from the automated individual tree crown delineation, in contrast to the earlier AGB modelling, where it is derived based on the trees that were mapped in the field. The selected non-parametric methods included the k-nearest neighbour (k-NN) imputation methods Most Similar Neighbour (MSN) and Gradient Nearest Neighbour (GNN), and Random Forest (RF); the parametric approach was Ordinary Least Squares (OLS) regression. To compare and evaluate these approaches, a scaled root mean squared error (RMSE) between observed and predicted forest attributes sampled from both forest sites was computed. The best method varied according to response variable and performance measure. 
OLS regression was found to be the best-performing method overall, as evidenced by RMSE after cross-validation for BA (1.40 m2), V (1.03 m3), Ht (2.22 m) and AGB (96 kg/tree) respectively, showing its applicability to wider conditions, while RF produced the best overall results among the non-parametric methods tested. This thesis concludes with a discussion of the potential of LiDAR data as an independent source of important forest inventory data when combined with appropriately designed sample plots in the field, and with appropriate modelling tools.
APA, Harvard, Vancouver, ISO, and other styles
35

Idefeldt, Jim. "An applied approach to numerically imprecise decision making." Doctoral thesis, Mittuniversitetet, Institutionen för informationsteknologi och medier, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-7147.

Full text
Abstract:
Despite the fact that unguided decision making might lead to inefficient and nonoptimal decisions, decisions made at organizational levels seldom utilise decision-analytical tools. Several gaps between the decision-makers and the computer-based decision tools exist, and a main problem in managerial decision-making involves the lack of information and precise objective data, i.e. uncertainty and imprecision may be inherent in the decision situation. We believe that this problem might be overcome by providing computer-based decision tools capable of handling the uncertainty inherent in real-life decision-making. At present, nearly all decision analytic software is only able to handle precise input, and no known software is capable of handling full-scale imprecision, i.e. imprecise probabilities, values and weights, in the form of interval and comparative statements. There are, however, some theories which are able to handle some kind of uncertainty, and which deal with computational and implementational issues, but if they are never actually operationalised, they are of little real use for a decision-maker. Therefore, a natural question is how a reasonable decision-analytical framework can be built based on prevailing interval methods, thus dealing with the problems of uncertain and imprecise input. Further, will the interval approach actually prove useful? The framework presented herein handles theoretical foundations for, and implementations of, imprecise multi-level trees, multi-criteria, risk analysis, together with several different evaluation options. The framework supports interval probabilities, values, and criteria weights, as well as comparative statements, also allowing for mixing probabilistic and multi-criteria decisions. The framework has also been field tested in a number of studies, proving the usefulness of the interval approach.
APA, Harvard, Vancouver, ISO, and other styles
36

Hussain, Faheem Akhtar. "QOS Multimedia Multicast Routing: A Component Based Primal Dual Approach." Digital Archive @ GSU, 2006. http://digitalarchive.gsu.edu/cs_theses/37.

Full text
Abstract:
The QoS Steiner Tree Problem asks for the most cost-efficient way to multicast multimedia to a heterogeneous collection of users with different data consumption rates. We assume that the cost of using a link is not constant but rather depends on the maximum bandwidth routed through the link. Formally, given a graph with costs on the edges, a source node and a set of terminal nodes, each one with a bandwidth requirement, the goal is to find a Steiner tree containing the source, and the cheapest assignment of bandwidth to each of its edges so that each source-to-terminal path in the tree has bandwidth at least as large as the bandwidth required by the terminal. Our main contributions are: (1) a new flow-based integer linear program formulation for the problem; (2) the first implementation of the 4.311-factor primal-dual constant-factor approximation algorithm; and (3) an extensive experimental study of the new heuristics and of several previously proposed algorithms.
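To make the cost model concrete, here is a small illustrative sketch (hypothetical code, not from the thesis) that evaluates the cost of a candidate multicast tree under the common assumption that an edge's cost scales linearly with the largest bandwidth routed through it:

```python
# Hypothetical sketch: evaluate a candidate QoS multicast tree.
# Assumed cost model: cost of an edge = base cost x the maximum
# bandwidth routed through it (the thesis may use a different model).

def tree_cost(children, edge_cost, rates, source):
    """children: node -> list of child nodes (tree rooted at the source).
    edge_cost: (parent, child) -> base cost per unit of bandwidth.
    rates: terminal -> required bandwidth (Steiner nodes omitted)."""
    def visit(u):
        # Bandwidth on the edge into u = largest demand in u's subtree.
        need, cost = rates.get(u, 0), 0.0
        for v in children.get(u, []):
            sub_need, sub_cost = visit(v)
            cost += sub_cost + edge_cost[(u, v)] * sub_need
            need = max(need, sub_need)
        return need, cost
    return visit(source)[1]

# Source s serves terminals a (rate 2) and b (rate 5) via Steiner node x.
children = {"s": ["x"], "x": ["a", "b"]}
edge_cost = {("s", "x"): 1.0, ("x", "a"): 2.0, ("x", "b"): 1.0}
rates = {"a": 2, "b": 5}
print(tree_cost(children, edge_cost, rates, "s"))  # -> 14.0 (edge s-x carries 5)
```

Because each source-to-terminal path must carry at least the terminal's rate, the bandwidth on an edge is the maximum demand in the subtree below it, which is what the recursion computes.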
APA, Harvard, Vancouver, ISO, and other styles
37

Evans, Liam. "Experience-based decision support methodology for manufacturing technology selection : a fuzzy-decision-tree mining approach." Thesis, University of Nottingham, 2013. http://eprints.nottingham.ac.uk/13719/.

Full text
Abstract:
Manufacturing companies must invest in new technologies and processes to succeed in a rapidly changing global environment. Managers face the difficulty of justifying capital investment in adopting new, state-of-the-art technology. Technology investment accounts for a large part of capital spending and is a key form of improving competitive advantage. Typical approaches focus on the expected return on investment and financial reward gained from the implementation of such equipment. With an increasingly dynamic market environment and global economic model, forecasting of financial payback can be argued to become increasingly less accurate. Subsequently, less quantifiable factors are becoming increasingly important. For example, the alignment of a technology with an organisation's objectives to fulfil future potential and gain competitive advantage is becoming as crucial as economic evaluation. In addition, the impact on human operators and the skill level required must be considered. This research was motivated by the lack of decision methodologies that understand why a technology is more successful within an environment rather than re-examining the underlying performance attributes of a technology. The aim is to create a common approach where both experts and non-experts can use historical decision information to support the evaluation and selection of an optimal manufacturing technology. This form of approach is based on the logic in which a decision maker would recall previous decisions to identify relationships with new problem cases. The work investigates data mining and machine learning techniques to discover the underlying influences to improve technology selection under a set of dynamic factors. The approach initially discovers the practices by which an expert would conduct the selection of a manufacturing technology within industry. A defined understanding of the problem and techniques was subsequently concluded. 
This led to an understanding of the structure by which historical decision information is recalled by an expert to support new selection problems. The key attributes in the representation of a case were apparent and a form of characterising tangible and intangible variables was justified. This led to the development of a novel, experience-based manufacturing technology selection framework using fuzzy-decision-trees. The methodology is an iterative approach of learning from previously implemented technology cases. Rules and underlying knowledge of the relationships in past cases predict the outcome of new decision problems. The link of information from a multitude of historical cases may identify those technologies with technical characteristics that perform optimally for projects with unique requirements. This also indicates the likelihood of technologies performing successfully based on the project requirements. Historical decision cases are represented through original project objectives, technical performance attributes of the chosen technology and judged project performance. The framework was shown to provide a comprehensive foundation for decision support that reduces the uncertainty and subjective influence within the selection process. The model was developed with industrial guidance to represent the actions of a manufacturing expert. The performance of the tool was measured by industrial experts. The approach was found to represent well the decision logic of a human expert based on their developed experience through cases. The application to an industrial decision case study demonstrated encouraging results and showed that use by decision makers is feasible. The model reduces the subjectivity in the process by using case information that is formed from multiple experts of a prior decision case. The model is applied in a shorter time period than existing practices and the ranking of potential solutions is well aligned to the understanding of a decision maker. 
To summarise, this research highlights the importance of focusing on less quantifiable factors and the performance of a technology to a specific problem/environment. The arrangement of case information thus represents the experience an expert would acquire and recall as part of the decision process.
APA, Harvard, Vancouver, ISO, and other styles
38

Zwack, Mathew R. "CONTRAST: A conceptual reliability growth approach for comparison of launch vehicle architectures." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/53095.

Full text
Abstract:
In 2004, the NASA Astronaut Office produced a memo regarding the safety of next generation launch vehicles. The memo requested that these vehicles have a probability of loss of crew (LOC) of at most 1 in 1000 flights, which represents nearly an order of magnitude decrease from current vehicles. The goal of LOC of 1 in 1000 flights has since been adopted by the launch vehicle design community as a requirement for the safety of future vehicles. This research addresses the gap between current vehicles and future goals by improving the capture of vehicle architecture effects on reliability and safety. Vehicle architecture pertains to the physical description of the vehicle itself, which includes manned or unmanned, number of stages, number of engines per stage, engine cycle types, redundancy, etc. During the operations phase of the vehicle life-cycle it is clear that each of these parameters will have an inherent effect on the reliability and safety of the vehicle. However, the vehicle architecture is typically determined during the early conceptual design phase when a baseline vehicle is selected. Unless a great amount of money and effort is spent, the architecture will remain relatively constant from conceptual design through operations. Due to the fact that the vehicle architecture is essentially “locked-in” during early design, it is expected that much of the vehicle's reliability potential will also be locked-in. This observation leads to the conclusion that improvement of vehicle reliability and safety in the area of vehicle architecture must be completed during early design. Evaluation of the effects of different architecture decisions must be performed prior to baseline selection, which helps to identify a vehicle that is most likely to meet the reliability and safety requirements when it reaches operations. 
Although methods exist for evaluating reliability and safety during early design, weaknesses exist when trying to evaluate all architecture effects simultaneously. The goal of this research was therefore to formulate and implement a method that is capable of quantitatively evaluating vehicle architecture effects on reliability and safety during early conceptual design. The ConcepTual Reliability Growth Approach for CompariSon of Launch Vehicle ArchiTectures (CONTRAST) was developed to meet this goal. Using the strengths of existing techniques a hybrid approach was developed, which utilizes a reliability growth projection to evaluate the vehicles. The growth models are first applied at the subsystem level and then a vehicle level projection is generated using a simple system level fault tree. This approach allows for the capture of all trades of interest at the subsystem level as well as many possible trades at the assembly level. The CONTRAST method is first tested on an example problem, which compares the method output to actual data from the Space Transportation System (STS). This example problem illustrates the ability of the CONTRAST method to capture reliability growth trends seen during vehicle operations. It also serves as a validation for the development of the reliability growth model assumptions for future applications of the method. The final chapter of the thesis applies the CONTRAST method to a relevant launch vehicle, the Space Launch System (SLS), which is currently under development. Within the application problem, the output of the method is first used to check that the primary research objective has been met. Next, the output is compared to a state-of-the-art tool in order to demonstrate the ability of the CONTRAST method to alleviate one of the primary consequences of using existing techniques. The final section within this chapter presents an analysis of the booster and upper stage block upgrade options for the SLS vehicle. 
A study of the upgrade options was carried out because the CONTRAST method is uniquely suited to look at the effects of such strategies. The results from the study of SLS block upgrades give interesting observations regarding the desired development order and upgrade strategy. Ultimately this application problem demonstrates the merits of applying the CONTRAST method during early design. This approach provides the designer with more information in regard to the expected reliability of the vehicle, which will ultimately enable the selection of a vehicle baseline that is most likely to meet the future requirements.
APA, Harvard, Vancouver, ISO, and other styles
39

Zhou, Steven. "A Novel Approach to Iris Localization and Code Matching for Iris Recognition." NSUWorks, 2009. http://nsuworks.nova.edu/gscis_etd/346.

Full text
Abstract:
In recent years, computing power and biometric sensors have not only become more powerful, but also more affordable to the general public. In turn, there has been great interest in developing and deploying biometric personal ID systems. Unlike the conventional security systems that often require people to provide artificial identification for verification, i.e. passwords or algorithmically generated keys, biometric security systems use an individual's biometric measurements, including fingerprint, face, hand geometry, and iris. It is believed that these measurements are unique to the individual, making them much more reliable and less likely to be stolen, lost, forgotten, or forged. Among these biometric measurements, the iris is regarded as one of the most reliable and accurate security approaches because it is an internal organ protected by the body's own biological mechanisms. It is easy to access, and almost impossible to modify without the risk of damaging the iris. Although there have been significant advancements in developing iris-based identification processes during recent years, there remains significant room for improvement. This dissertation presents a novel approach to iris localization and code matching. It uses a fixed diameter method and a parabolic curve fitting approach for locating the iris and eyelids, as well as a k-d tree for iris matching. The iris recognition rate is improved by accurately locating the eyelids and eliminating the signal noise in an eye image. Furthermore, the overall system performance is increased significantly by using a partial iris image and taking advantage of the k-d binary tree. We present the research results of four processing stages of iris recognition: localization, normalization, feature extraction, and code matching. The localization process is based on histogram analysis, morphological processing, Canny edge detection, and parabolic curve fitting. 
The normalization process adopts Daugman's rubber-sheet approach and converts the iris image from Cartesian coordinates to polar coordinates. In the feature extraction process, the feature vectors are created and quantized using a 1-D Log-Gabor wavelet. Finally, the iris code matching process is conducted using a k-dimensional binary tree and Hamming distance.
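The code-matching step described here compares binary iris codes with a Hamming distance restricted to reliable bits; a minimal illustrative sketch (not the dissertation's implementation) follows:

```python
# Illustrative sketch: fractional Hamming distance between two binary
# iris codes, counting only bits that both noise masks flag as reliable.

def hamming_distance(code_a, code_b, mask_a, mask_b):
    usable = [i for i in range(len(code_a)) if mask_a[i] and mask_b[i]]
    if not usable:
        return 1.0  # nothing comparable: treat as maximally distant
    mismatches = sum(code_a[i] != code_b[i] for i in usable)
    return mismatches / len(usable)

a = [1, 0, 1, 1, 0, 0, 1, 0]
b = [1, 0, 0, 1, 0, 1, 1, 0]
m = [1, 1, 1, 1, 1, 1, 0, 0]  # last two bits occluded (eyelid/noise)
print(hamming_distance(a, b, m, m))  # 2 mismatches over 6 usable bits
```

A distance below a fixed threshold declares a match; the k-d binary tree's role is to narrow the set of stored codes against which this comparison must be run.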
APA, Harvard, Vancouver, ISO, and other styles
40

McKinley, Nathan D. "A Decision Theoretic Approach to Natural Language Generation." Case Western Reserve University School of Graduate Studies / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=case1386188714.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Mauricio-Sanchez, David, Andrade Lopes Alneu de, and higuihara Juarez Pedro Nelson. "Approaches based on tree-structures classifiers to protein fold prediction." Institute of Electrical and Electronics Engineers Inc, 2017. http://hdl.handle.net/10757/622536.

Full text
Abstract:
The full text of this work is not available in the UPC Academic Repository due to restrictions imposed by the publisher.
Protein fold recognition is an important task in the biological area. Different machine learning methods, such as multiclass classifiers, one-vs-all and ensembles of nested dichotomies, have been applied to this task and, in most cases, multiclass approaches were used. In this paper, we compare classifiers organized in tree structures to classify folds. We used a benchmark dataset containing 125 features to predict folds, comparing different supervised methods and achieving 54% accuracy. An approach based on tree-structured classifiers obtained better results in comparison with a hierarchical approach.
Peer reviewed
APA, Harvard, Vancouver, ISO, and other styles
42

Biondi, Franco, and Fares Qeadan. "A Theory-Driven Approach To Tree-Ring Standardization: Defining The Biological Trend From Expected Basal Area Increment." Tree-Ring Society, 2008. http://hdl.handle.net/10150/622585.

Full text
Abstract:
One of the main elements of dendrochronological standardization is removing the biological trend, i.e. the progressive decline of ring width along a cross-sectional radius that is caused by the corresponding increase in stem size and tree age over time. The ‘‘conservative’’ option for removing this biological trend is to fit a modified negative exponential curve (or a straight line with slope ≤ 0) to the ring-width measurements. This method is based on the assumption that, especially for open-grown and/or shade-intolerant species, annual growth rate of mature trees fluctuates around a specific level, expressed by a constant ring width. Because this method has numerical and conceptual drawbacks, we propose an alternative approach based on the assumption that constant growth is expressed by a constant basal area increment distributed over a growing surface. From this starting point, we derive a mathematical expression for the biological trend of ring width, which can be easily calculated and used for dendrochronological standardization. The proposed C-method is compared to other standardization techniques, including Regional Curve Standardization (RCS), of tree-ring width from ponderosa pines (Pinus ponderosa Douglas ex P.Lawson & C.Lawson) located at the Gus Pearson Natural Area (GPNA) in northern Arizona, USA. Master ring-index chronologies built from ring area, RCS, and C-method reproduced stand-wide patterns of tree growth at the GPNA, whereas other standardization options, including the ‘‘conservative’’ one, failed to do so. In addition, the C-method has the advantage of calculating an expected growth curve for each tree, whereas RCS is based on applying the same growth curve to all trees. 
In conclusion, the C-method replaces the purely empirical ‘‘conservative’’ option with a theory-based approach, which is applicable to individual ring-width measurement series, does not require fitting a growth curve using nonlinear regression, and can be rigorously tested for improving tree-ring records of environmental changes.
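The expected biological trend implied by a constant basal area increment can be sketched as follows (my reading of the abstract's assumption, not the authors' code): if each ring adds a fixed basal area B, the stem radius after t rings is sqrt(B*t/pi), so the expected ring width declines roughly as 1/sqrt(t):

```python
import math

# Sketch of a constant-basal-area-increment trend: with annual area
# increment B, cumulative basal area after t rings is B*t, giving
# radius sqrt(B*t/pi) and expected ring width r(t) - r(t-1).

def expected_ring_widths(n_rings, B):
    r = lambda t: math.sqrt(B * t / math.pi)
    return [r(t) - r(t - 1) for t in range(1, n_rings + 1)]

def ring_indices(measured_widths, B):
    # Standardized series: measured width / biologically expected width.
    expected = expected_ring_widths(len(measured_widths), B)
    return [m / e for m, e in zip(measured_widths, expected)]

widths = expected_ring_widths(5, B=10.0)
print([round(w, 3) for w in widths])  # widths decline as the stem grows
```

Dividing each measured ring width by this expected curve removes the geometric decline while preserving year-to-year climate signal, which is the substitution the C-method makes for the modified negative exponential fit.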
APA, Harvard, Vancouver, ISO, and other styles
43

Holbrook, Kimberly Mae. "Seed dispersal limitation in a neotropical nutmeg, Virola flexuosa (Myristicaceae) an ecological and genetic approach /." Diss., St. Louis, Mo. : University of Missouri--St. Louis, 2006. http://etd.umsl.edu/r1741.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Koerner, Shannon D., Henri D. Grissino-Mayer, Lynne P. Sullivan, and Georgina G. DeWeese. "A Dendroarchaeological Approach To Mississippian Culture Occupational History In Eastern Tennessee, U.S.A." Tree-Ring Society, 2009. http://hdl.handle.net/10150/622592.

Full text
Abstract:
We investigated the potential for using long-archived wood samples extracted from archaeological contexts at four Mississippian Period (AD 900–1600) settlements in eastern Tennessee for tree-ring dating purposes. Sixteen wood samples recovered from prehistoric sites were analyzed to: (1) crossmatch samples from each site with the intent of determining the relative chronological order of sites, (2) establish a floating prehistoric tree-ring chronology for eastern Tennessee, (3) determine the applicability of dendrochronology in prehistoric archaeology in eastern Tennessee, and (4) establish a strategy for future research in the region. We succeeded in crossmatching only three of the 16 tree-ring sequences against each other, representing two sites relatively close to each other: Upper Hampton and Watts Bar Reservoir. The average interseries correlation of these three samples was 0.74 with an average mean sensitivity of 0.26, and they were used to create a 131-year-long floating chronology. The remaining samples contained too few rings (15 to 43) for conclusive crossmatching. Our results demonstrate that dendrochronological techniques may be applied to the practice of prehistoric archaeology in the Southeastern U.S., but highlight the challenges that face dendroarchaeologists: (1) poor wood preservation at prehistoric sites, (2) too few rings in many samples, (3) the lack of a reference chronology long enough for absolute dating, and (4) the lack of a standard on-site sampling protocol to ensure the fragile wood samples remain intact.
APA, Harvard, Vancouver, ISO, and other styles
45

Murtha, Justin Fortna. "An Evidence Theoretic Approach to Design of Reliable Low-Cost UAVs." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/33762.

Full text
Abstract:
Small unmanned aerial vehicles (SUAVs) are plagued by alarmingly high failure rates. Because these systems are small and built at lower cost than full-scale aircraft, high quality components and redundant systems are often eschewed to keep production costs low. This thesis proposes a process to "design in" reliability in a cost-effective way. Fault Tree Analysis is used to evaluate a system's (un)reliability and Dempster-Shafer Theory (Evidence Theory) is used to deal with imprecise failure data. Three unique sensitivity analyses highlight the most cost-effective improvement for the system by either spending money to research a component and reduce uncertainty, swapping a component for a higher quality alternative, or adding redundancy to an existing component. A MATLAB® toolbox has been developed to assist in practical design applications. Finally, a case study illustrates the proposed methods by improving the reliability of a new SUAV design: Virginia Tech's SPAARO UAV.
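A minimal sketch of the interval idea (an assumed model for illustration, not the thesis's MATLAB toolbox): imprecise component failure probabilities, given as intervals, are propagated through fault-tree gates under an independence assumption:

```python
# Assumed illustration: propagate interval-valued failure probabilities
# through fault-tree gates, assuming independent component failures.

def and_gate(*intervals):
    # AND gate fails only if every input fails.
    lo = hi = 1.0
    for l, h in intervals:
        lo *= l
        hi *= h
    return lo, hi

def or_gate(*intervals):
    # OR gate fails if at least one input fails: 1 - P(all survive).
    all_ok_hi = all_ok_lo = 1.0
    for l, h in intervals:
        all_ok_hi *= 1 - l   # most optimistic survival
        all_ok_lo *= 1 - h   # least optimistic survival
    return 1 - all_ok_hi, 1 - all_ok_lo

# Top event: motor fails, OR both redundant batteries fail.
motor = (0.01, 0.02)
battery = (0.10, 0.20)
lo, hi = or_gate(motor, and_gate(battery, battery))
print(round(lo, 4), round(hi, 4))  # -> 0.0199 0.0592
```

The lower/upper bounds mirror the belief/plausibility pair of Evidence Theory; a sensitivity analysis can then ask which component's interval, if narrowed or improved, tightens the system-level bound the most.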
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
46

Avram, Florin, and Dimitris J. Bertsimas. "The Minimum Spanning Tree Constant in Geometrical Probability and Under the Independent Model; A Unified Approach." Massachusetts Institute of Technology, Operations Research Center, 1990. http://hdl.handle.net/1721.1/5189.

Full text
Abstract:
Given n uniformly and independently distributed points in the d-dimensional cube of unit volume, it is well established that the length of the minimum spanning tree on these n points is asymptotic to beta_MST(d) * n^((d-1)/d), where the constant beta_MST(d) depends only on the dimension d. It has been a major open problem to determine the constant beta_MST(d). In this paper we obtain an exact expression for the constant beta_MST(d) as a series expansion. Truncating the expansion after a finite number of terms yields a sequence of lower bounds; the first 3 terms give a lower bound which is already very close to the empirically estimated value of the constant. Our proof technique unifies the derivation of the MST asymptotic behavior for the Euclidean and the independent models.
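The asymptotic constant this abstract concerns is easy to probe numerically. The sketch below (not from the paper) draws uniform points in the unit square and estimates the constant as L(n) / n^((d-1)/d), using a plain O(n^2) Prim's algorithm; the point count and seed are arbitrary choices.

```python
import random

def mst_length(points):
    """Total Euclidean length of the MST (Prim's algorithm, O(n^2))."""
    n = len(points)
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    in_tree = [False] * n
    best = [float("inf")] * n   # cheapest edge from the tree to each node
    best[0] = 0.0
    total = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        total += best[u]
        for v in range(n):
            if not in_tree[v]:
                best[v] = min(best[v], dist(points[u], points[v]))
    return total

random.seed(0)
d, n = 2, 400   # dimension and sample size, chosen arbitrarily
pts = [tuple(random.random() for _ in range(d)) for _ in range(n)]
beta_hat = mst_length(pts) / n ** ((d - 1) / d)
# For d = 2, empirical estimates reported in the literature are near 0.65.
```

Averaging such estimates over many samples and larger n is how the "empirically estimated value of the constant" mentioned above is obtained; the paper's series expansion bounds it analytically instead.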
APA, Harvard, Vancouver, ISO, and other styles
47

Lee, Justin Lance. "Participation and pressure in the Mist Kingdom of Sumba : a local NGO's approach to tree-planting /." Title page, contents and abstract only, 1995. http://web4.library.adelaide.edu.au/theses/09PH/09phl4781.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Mudunuru, Venkateswara Rao. "Modeling and Survival Analysis of Breast Cancer: A Statistical, Artificial Neural Network, and Decision Tree Approach." Scholar Commons, 2016. http://scholarcommons.usf.edu/etd/6120.

Full text
Abstract:
Survival analysis today is widely implemented in the fields of medical and biological sciences, social sciences, econometrics, and engineering. The basic principle behind survival analysis is a statistical approach designed to account for the amount of time elapsed during a study period, or the time between entry into observation and a subsequent event. The event of interest pertains to death, and the analysis consists of following the subject until death. Events or outcomes are defined by a transition from one discrete state to another at an instantaneous moment in time. In recent years, research in the area of survival analysis has increased greatly because of its wide usage in areas related to the biosciences and pharmaceutical studies. After identifying the probability density function that best characterizes the tumors and survival times of women with breast cancer, one purpose of this research is to compare the efficiency of competing estimators of the survival function. Our study includes evaluation of parametric, semi-parametric, and nonparametric probability survival models. Artificial Neural Networks (ANNs), recently applied to a number of clinical, business, forecasting, time series prediction, and other applications, are computational systems consisting of artificial neurons called nodes arranged in different layers with interconnecting links. The main interest in neural networks comes from their ability to approximate complex nonlinear functions. Among the wide range of available neural networks, most research is concentrated on feed-forward neural networks called multilayer perceptrons (MLPs). One of the important components of an artificial neural network (ANN) is the activation function. This work discusses properties of activation functions in multilayer neural networks applied to breast cancer stage classification. There are a number of common activation functions in use with ANNs.
The main objective in this work is to compare and analyze the performance of MLPs trained with the back-propagation algorithm using various activation functions for the neurons of the hidden and output layers, evaluating their performance on stage classification of breast cancer data. Survival analysis can be considered a classification problem in which the application of machine-learning methods is appropriate: by establishing meaningful intervals of time according to a particular situation, survival analysis can easily be cast as a classification problem. Survival analysis methods deal with waiting time, i.e., the time until occurrence of an event. A commonly used method to classify this sort of data is logistic regression; sometimes, however, the underlying assumptions of the model do not hold. In model building, choosing an appropriate model depends on the complexity and characteristics of the data that affect the appropriateness of the model. Two strategies frequently used nowadays are artificial neural networks (ANNs) and decision trees (DTs), which require minimal assumptions. DTs and ANNs are widely used methodological tools based on nonlinear models. They provide better prediction and classification results than traditional methodologies such as logistic regression. This study aimed to compare predictions of ANN, DT, and logistic models of breast cancer survival. In this work our goal is to design models, using both artificial neural networks and logistic regression, that can precisely predict the output (survival) of breast cancer patients. Finally, we compare the performances of these models using receiver operating characteristic (ROC) analysis.
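The activation functions compared in work like this can be sketched directly. The snippet below is not the dissertation's model: the inputs, weights, and bias are arbitrary illustrative values, and a real MLP stacks many such neurons and trains the weights by back-propagation, but it shows how the choice of activation changes a single neuron's output.

```python
import math

def sigmoid(x):
    """Logistic activation, squashes to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """Rectified linear unit, zero for negative inputs."""
    return max(0.0, x)

def neuron(inputs, weights, bias, act):
    """One neuron: weighted sum of inputs plus bias, passed through act."""
    return act(sum(w * v for w, v in zip(weights, inputs)) + bias)

# Arbitrary illustrative inputs and weights for a three-input neuron.
x = [0.5, -1.2, 0.3]
w = [0.8, 0.1, -0.4]
outputs = {act.__name__: neuron(x, w, 0.2, act)
           for act in (sigmoid, math.tanh, relu)}
```

The same pre-activation value (here 0.36) maps to different outputs under each function, which is why the choice of hidden- and output-layer activation affects classification performance.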
APA, Harvard, Vancouver, ISO, and other styles
49

"A constrained steiner tree approach for reconstructions of multicast trees." 2004. http://library.cuhk.edu.hk/record=b5891874.

Full text
Abstract:
Sun Tong.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2004.
Includes bibliographical references (leaves 77-81).
Abstracts in English and Chinese.
Chinese Abstract --- p.I
Abstract --- p.II
Acknowledgements --- p.III
List of Contents --- p.IV
List of Figures --- p.VII
Chapter Chapter 1 --- Introduction --- p.1
Chapter 1.1 --- Multicast Routing Problem --- p.1
Chapter 1.2 --- Constrained multicast routing problem and SSRA algorithm --- p.4
Chapter 1.3 --- Thesis organization --- p.7
Chapter Chapter 2 --- Constrained Multicast Routing Algorithms --- p.8
Chapter 2.1 --- Steiner tree heuristic --- p.8
Chapter 2.1.1 --- Shortest Paths Heuristic --- p.9
Chapter 2.1.2 --- Distance Network Heuristic --- p.10
Chapter 2.2 --- Review of existing constrained multicast routing algorithms --- p.10
Chapter 2.2.1 --- Static group member --- p.10
Chapter 2.2.2 --- Dynamic group member --- p.14
Chapter 2.2.2.1 --- Non-rearrangeable --- p.15
Chapter 2.2.2.2 --- Rearrangeable --- p.23
Chapter Chapter 3 --- Small Scale Rearrangement Algorithm for Multicast Routing --- p.32
Chapter 3.1 --- Problem formulation --- p.32
Chapter 3.1.1 --- Network Model --- p.32
Chapter 3.1.2 --- Problem Specification --- p.33
Chapter 3.1.3 --- Definitions and Notations --- p.36
Chapter 3.2 --- Local Checking Scheme (LCS) --- p.37
Chapter 3.3 --- Small Scale Rearrangement Algorithm (SSRA) for Multicast Routing --- p.41
Chapter 3.3.1 --- Static group membership --- p.42
Chapter 3.3.2 --- Dynamic group membership --- p.43
Chapter 3.3.2.1 --- Node addition --- p.44
Chapter 3.3.2.2 --- Node removal --- p.44
Chapter 3.3.2.3 --- General steps --- p.45
Chapter 3.3.2.4 --- Example --- p.47
Chapter Chapter 4 --- Analysis --- p.50
Chapter Chapter 5 --- Simulations --- p.54
Chapter 5.1 --- Simulation Model --- p.54
Chapter 5.2 --- Simulation Parameters --- p.56
Chapter 5.3 --- Performance Metrics --- p.58
Chapter 5.4 --- Discussion of Results --- p.59
Chapter 5.4.1 --- Group 1: static group membership --- p.59
Chapter 5.4.2 --- Group 2: dynamic group membership --- p.63
Chapter 5.4.3 --- Comparison --- p.69
Chapter 5.5 --- Implementation Issue --- p.73
Chapter Chapter 6 --- Conclusion --- p.75
Reference --- p.77
APA, Harvard, Vancouver, ISO, and other styles
50

Kabir, Sohag, K. Aslansefat, I. Sorokos, Y. Papadopoulos, and Savas Konur. "A hybrid modular approach for dynamic fault tree analysis." 2020. http://hdl.handle.net/10454/17983.

Full text
Abstract:
Yes
Over the years, several approaches have been developed for the quantitative analysis of dynamic fault trees (DFTs). These approaches have strong theoretical and mathematical foundations; however, they appear to suffer from state-space explosion and high computational requirements, compromising their efficacy. Modularisation techniques have been developed to address these issues by identifying and quantifying static and dynamic modules of the fault tree separately, using binary decision diagrams and Markov models. Although these approaches appear effective in reducing computational effort and avoiding state-space explosion, the reliance of the Markov chain on exponentially distributed data for system components can limit their widespread industrial application. In this paper, we propose a hybrid modularisation scheme where independent sub-trees of a DFT are identified and quantified in hierarchical order. A hybrid framework combining algebraic solution, Petri nets, and Monte Carlo simulation is used to increase the efficiency of the solution. The proposed approach uses the advantages of each existing approach in the right place (independent module). We have experimented with the proposed approach on five independent hypothetical and industrial examples; the experiments show the capability of the proposed approach to handle repeated basic events and non-exponential failure distributions. The proposed approach can provide an approximate solution to DFTs without unacceptable loss of accuracy. Moreover, the use of modularised or hierarchical Petri nets makes this approach more generally applicable by allowing quantitative evaluation of DFTs with a wide range of failure-rate distributions for the basic events of the tree.
This work was supported in part by the Dependability Engineering Innovation for Cyber Physical Systems (CPS) (DEIS) H2020 Project under Grant 732242, and in part by the LIVEBIO: Light-weight Verification for Synthetic Biology Project under Grant EPSRC EP/R043787/1.
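The non-exponential case that motivates the simulation component of such hybrid frameworks can be illustrated with a minimal sketch. This is not the paper's framework: the gate, Weibull parameters, and mission time below are invented. It estimates the failure probability of a priority-AND (PAND) gate, a dynamic module that a Markov chain cannot handle once lifetimes are Weibull rather than exponential, by Monte Carlo simulation.

```python
import random

def pand_failure_prob(mission_time, trials=100_000, seed=1):
    """Monte Carlo estimate of P(A fails, then B fails, within the mission).

    Basic events A and B have Weibull lifetimes; the PAND gate fires only
    when A fails before B and both fail before mission_time.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        t_a = rng.weibullvariate(500.0, 1.5)   # scale, shape: hypothetical
        t_b = rng.weibullvariate(800.0, 2.0)
        if t_a <= t_b <= mission_time:         # PAND ordering condition
            failures += 1
    return failures / trials

p = pand_failure_prob(1000.0)
```

In a modularised scheme, an estimate like this for one dynamic sub-tree would be fed upward as a basic-event probability of the parent module, which is what lets the static remainder of the tree be solved algebraically.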
APA, Harvard, Vancouver, ISO, and other styles