
Dissertations / Theses on the topic 'True costs'

Consult the top 50 dissertations / theses for your research on the topic 'True costs.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Becker, Udo J., Thilo Becker, and Julia Gerlach. "The True Costs of Automobility: External Costs of Cars Overview on existing estimates in EU-27." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-216656.

Full text
Abstract:
Mobility and transport, especially car traffic, are indispensable parts of life. Cars undoubtedly provide large private benefits. However, cars also generate so-called external effects: costs that drivers impose on otherwise uninvolved third parties. These are costs from noise and exhaust emissions, from uncovered accident costs, from the resulting climate damage, from "up- and downstream" effects, and from other environmental costs. These costs are borne not by the user but by other people, other countries, and other generations. Ultimately, this means that the costs of car use in the EU-27 are too low, which necessarily leads to economically inefficient choices: cars are used more often than they would be under an efficient allocation. This is one reason why our cities have so much congestion, such high exhaust and noise emissions, such high taxes to compensate for the damage, and such high health-insurance contributions. Based on all available studies and on the methodology of the EU Commission's handbook ("IMPACT"), the study produced estimates of the uncovered costs for the 27 EU member states. Cost rates for noise, air pollution, accidents, and up-/downstream effects were adopted from the studies by CE Delft, Infras, and Fraunhofer (2011). The methodology is described in the text and reflects the state of the art; for climate damage, a higher rate and a range (low scenario: 72 €/t CO2; high scenario: 252 €/t CO2) were chosen.
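The low and high climate-damage scenarios quoted in the abstract amount to a simple per-tonne calculation; the sketch below illustrates it. Only the 72 and 252 €/t rates come from the abstract — the emissions figure is a made-up placeholder.

```python
def climate_damage_eur(co2_tonnes, rate_eur_per_tonne):
    """External climate cost in EUR for a given quantity of CO2."""
    return co2_tonnes * rate_eur_per_tonne

LOW_RATE = 72    # EUR per tonne CO2 (low scenario, from the abstract)
HIGH_RATE = 252  # EUR per tonne CO2 (high scenario, from the abstract)

annual_emissions_t = 1_000.0  # hypothetical fleet emissions in tonnes
print(climate_damage_eur(annual_emissions_t, LOW_RATE))   # 72000.0
print(climate_damage_eur(annual_emissions_t, HIGH_RATE))  # 252000.0
```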
APA, Harvard, Vancouver, ISO, and other styles
2

Becker, Udo J., Thilo Becker, and Julia Gerlach. "The True Costs of Automobility: External Costs of Cars Overview on existing estimates in EU-27." Technische Universität Dresden, 2012. https://tud.qucosa.de/id/qucosa%3A30084.

Full text
Abstract:
Mobility and transport, especially car traffic, are indispensable parts of life. Cars undoubtedly provide large private benefits. However, cars also generate so-called external effects: costs that drivers impose on otherwise uninvolved third parties. These are costs from noise and exhaust emissions, from uncovered accident costs, from the resulting climate damage, from "up- and downstream" effects, and from other environmental costs. These costs are borne not by the user but by other people, other countries, and other generations. Ultimately, this means that the costs of car use in the EU-27 are too low, which necessarily leads to economically inefficient choices: cars are used more often than they would be under an efficient allocation. This is one reason why our cities have so much congestion, such high exhaust and noise emissions, such high taxes to compensate for the damage, and such high health-insurance contributions. Based on all available studies and on the methodology of the EU Commission's handbook ("IMPACT"), the study produced estimates of the uncovered costs for the 27 EU member states. Cost rates for noise, air pollution, accidents, and up-/downstream effects were adopted from the studies by CE Delft, Infras, and Fraunhofer (2011). The methodology is described in the text and reflects the state of the art; for climate damage, a higher rate and a range (low scenario: 72 €/t CO2; high scenario: 252 €/t CO2) were chosen.
APA, Harvard, Vancouver, ISO, and other styles
3

Forslind, Maja. "Finding the Dollar Language : Drivers and rationales for monetising corporate environmental and social impacts– practices in counting the true value of business operation from ecosystem services perspective." Thesis, Stockholms universitet, Stockholm Resilience Centre, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-85855.

Full text
Abstract:
The thesis explores how monetisation of corporate externalities can be carried out in order to provide investors, policy makers and consumers with an accurate picture of the true costs and benefits of business operations from a resilience and ecosystem services perspective. Drawing conclusions from company cases and previous research, the methods, drivers and monetary values of impacts such as carbon dioxide, water usage, pollutants and land use are analysed. The findings reflect the opportunities that monetisation opens up as a tool for guidance and support in internal corporate decision-making, by making actual impacts visible and understandable. Findings from company cases show that monetising corporate effects has the potential to visualise impacts and to add knowledge that may close information gaps both internally and externally. It can guide and facilitate strategic choices at the corporate level, and it may help bridge information asymmetries in the picture of a firm's operation presented to consumers and investors. Monetising effects may also facilitate the identification of risks arising from ecosystem-service dependencies, by visualising actual impacts as assessed costs of losses in ecosystems' production (e.g. yields) caused by corporate harm. By providing policy makers with relevant information on obstacles and on where regulatory incentives are needed, and investors and consumers with guidance, monetisation of impacts can potentially help bridge market information gaps toward better incentive structures, possibly facilitating an effective market transformation in favour of sustainable production and consumption patterns.
APA, Harvard, Vancouver, ISO, and other styles
4

Nesbitt, Tess Alexandra. "Cost-sensitive tree-stacking learning with variable prediction error costs /." Diss., Restricted to subscribing institutions, 2010. http://proquest.umi.com/pqdweb?did=2026906691&sid=1&Fmt=2&clientId=1564&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Sandmann, F. G. "The true cost of epidemic and outbreak diseases in hospitals." Thesis, London School of Hygiene and Tropical Medicine (University of London), 2018. http://researchonline.lshtm.ac.uk/4648208/.

Full text
Abstract:
Background: Outbreaks of infectious diseases may result in bed pressures, which are often mitigated by delaying new admissions because beds are unavailable. This painfully illustrates to policy makers and the public what is meant by the economic notion of "opportunity cost": the value of the next-best alternative forgone, or, in this situation, the value of the beds for the displaced patients. These opportunity costs need to be captured adequately in economic analyses. Methods: Suitable approaches for estimating the opportunity costs of healthcare beds from the perspective of health-maximising decision makers were searched for in a literature review. The lack of adequate methods drove the development of a novel approach. Differences among approaches were explored using, as a case study, hospitalisations for norovirus-associated gastroenteritis. Its hospital burden was quantified nationally for England using statistical modelling. Afterwards, a stochastic mathematical model of hospital wards was built to explore the additional bed pressures on occupancy levels due to transmission-dynamic norovirus outbreaks. Results: Health-maximising decision makers should approximate the opportunity costs of healthcare beds by considering the net benefit of the second-best admissions forgone. This novel approach estimated a loss of 6,300 quality-adjusted life years (QALYs) annually in England and economic costs of £190−£298 million due to norovirus, roughly 2−3 times higher than the financial expenditures incurred of £108 million. During norovirus outbreaks, additional bed pressures arise 83.0% of the time, preventing a mean of 6.8 (range 0−44) new admissions that could have been admitted had there been no outbreak. Conclusions: Owing to market imperfections, the true value of healthcare beds differs from the value calculated using pragmatic conventions. In this thesis, these opportunity costs were estimated for the first time by explicitly including the wider health impact for other patients awaiting admission. The higher values obtained may affect the outcome of economic analyses.
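The valuation principle in the results — pricing a blocked bed at the net benefit of the second-best admission forgone — can be sketched in a few lines. All numbers below, including the £20,000-per-QALY threshold, are illustrative assumptions, not the thesis's estimates.

```python
THRESHOLD_GBP_PER_QALY = 20_000  # assumed decision-maker's threshold (illustrative)

def net_benefit(qaly_gain, treatment_cost):
    """Net monetary benefit of admitting one patient."""
    return qaly_gain * THRESHOLD_GBP_PER_QALY - treatment_cost

def outbreak_opportunity_cost(displaced_admissions):
    """Value of the beds lost: summed net benefit of the admissions forgone."""
    return sum(net_benefit(q, c) for q, c in displaced_admissions)

# (QALYs gained, treatment cost in GBP) per displaced admission (invented).
displaced = [(0.5, 3_000), (0.2, 1_500)]
print(outbreak_opportunity_cost(displaced))  # 9500.0
```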
APA, Harvard, Vancouver, ISO, and other styles
6

Empett, Brian Wilfred. "Towards a true cost of security for an electrical supply system." Thesis, Imperial College London, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.248403.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Kassim, M. E. "Elliptical cost-sensitive decision tree algorithm (ECSDT)." Thesis, University of Salford, 2018. http://usir.salford.ac.uk/47191/.

Full text
Abstract:
Cost-sensitive multiclass classification, in which the impact of the costs associated with different misclassification errors must be assessed, continues to be one of the major challenges for data mining and machine learning. Literature reviews in this area show that most of the cost-sensitive algorithms developed during the last decade were designed to solve binary classification problems, where an example from the dataset is classified into only one of two available classes. Much of the research on cost-sensitive learning has focused on inducing decision trees, which are one of the most common and widely used classification methods, owing to the simplicity of constructing them, their transparency and their comprehensibility. A review of the literature shows that inducing non-linear multiclass cost-sensitive decision trees is still in its early stages and that further research could improve on the current state of the art. Hence, this research addresses the following question: 'How can non-linear regions be identified for multiclass problems and utilized to construct decision trees so as to maximize the accuracy of classification and minimize misclassification costs?' It does so by developing a new algorithm, the Elliptical Cost-Sensitive Decision Tree algorithm (ECSDT), which induces cost-sensitive non-linear (elliptical) decision trees for multiclass classification problems using evolutionary optimization methods such as particle swarm optimization (PSO) and genetic algorithms (GAs). Ellipses are used as non-linear separators because of their simplicity and their flexibility in drawing non-linear boundaries by modifying and adjusting their size, location and rotation towards achieving optimal results. The new algorithm was developed, tested and evaluated in three settings, each with a different objective function: the first maximizes classification accuracy only; the second minimizes misclassification costs only; the third considers accuracy and misclassification cost together. ECSDT was applied to fourteen binary-class and multiclass datasets, and the results were compared with those obtained by applying common algorithms from Weka, such as J48, NBTree, MetaCost and the CostSensitiveClassifier, to the same datasets. The primary contribution of this research is a new algorithm that shows the benefits of utilizing elliptical boundaries for cost-sensitive decision tree learning. The new algorithm handles multiclass problems, and an empirical evaluation shows good results. More specifically, when considering accuracy only, ECSDT achieves higher accuracy on 10 of the 14 datasets; when considering misclassification costs only, it performs better on 10 of the 14 datasets; and when considering both together, it obtains higher accuracy on 10 of the 14 datasets and lower misclassification costs on 5 of the 14. ECSDT also produces smaller trees than J48, LADTree and ADTree.
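The core device of the abstract — an ellipse as a non-linear separator scored by misclassification cost and tuned by an evolutionary search — can be sketched as follows. Random search stands in for the PSO/GA optimizers, and the toy data and cost values are invented for illustration; this is not the thesis's actual implementation.

```python
import math
import random

def inside(ellipse, point):
    """True if point lies inside the (possibly rotated) ellipse."""
    cx, cy, a, b, theta = ellipse
    dx, dy = point[0] - cx, point[1] - cy
    u = dx * math.cos(theta) + dy * math.sin(theta)   # rotate into ellipse axes
    v = -dx * math.sin(theta) + dy * math.cos(theta)
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0

def total_cost(ellipse, data, cost_fp, cost_fn):
    """Misclassification cost when 'inside the ellipse' means class 1."""
    cost = 0.0
    for point, label in data:
        pred = 1 if inside(ellipse, point) else 0
        if pred == 1 and label == 0:
            cost += cost_fp
        elif pred == 0 and label == 1:
            cost += cost_fn
    return cost

def fit_ellipse(data, cost_fp, cost_fn, iters=2000, seed=0):
    """Random search over ellipse parameters (a stand-in for PSO/GA)."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        cand = (rng.uniform(-2, 2), rng.uniform(-2, 2),    # center
                rng.uniform(0.1, 3), rng.uniform(0.1, 3),  # semi-axes
                rng.uniform(0, math.pi))                   # rotation
        c = total_cost(cand, data, cost_fp, cost_fn)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost

# Class 1 clusters at the origin; class 0 lies far away (toy, separable data).
data = [((0.0, 0.0), 1), ((0.1, 0.1), 1), ((3.0, 3.0), 0), ((-3.0, 3.0), 0)]
ellipse, cost = fit_ellipse(data, cost_fp=1.0, cost_fn=5.0)
print(cost)  # 0.0 on this toy data
```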
APA, Harvard, Vancouver, ISO, and other styles
8

Foreman, James Sterling. "Reducing transportation costs and inventory shrinkage in the Washington State tree fruit industry." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/53050.

Full text
Abstract:
Thesis (M. Eng. in Logistics)--Massachusetts Institute of Technology, Engineering Systems Division, 2009.
Includes bibliographical references (leaves 91-95).
Perishability and stock-outs are two sources of inventory inefficiency in the Washington State tree fruit industry. This thesis measures the size of these inefficiencies in terms of dollars per box and describes five solutions, four qualitative and one quantitative, that seek to address them. To establish the magnitude of the inefficiencies, I regress various fruit characteristics on a set of sales data, thereby ascertaining the relationship between a fruit's price and its age. I find that the industry loses 5% to 12% of potential revenue due to perishability and propose four qualitative policies designed to reduce these losses. Next, I develop an operational management tool in the form of a mixed-integer optimization model which can be used to make optimal sourcing decisions during stock-out events. I find that the potential savings from improved sourcing decisions are between $0.01 and $0.02 per box. These results confirm that the costs and foregone revenue associated with inventory management are significant and merit the tree fruit industry's attention.
by James Sterling Foreman.
M.Eng. in Logistics
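The price-versus-age relationship at the heart of the revenue-loss estimate is, in its simplest form, a one-variable least-squares fit. The sketch below uses invented ages and prices, not the thesis's sales data.

```python
def ols(xs, ys):
    """Slope and intercept of the least-squares line y = intercept + slope * x."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return mean_y - slope * mean_x, slope

ages = [10, 20, 30, 40]            # days since harvest (hypothetical)
prices = [19.5, 19.0, 18.5, 18.0]  # dollars per box (hypothetical)
intercept, slope = ols(ages, prices)
print(round(intercept, 6), round(slope, 6))  # 20.0 -0.05
```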
APA, Harvard, Vancouver, ISO, and other styles
9

MARQUES, DANIEL DOS SANTOS. "A DECISION TREE LEARNER FOR COST-SENSITIVE BINARY CLASSIFICATION." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2016. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=28239@1.

Full text
Abstract:
PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO
CONSELHO NACIONAL DE DESENVOLVIMENTO CIENTÍFICO E TECNOLÓGICO
Classification problems have been widely studied in the machine learning literature, generating applications in several areas. However, in a number of scenarios, misclassification costs can vary substantially, which motivates the study of Cost-Sensitive Learning techniques. In the present work, we discuss the use of decision trees on the more general Example-Dependent Cost-Sensitive Problem (EDCSP), where misclassification costs vary with each example. One of the main advantages of decision trees is that they are easy to interpret, which is a highly desirable property in a number of applications. We propose a new attribute selection method for constructing decision trees for the EDCSP and discuss how it can be efficiently implemented. Finally, we compare our new method with two other decision tree algorithms recently proposed in the literature, in 3 publicly available datasets.
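As a rough illustration of example-dependent cost-sensitive splitting (the thesis's actual attribute-selection criterion differs; this is a generic minimum-cost split on invented toy data), each example carries its own cost for either prediction, and the chosen attribute is the one whose split yields the cheapest leaves:

```python
def leaf_cost(examples):
    """Cost of labelling a whole leaf with the cheaper of the two classes.

    Each example is (attribute_values, cost_if_predicted_0, cost_if_predicted_1):
    the two costs encode that particular example's misclassification penalty."""
    cost_0 = sum(c0 for _, c0, _ in examples)
    cost_1 = sum(c1 for _, _, c1 in examples)
    return min(cost_0, cost_1)

def split_cost(examples, attr):
    """Total leaf cost after splitting on a binary attribute."""
    left = [e for e in examples if e[0][attr] == 0]
    right = [e for e in examples if e[0][attr] == 1]
    return leaf_cost(left) + leaf_cost(right)

def best_attribute(examples, attrs):
    """Attribute whose split yields the cheapest pair of leaves."""
    return min(attrs, key=lambda a: split_cost(examples, a))

# Toy data: attribute 0 separates cheap errors from expensive ones; attribute 1 doesn't.
examples = [
    ({0: 0, 1: 0}, 0.0, 9.0),  # predicting class 1 here would cost 9
    ({0: 0, 1: 1}, 0.0, 4.0),
    ({0: 1, 1: 0}, 7.0, 0.0),  # predicting class 0 here would cost 7
    ({0: 1, 1: 1}, 2.0, 0.0),
]
print(best_attribute(examples, [0, 1]))  # 0
```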
APA, Harvard, Vancouver, ISO, and other styles
10

Roberts, Jhanneu. "The True Cost of Our Entertainment: An Inside Look to Modern Method Acting and its Consequences." Scholarship @ Claremont, 2016. http://scholarship.claremont.edu/cmc_theses/1322.

Full text
Abstract:
The goal of this thesis is to examine the physical and emotional costs associated with modern, personal interpretations of Strasberg-based method acting. Although many method actors have given excellent, award-winning performances, many were left with emotional and physical harm. In this thesis I argue that while some actors can cope with the after-effects, the risks associated with the Strasberg-based method often pose mental and physical health risks to the actor that outweigh the benefit they contribute to the production. To understand what Strasberg-based method acting is, I examine the practices of Stanislavski, the founder of the original "method," and of the teachers and actors Stella Adler and Sanford Meisner, and their approaches to creating a character. Strasberg, Adler, and Meisner, who are believed to have built their methods on the Stanislavski Acting System, disagreed in many respects about Stanislavski's method. What many now call method acting incorporates certain techniques created by Stanislavski that actors then use to create their own method.
APA, Harvard, Vancouver, ISO, and other styles
11

Lomax, S. E. "Cost-sensitive decision tree learning using a multi-armed bandit framework." Thesis, University of Salford, 2013. http://usir.salford.ac.uk/29308/.

Full text
Abstract:
Decision tree learning is one of the main methods of learning from data and has been applied to a variety of domains over the past three decades. In the real world, accuracy is not enough: there are costs involved, both in obtaining the data and when classification errors occur. A comprehensive survey of cost-sensitive decision tree learning has identified over 50 algorithms and developed a taxonomy that classifies them by the way in which cost is incorporated, and a recent comparison shows that many cost-sensitive algorithms can process balanced two-class datasets well but produce lower accuracy rates, in order to achieve lower costs, when the dataset is less balanced or has multiple classes. This thesis develops a new framework and algorithm built on the view that cost-sensitive decision tree learning involves a trade-off between costs and accuracy; decisions arising from these two viewpoints can often be incompatible, resulting in reduced accuracy rates. The new framework builds on a specific game-theory problem known as the multi-armed bandit, a scenario that requires both exploration and exploitation to solve: for example, a player in a casino has to decide which slot machine (bandit), from a selection of slot machines, is likely to pay out the most, and game theory solves this through a process of exploration and exploitation in which reward is maximized. This thesis adapts these concepts from the multi-armed bandit game by viewing the rewards as reductions in cost, using exploration and exploitation so that a compromise between decisions based on accuracy and decisions based on costs can be found. The algorithm employs the adapted multi-armed bandit game to select attributes during decision tree induction, using a look-ahead methodology to explore potential attributes and exploit those that maximize the reward. The new algorithm is evaluated on fifteen datasets and compared to six well-known algorithms: J48, EG2, MetaCost, AdaCostM1, ICET and ACT. The results show that the new multi-armed-bandit-based algorithm can produce more cost-effective trees without compromising accuracy. The thesis also includes a critical appraisal of the limitations of the developed algorithm and proposes avenues for further research.
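The exploration/exploitation loop described above can be sketched with a simple epsilon-greedy bandit, where each "arm" is a candidate attribute and the reward is the cost reduction its look-ahead split would yield. The policy, reward numbers and arm count below are illustrative stand-ins for the thesis's adapted bandit game.

```python
import random

def epsilon_greedy(reward_fn, n_arms, pulls=500, epsilon=0.1, seed=1):
    """Return the arm with the best estimated mean reward after `pulls` rounds."""
    rng = random.Random(seed)
    counts = [0] * n_arms
    means = [0.0] * n_arms
    for _ in range(pulls):
        if 0 in counts or rng.random() < epsilon:
            arm = rng.randrange(n_arms)      # explore
        else:
            arm = means.index(max(means))    # exploit
        reward = reward_fn(arm, rng)
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # running mean
    return means.index(max(means))

# Hypothetical attribute rewards: attribute 2 yields the largest cost reduction.
def noisy_cost_reduction(arm, rng):
    base = [0.2, 0.5, 0.9][arm]
    return base + rng.uniform(-0.1, 0.1)

print(epsilon_greedy(noisy_cost_reduction, 3))  # 2
```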
APA, Harvard, Vancouver, ISO, and other styles
12

Bergström, Joakim, and Hampus Nilsson-Sundén. "Cost effective optimization of system safety and reliability." Thesis, Linköpings universitet, Fysik och elektroteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-119950.

Full text
Abstract:
A method able to analyze and optimize subsystems could be useful for reducing project cost, increasing subsystem reliability, improving overall aircraft safety and reducing subsystem weight. The earlier in the design phase the optimization of an aircraft under development can be performed, the better its yield. This master thesis constructs an automatic analysis method, implemented as a Matlab script, that evaluates the devices forming aircraft subsystems using a genetic algorithm. With minor modifications to the script, the method is also compatible with systems from other industries.
APA, Harvard, Vancouver, ISO, and other styles
13

Stefanescu, Carla. "Cost-benefit theory of leaf lifespan using seedlings of tropical tree species." [Gainesville, Fla.] : University of Florida, 2006. http://purl.fcla.edu/fcla/etd/UFE0015764.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Holgén, Per. "Seedling performance, shelter tree increment and recreation values in boreal shelterwood stands /." Umeå : Swedish Univ. of Agricultural Sciences (Sveriges lantbruksuniv.), 1999. http://epsilon.slu.se/avh/1999/91-576-5854-4.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Murtha, Justin Fortna. "An Evidence Theoretic Approach to Design of Reliable Low-Cost UAVs." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/33762.

Full text
Abstract:
Small unmanned aerial vehicles (SUAVs) are plagued by alarmingly high failure rates. Because these systems are small and built at lower cost than full-scale aircraft, high-quality components and redundant systems are often eschewed to keep production costs low. This thesis proposes a process to "design in" reliability in a cost-effective way. Fault Tree Analysis is used to evaluate a system's (un)reliability, and Dempster-Shafer Theory (Evidence Theory) is used to deal with imprecise failure data. Three unique sensitivity analyses highlight the most cost-effective improvement for the system: spending money to research a component and reduce uncertainty, swapping a component for a higher-quality alternative, or adding redundancy to an existing component. A MATLAB® toolbox has been developed to assist in practical design applications. Finally, a case study illustrates the proposed methods by improving the reliability of a new SUAV design: Virginia Tech's SPAARO UAV.
Master of Science
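The pairing of Fault Tree Analysis with Evidence Theory reduces, at its simplest, to propagating [belief, plausibility] intervals instead of point probabilities through the tree's gates. The miniature tree and failure intervals below are invented, and independence of the basic events is assumed; this is a sketch of the idea, not the thesis's toolbox.

```python
def and_gate(intervals):
    """AND gate: the output fails only if every input fails (multiply bounds)."""
    lo = hi = 1.0
    for a, b in intervals:
        lo *= a
        hi *= b
    return (lo, hi)

def or_gate(intervals):
    """OR gate: any input failing suffices (complement, multiply, complement)."""
    lo = hi = 1.0
    for a, b in intervals:
        lo *= 1.0 - b
        hi *= 1.0 - a
    return (1.0 - hi, 1.0 - lo)

# Top event: the motor fails, OR both redundant batteries fail.
motor = (0.01, 0.03)    # [belief, plausibility] of motor failure (invented)
battery = (0.10, 0.20)  # same interval for each of two identical batteries
both_batteries = and_gate([battery, battery])  # ≈ (0.01, 0.04)
top = or_gate([motor, both_batteries])         # ≈ (0.0199, 0.0688)
print(top)
```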
APA, Harvard, Vancouver, ISO, and other styles
16

Anchukaitis, Kevin John. "A Stable Isotope Approach to Neotropical Cloud Forest Paleoclimatology." Diss., The University of Arizona, 2007. http://hdl.handle.net/10150/195637.

Full text
Abstract:
Many tropical trees do not form reliable annual growth rings, making it a challenge to develop tree-ring width chronologies for application to paleoclimatology in these regions. Here, I seek to establish high-resolution proxy climate records from trees without rings from the Monteverde Cloud Forest in Costa Rica using stable isotope dendroclimatology. Neotropical cloud forest ecosystems are associated with a relatively narrow range of geographic and hydroclimatic conditions, and are potentially sensitive to climate variability and change at time scales from annual to centennial and longer. My approach takes advantage of seasonal changes in the δ18O of water sources used by trees over a year, a signature that is imparted to the radial growth and provides the necessary chronological control. A rapid wood extraction technique is evaluated and found to produce cellulose with δ18O values indistinguishable from conventional approaches, although its application to radiocarbon requires a statistical correction. Analyses of plantation-grown Ocotea tenera reveal coherent annual δ18O cycles of up to 9 per mil. The width of these cycles corresponds to observed basal growth increments. Interannual variability in δ18O at this site is correlated with wet-season precipitation anomalies. At higher elevations within the orographic cloud bank, year-to-year changes in the amplitude of the oxygen isotope cycles show a relationship with dry-season climate. Longer δ18O chronologies from mature Pouteria (Sapotaceae) reveal that dry-season hydroclimatology is controlled at interannual time scales by variability in the eastern equatorial Pacific (ENSO) and the Western Hemisphere Warm Pool (WHWP), which are correlated with trade-wind strength and local air temperature. A change in the late 1960s toward enhanced annual δ18O amplitude may reflect low-frequency changes in the Atlantic and Pacific ocean-atmosphere system. This study establishes the basis for cloud forest isotope dendroclimatology and demonstrates that the local climate of neotropical cloud forests is sensitive to interannual and, perhaps, multidecadal changes in important large-scale modes of climate variability.
APA, Harvard, Vancouver, ISO, and other styles
17

Lindblom, Agnes. "Detaljplaneprocessens tidsåtgång: En djupanalys av två detaljplaner i tre kommuner." Thesis, KTH, Urbana och regionala studier, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188898.

Full text
Abstract:
The background of this master thesis stems from the housing shortage that exists in Sweden today and the ongoing public debate on increased housing construction and slow planning processes. In this study, an in-depth analysis of two detailed development plans in each of the three municipalities of Botkyrka, Norrköping and Västerås has been carried out, in order to shed light on which activities and components give rise to time consumption in the various stages of the planning process. The study, which is based on document analysis and interviews, concludes that the Swedish planning process, owing to its overall design and to being regulated by the Swedish Planning and Building Act (PBL), takes "time". How much time the process takes depends, according to the analysis, on several different factors that may have to do with the specific detailed development plan and/or the municipality. The result of the case studies shows that it is very difficult at an early stage to estimate how much time will be required to create a detailed development plan, because every plan is somewhat unique and unforeseeable components and activities usually emerge during the process. The time-cost of the planning process for the plans studied varied greatly, from 21 months to 7 years; on average, however, the process for all the plans was only actively pursued for 1.5 to 3 years. The reasons that the development of some of the plans was delayed or postponed include inexperienced developers' lack of knowledge of the planning process, political disagreements, belated demands for additional investigations, and shortage of staff. According to this study, conducting surveys and investigations is the most time-consuming activity within the planning process: many surveys need to be carried out, and each one takes time to complete. The study also shows that more time-efficient planning processes are desirable as long as quality can be maintained and the feasibility of the detailed development plans can be guaranteed. The housing shortage in the regions and the high pressure on municipalities to draw up plans and build homes are brought up as reasons why this is important. The improvements suggested in the study involve, among other things, establishing better forms of early collaboration between all parties involved, giving strategic planning greater weight and conducting more surveys at an early stage, creating clear municipality-wide goals and priorities for detailed development plans, and improving the municipality's systems and working practices. In addition, it is suggested that municipal resources be extended and that there be greater transparency about which decisions in the planning process need to be made politically and at what level, and whether there are decisions that can be delegated to the municipality's administrations.
APA, Harvard, Vancouver, ISO, and other styles
18

Keyder, Emil Ragip. "New Heuristics for Planning with Action Costs." Doctoral thesis, Universitat Pompeu Fabra, 2010. http://hdl.handle.net/10803/7570.

Full text
Abstract:
Classical planning is the problem of finding a sequence of actions that take an agent from an initial state to a desired goal situation, assuming deterministic outcomes for actions and perfect information. Satisficing planning seeks to quickly find low-cost solutions with no guarantees of optimality. The most effective approach for satisficing planning has proved to be heuristic search using non-admissible heuristics. In this thesis, we introduce several such heuristics that are able to take into account costs on actions, and therefore try to minimize the more general metric of cost, rather than length, of plans, and investigate their properties and performance. In addition, we show how the problem of planning with soft goals can be compiled into a classical planning problem with costs, a setting in which cost-sensitive heuristics such as those presented here are essential.
Classical planning is the problem of finding a sequence of actions that take an agent from an initial state to a goal, assuming deterministic outcomes and complete information. Satisficing planning seeks to find a low-cost solution, with no guarantees of optimality. Heuristic search guided by non-admissible heuristics is the approach that has proved most successful. This thesis presents several heuristics of this kind that consider costs on actions, and therefore find solutions that minimize cost rather than plan length. In addition, we show that the problem of planning with soft goals, or optional goals, can be reduced to a classical planning problem with costs on actions, a setting in which cost-sensitive heuristics, such as those presented here, are essential.
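The cost-sensitive heuristic search setting described in the abstract can be illustrated with a generic weighted best-first search, where inflating the heuristic with a weight w > 1 makes it non-admissible and trades optimality for speed. This is an illustrative sketch, not the thesis's actual heuristics; the graph, state names and weight are hypothetical.

```python
import heapq

def weighted_astar(start, goal, successors, heuristic, w=2.0):
    """Best-first search with priority f = g + w*h. With w > 1 the heuristic
    is inflated and therefore non-admissible: the search tends to return a
    low-cost plan quickly, with no optimality guarantee."""
    frontier = [(w * heuristic(start), 0.0, start, [start])]
    best_g = {start: 0.0}
    while frontier:
        _, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return path, g  # plan (sequence of states) and its cost
        for nxt, cost in successors(state):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(frontier,
                               (ng + w * heuristic(nxt), ng, nxt, path + [nxt]))
    return None, float("inf")
```

With h identically zero this degenerates to uniform-cost search; the interesting behaviour comes from cost-aware heuristics such as those the thesis develops.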
APA, Harvard, Vancouver, ISO, and other styles
19

Chen, Chao, and Yogesh Vishwas Bhamare. "Life Cycle Cost Analysis and Optimization of Wastewater Pumping System." Thesis, KTH, Hållbar utveckling, miljövetenskap och teknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-255866.

Full text
Abstract:
Different attempts have been made to facilitate the successful operation of Wastewater Pumping (WWP) systems. WWP units that already exist in different parts of the world have been studied to identify their successes, failures and the different parameters associated with suboptimal performance. The performance of a WWP system depends on several interdependent components, namely the pump, the hydraulics, the control system and the pump station, which must be carefully matched to achieve an efficient WWP system. Nowadays the scenario has changed: organizations have started looking increasingly at the total cost of ownership, another way of saying Life Cycle Cost Analysis (LCCA), and recognizing the need to get the most out of their equipment purchases. The master thesis includes a theory part which describes the different parameters associated with a WWP unit, focusing especially on Xylem's WWP systems. This thesis is an attempt to help companies understand how LCCA can be a productive management tool to minimize maintenance cost and maximize energy efficiency. The study reported in this thesis work has been conducted to shed light on the use of Life Cycle Cost Analysis in WWP systems. The current study tries to suggest and assess an adopted approach to ensure successful and efficient operation of WWP systems with lower energy demand and decreased maintenance cost. Initial cost, maintenance cost and energy cost are important issues in the operation of a WWP system since they are responsible for the total cost over time. Therefore, a description of each cost, the formulas necessary for LCC calculations, the data and survey structure, and the material and energy flows is given. This work also provides an extensive literature review; different survey and data collection techniques, analysis of collected data, statistical modelling, customer interaction by questionnaires and an interview with experts were used.
LCC calculations were used to support the design and selection of the most cost-efficient WWP system. The given thesis work is therefore an attempt to achieve better functional performance, improve existing design principles associated with WWP systems, contribute to assessing economic viability, and support decision making to enhance operational quality, in order to achieve an efficient and successful WWP system.
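The kind of LCC calculation the thesis describes, combining initial, energy and maintenance costs over time, can be sketched as a net-present-value sum. The cost figures, discount rate and service life below are hypothetical, not data from the study.

```python
def life_cycle_cost(initial, annual_energy, annual_maintenance, years, discount_rate):
    """Present value of owning a pumping system: initial (purchase and
    installation) cost plus discounted annual energy and maintenance costs
    over the assumed service life."""
    lcc = initial
    for year in range(1, years + 1):
        lcc += (annual_energy + annual_maintenance) / (1.0 + discount_rate) ** year
    return lcc

# Hypothetical comparison: a cheaper pump with higher running costs (A)
# versus a more expensive but more energy-efficient pump (B).
option_a = life_cycle_cost(initial=20000, annual_energy=5000,
                           annual_maintenance=1500, years=15, discount_rate=0.05)
option_b = life_cycle_cost(initial=28000, annual_energy=3500,
                           annual_maintenance=1000, years=15, discount_rate=0.05)
```

Under these illustrative numbers the higher purchase price of option B is more than offset by its lower running costs over the life cycle, which is exactly the kind of trade-off LCCA is meant to expose.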
APA, Harvard, Vancouver, ISO, and other styles
20

Jeppsson, Felix. "Kristus och de andra religionerna : En analys av andra religioners roll i kristen soteriologi hos tre kristna teologer." Thesis, Uppsala universitet, Teologiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-447393.

Full text
Abstract:
The purpose of this essay is to analyze views on the relationship between other religions and the Christian notion of salvation as they are presented in works by the theologians Gavin D'Costa, Kajsa Ahlstrand and Karl Rahner. To interpret their views, I use the fourfold model for theology of religions as it is presented by the theologian Jakob Wirén. Because the purpose of this essay is to analyze the relation between soteriology and other religions, I use the fourfold model solely to analyze the soteriological implications of their respective theologies. I have restricted my analysis to the three works Vägar – En öppen religionsdialog by Ahlstrand, The Meeting of Religions and the Trinity by D'Costa and Religiös Inklusivism by Rahner. These works represent different ways of framing theological content, where Ahlstrand's work tends to be descriptive while Rahner's and D'Costa's tend to be normative in their approach to the subject. The results of my analysis follow along these lines, as I conclude that Ahlstrand's position within the fourfold model cannot quite be determined. However, I conclude that her views are not to be placed within the exclusivism spectrum, even though her position cannot be specified further on the basis of her text. My analysis places both Rahner and D'Costa within inclusivism but also shows that they represent two very different forms of inclusivism. Rahner sees other religions as righteous traditions through which the one God may lead people to salvation, earning these people the title of anonymous Christians. He stresses, however, that this implies that these people need to be made fully aware that they are in fact Christians. D'Costa, on the other hand, points out that the Spirit acts freely outside the church, where it may lead non-Christians to salvation.
He stresses, however, that the church must not try to domesticate the work of the Spirit outside the church, as it has no a priori knowledge of the full implications of where the activity of the Spirit within other religions may lead. D'Costa and Rahner are therefore inclusivists in different ways, where Rahner presents a traditional form of Christian inclusivism while D'Costa presents a modern form of inclusivism. Keywords: Gavin D'Costa, Kajsa Ahlstrand, Karl Rahner, theology of religions, soteriology.
APA, Harvard, Vancouver, ISO, and other styles
21

Šilhavý, Miroslav. "Sledování paprsku pomocí k-D tree." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2010. http://www.nusl.cz/ntk/nusl-237163.

Full text
Abstract:
This thesis deals with ray tracing methods and their acceleration. It gives a partial study and review of algorithms, from the classical ray shooting algorithm through the recursive approach up to the distributed ray tracing algorithm. A significant part of this thesis is devoted to the BSP tree structure and its subclass, the k-D tree; it shows a simple algorithm for its construction and traversal. The rest of the thesis deals with k-D tree construction techniques, which are based on the right choice of the splitting plane inside every cell of the k-D tree. The techniques the thesis is based upon are the spatial median, the object median and a relatively new cost-model technique named SAH, otherwise known as the surface area heuristic. All three techniques are put through testing and performance comparison. In the conclusion the test results are reviewed, from which SAH emerges as the winner.
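The SAH cost model mentioned above can be sketched as follows: the expected cost of a candidate split is the traversal cost plus, for each child cell, the cost of intersecting its primitives weighted by the geometric probability (proportional to relative surface area) that a ray entering the parent reaches that child. The cost constants below are illustrative assumptions, not values from the thesis.

```python
def box_area(lo, hi):
    """Surface area of an axis-aligned box given its min and max corners."""
    dx, dy, dz = (hi[i] - lo[i] for i in range(3))
    return 2.0 * (dx * dy + dy * dz + dz * dx)

def sah_cost(n_left, n_right, area_left, area_right, area_parent,
             traverse_cost=1.0, intersect_cost=1.5):
    """Expected cost of a candidate split: one traversal step plus the cost
    of intersecting each child's primitives, weighted by the ratio of the
    child's surface area to the parent's."""
    p_left = area_left / area_parent
    p_right = area_right / area_parent
    return traverse_cost + intersect_cost * (p_left * n_left + p_right * n_right)
```

During construction, the splitting plane with the lowest sah_cost over all candidate positions (typically primitive boundaries) is chosen, and splitting stops when no split is cheaper than intersecting the cell's primitives directly.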
APA, Harvard, Vancouver, ISO, and other styles
22

Fadji, Sama Serena Dean. "What is the True Cost of Mass Polarization? : A Study of the Relationship Between Political Polarization and Trust in Political Institutions in the United States." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-79954.

Full text
Abstract:
Democracy is defined by the element of competition. Elite party competition has become one of the most discussed contemporary developments in the United States. Elected representatives from the main parties have become internally homogeneous, deepening the divide in ideologies between one another. This thesis seeks to establish the relationship between mass partisan polarization and the level of trust in political institutions across the United States. What happens when the public trusts the elites more than Congress? Elite polarization has divided the masses in the U.S. so deeply, by electing representatives from the two major parties who carry ideologies so distinct from one another, that the public begins to change its ways of forming opinions. This thesis acknowledges that there is high elite and mass political polarization in the U.S., which is attributed to the heterogeneity in ideologies across the three main political parties (Democrats, Republicans and Independents) and intra-party homogeneity. The elite partisan theoretical framework explains the relationship such that the public tends to hold a low level of trust towards the U.S. Congress because the majority of voters' partisan-motivated decision making is influenced by political endorsements. The implication is that the public is more likely to hold a considerable level of trust towards their political parties as opposed to the U.S. Congress.
APA, Harvard, Vancouver, ISO, and other styles
23

Pech, Christian. "Kleene-Type Results for Weighted Tree-Automata." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2004. http://nbn-resolving.de/urn:nbn:de:swb:14-1083663461937-32692.

Full text
Abstract:
The main result of this thesis is the generalization of the Kleene theorem to formal tree series over commutative semirings (the Kleene theorem states the coincidence between rational and recognizable formal languages). To this end weighted tree languages are introduced and the Kleene theorem is proved for them. The desired result for formal tree series is then obtained through application of a homomorphism that relates weighted tree languages with formal tree series. In the second part of the thesis the connections to the theory of iteration theories are explored. In particular, it is shown that the grove theory of formal tree series forms a partial iteration theory.
The main result of this work is the generalization of Kleene's theorem on the coincidence of rational and recognizable languages to the case of formal tree series over commutative semirings. To this end, weighted tree languages are introduced, since these behave similarly to classical tree languages. Kleene's theorem is therefore first generalized to the case of weighted tree languages. The desired result is then obtained by applying a homomorphism that assigns formal tree series to weighted tree languages. In the second part of the work, cross-connections to the theory of iteration theories are shown. In particular, it is shown that the grove theory of formal tree series forms a partial iteration theory.
APA, Harvard, Vancouver, ISO, and other styles
24

Pech, Christian. "Kleene-Type Results for Weighted Tree-Automata." Doctoral thesis, Technische Universität Dresden, 2003. https://tud.qucosa.de/id/qucosa%3A24335.

Full text
Abstract:
The main result of this thesis is the generalization of the Kleene theorem to formal tree series over commutative semirings (the Kleene theorem states the coincidence between rational and recognizable formal languages). To this end weighted tree languages are introduced and the Kleene theorem is proved for them. The desired result for formal tree series is then obtained through application of a homomorphism that relates weighted tree languages with formal tree series. In the second part of the thesis the connections to the theory of iteration theories are explored. In particular, it is shown that the grove theory of formal tree series forms a partial iteration theory.
The main result of this work is the generalization of Kleene's theorem on the coincidence of rational and recognizable languages to the case of formal tree series over commutative semirings. To this end, weighted tree languages are introduced, since these behave similarly to classical tree languages. Kleene's theorem is therefore first generalized to the case of weighted tree languages. The desired result is then obtained by applying a homomorphism that assigns formal tree series to weighted tree languages. In the second part of the work, cross-connections to the theory of iteration theories are shown. In particular, it is shown that the grove theory of formal tree series forms a partial iteration theory.
APA, Harvard, Vancouver, ISO, and other styles
25

Ramqvist, Louise, and Linn Johansson. "ANBUD ELLER SKAMBUD : Tre studier om ändringar och tilläggsarbeten som uppstår i samband med entreprenadupphandlingar till en statlig myndighet." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-63788.

Full text
Abstract:
Procurement is a central part of controlling government resources and how they are allocated. The purpose is to explain the contractual precision of complex construction contracts. To achieve this purpose, we study construction projects where procurements are developed. Study 1 answers a theoretical hypothesis that large projects create more alteration and additional work (ÄTA). The hypothesis is tested with register data from a public authority's internal project database. Study 2 seeks explanations for why large projects drive ÄTA work. Explanations come from interviews with project managers and contractors. Study 3 tests six different explanations, which we call post-hoc hypotheses (PHH), based on established concepts. The test is based on survey data for clients and contractors. The internal project database contained 486 infrastructure projects (Study 1). Based on the project database, four projects were selected for further analysis. A total of four clients and two contractors connected to the selected projects were interviewed (Study 2). A total of 208 respondents participated in the survey, of which 87 were clients, 116 contractors and 5 subcontractors, supplemented by two external contractors (Study 3). The result of Study 1 indicates that documentation, follow-up and reporting are deficient. This was confirmed by the interviews in Study 2. In Study 3 we found that planning is deficient in large projects, where the parties act misleadingly. The result shows that ÄTA work is driven by incomplete planning, person and leadership, and a conscious business model. Better documentation, follow-up and reporting create better contractual precision in procurements and contribute to meaningful processes and more accurate resource allocation.
Procurement is a key part of controlling government resources and how they are allocated. The purpose of this thesis is to explain the contractual precision of complex construction contracts. In order to achieve this objective, we study construction projects where the procurement is well developed. We pursue this objective based on three studies. Study 1 tests a theoretical hypothesis that large projects create more cost overruns/errors. The hypothesis is tested on a project database, internal register data from a public bureau. Study 2 seeks explanations as to why large projects drive cost overruns. Explanations come from interviews with project managers and contractors. Study 3 tests six different explanations we refer to as post-hoc hypotheses (PHH), based on established concepts. The test is based on survey data for clients and contractors. The project database contained 486 infrastructure projects (Study 1). Based on the project database, four projects were selected for further analysis. A total of four clients and two contractors were interviewed in connection with the selected projects (Study 2). A total of 208 respondents participated in the survey, of which 87 were clients, 116 contractors and 5 subcontractors, supplemented by two external contractors (Study 3). The result from Study 1 indicates that the documentation, follow-up and reporting are inadequate, and this was confirmed by much of the findings in Study 2. In Study 3 we found that the planning is inadequate in large projects and often misleading. The result shows that cost overruns/errors are driven by person and leadership, incomplete planning and a conscious business model. Through better documentation, follow-up and accounting, better contractual precision is created in procurement, contributing to meaningful processes and more accurate resource allocation.
APA, Harvard, Vancouver, ISO, and other styles
26

López, Juan Carlos Flores. "Exploring the potential of sound management of forest and tree resources on cattle farms located in tropical dry forest of Guanacaste, Costa Rica." Thesis, Bangor University, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432792.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Cheepweasarash, Piansiri, and Sarinthorn Pakapongpan. "A Feasibility Study of Setting-up New Production Line : Either Partly Outsource a process or Fully Produce In-House." Thesis, Mälardalen University, Department of Innovation, Design and Product Development, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-746.

Full text
Abstract:

This paper presents a feasibility study of setting up a new potting tray production line based on two alternatives: partly outsourcing one process in the production line or making all processes in-house. Both qualitative and quantitative approaches have been exploited to analyze and compare the make-or-buy decision. The nature of business in Thailand, particularly for SMEs, is also presented, as it has certain characteristics that influence business practices and decisions, especially in supply chain management. The literature relating to forecasting techniques, outsourcing decision frameworks, inventory management and investment analysis has been reviewed and applied to the empirical findings. As this production line is not yet in place, monthly sales volumes are forecasted within a five-year time frame. Based on the forecasted sales volume, simulations are implemented to model the probability distribution and project the demand required for each month. The projected demand is used as a baseline to determine the required safety stock of materials, inventory cost, time between production runs and resource utilization for each option. Finally, in the quantitative analysis, the five-year forecasted sales volume is used as a framework and several decision-making techniques such as break-even analysis, cash flow analysis and decision trees are employed to produce results in financial terms.
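The break-even side of such a make-or-buy comparison can be sketched as follows: in-house production incurs a fixed investment but a lower unit cost, outsourcing avoids the fixed cost at a higher unit cost, and the indifference volume is where the two total-cost lines cross. All figures below are hypothetical, not the study's data.

```python
def indifference_volume(fixed_cost, outsource_unit_cost, in_house_unit_cost):
    """Break-even volume at which in-house production (fixed investment plus
    lower unit cost) costs the same in total as outsourcing (no fixed cost,
    higher unit cost)."""
    return fixed_cost / (outsource_unit_cost - in_house_unit_cost)

def preferred_option(volume, fixed_cost, in_house_unit_cost, outsource_unit_cost):
    """Compare the total cost of the two alternatives at a given volume."""
    make = fixed_cost + in_house_unit_cost * volume
    buy = outsource_unit_cost * volume
    return "make" if make < buy else "buy"
```

Below the indifference volume outsourcing is cheaper; above it the in-house investment pays off, which is why the forecasted five-year sales volume drives the decision.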

APA, Harvard, Vancouver, ISO, and other styles
28

Zhu, Weiqi, and ycqq929@gmail com. "An Investigation into Reliability Based Methods to Include Risk of Failure in Life Cycle Cost Analysis of Reinforced Concrete Bridge Rehabilitation." RMIT University. Civil, Environmental and Chemical Engineering, 2008. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20080822.140447.

Full text
Abstract:
Reliability-based life cycle cost analysis is becoming an important consideration for decision-making in relation to bridge design, maintenance and rehabilitation. An optimal solution should ensure reliability during the service life while minimizing the life cycle cost. Risk of failure is an important component of whole-of-life cycle cost for both new and existing structures. The research work presented here aimed to develop a methodology for evaluating the risk of failure of reinforced concrete bridges to assist in decision making on rehabilitation. The methodology proposed here combines fault tree analysis and probabilistic time-dependent reliability analysis to achieve qualitative and quantitative assessment of the risk of failure. Various uncertainties are considered, including the degradation of resistance due to the initiation of a particular distress mechanism, increasing load effects, changes in resistance as a result of rehabilitation, environmental variables, material properties and model errors. It was shown that the proposed methodology provides users with two alternative approaches for qualitative or quantitative assessment of the risk of failure, depending on the availability of detailed data. This work will assist the managers of bridge infrastructure in making decisions in relation to the optimization of rehabilitation options for aging bridges.
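The fault tree part of such a methodology combines basic-event probabilities through AND/OR gates up to a top event. A minimal sketch, assuming independent basic events and illustrative probabilities (not values from the thesis):

```python
def gate_and(probs):
    """AND gate: the output event occurs only if all (independent) input
    events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(probs):
    """OR gate: the output event occurs if any (independent) input event
    occurs."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical tree: failure if (corrosion-degraded resistance AND overload)
# OR foundation scour, with illustrative annual probabilities.
p_top = gate_or([gate_and([0.10, 0.30]), 0.02])
```

In the time-dependent setting, the basic-event probabilities themselves would be functions of time (for example, resistance degrading after corrosion initiation), recomputed for each year of the service life.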
APA, Harvard, Vancouver, ISO, and other styles
29

Badraghi, Naghimeh. "Productivity, Cost and Environmental Damage of Four Logging Methods in Forestry of Northern Iran." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-145790.

Full text
Abstract:
Increasing productivity, reducing cost, reducing soil damage, and reducing the impact of harvesting on standing trees and regeneration are all very important objectives for the ground skidding system in the management of the Hyrcanian forest. The research carried out to pursue these objectives covered four logging methods in northern Iran: the tree length method (TLM), the long length method (LLM), the short length method (SLM), and wood extraction by mule (mule). In order to determine the cost per unit, time study techniques were used for each harvesting method, and the time study data were transformed to base-10 logarithms. On the basis of the developed models, 11 skidding turns were simulated and the unit costs estimated depending on the diameter of the log (DL), the skidding distance (SD), and the winching distance (WD) for 11 different cycles with TLM, LLM and SLM. The results showed that on average the net costs of extracting one cubic meter of wood were 3.06, 5.69, 6.81 and 34.36 €/m3 for TLM, LLM, SLM and mule. The costs as a function of DL, SD and WD showed that the most economical alternative for northern Iran is TLM. In the cut-to-length system, the costs of both alternatives LLM and SLM depended significantly on DL; the result of this study therefore suggests that as long as the diameter of the felled trees is less than 40 cm, the cut-to-length system is not an economical alternative, while it can be applied for trees with a diameter of more than 40 cm. Where diameters are more than 40 cm, TLM is more economical than SLM, although the difference was not significant. Depending on SD, for short skidding distances SLM is preferable to LLM, but for long skidding distances LLM is more economical than SLM. Winching distance had no significant effect on cost.
To assess the damage to seedlings and standing trees, a 100% inventory method was employed pre-hauling and post-hauling alongside the skidding trails, winching strips and mule hauling paths, with a 12 m width. To choose the best alternative with respect to stand damage, the analysis of multiple criteria approval (MA) was applied. The share of trees damaged by the winching operation was 11.89% in TLM, 14.44% in LLM, 27.59% in SLM and 0 stems by mule, and by the skidding operation 16.73%, 3.13% and 8.78% of total trees in TLM, LLM and SLM. In the winching area about 14%, 20%, 21% and 6% of the total regeneration was damaged by TLM, LLM, SLM and mule, and the skidding operation damaged 7.5% in TLM, 7.4% in LLM and 9.4% in SLM. The alternative friendliest to the residual stand was the mule, but among the skidder-based methods (where the wood extraction is done by skidder) MA showed that the best alternative with respect to residual damage is LLM. To determine the degree of soil compaction, a core sampling technique for bulk density was used. Soil samples were collected from the horizontal face of a soil pit with a 10 cm deep soil core at 50 m intervals on skid trails and in winching strips and control areas (no vehicle passes); a soil sample was taken at 10 m intervals in the hauling direction of the mule. In order to determine the post-harvesting extent of disturbance caused on the skid trails by skidding operations, the disturbed widths were measured at 50 m intervals along the skid trails. In the winching area, where the winched logs created a streak of displaced soil, the width of the displaced streak was measured at 5 m intervals along the winching strip. In mule hauling operations the width of the streak created by the mule foot track was measured at 10 m intervals. To compare the increase in average bulk density between alternatives, one-way ANOVA, the Duncan test and the Dunnett t-test with a 95% confidence level were used. A general linear model was applied to relate the increase in bulk density to the slope gradient.
To determine the correlation between the increase in soil bulk density and the slope gradient, and the correlation between soil compaction and soil moisture content (%), the Pearson correlation test was applied. To choose the best alternative among the skidder-based methods, an MA test was applied again. The bulk density on the skidding trail increased by 51% for 30 skidding turns, 35% for 31 skidding turns (one unloaded and one loaded pass) and 46% for 41 skidding turns. The results of ANOVA (p < 0.05) show significant differences in bulk density between alternatives. The Duncan test and the Dunnett t-test indicated that the increase in soil bulk density was not significant between the control samples and the samples from the TLM winching strip and from extraction by mule. The general linear modelling and Pearson correlation test results indicated that the slope gradient had an insignificant effect on soil compaction, while the Pearson test indicates a medium negative correlation between soil compaction and the percentage of soil moisture. The ground-based winching operation disturbed and compacted 0.07%, 0.03%, 0.05% and 0.002% of the total area, and the ground-based skidding operation 1.21%, 1.67%, 0.81% and 0.00% of the total area, in TLM, LLM, SLM and mule respectively. The Pearson correlation results show that the width of the disturbed area was significantly influenced by the diameter and length of the logs (p < 0.05), but there is no significant correlation between soil disturbance width and slope. The results of the MA analysis showed that soil compaction was not related to logging method, but the sensitivity analysis of MA shows that LLM and TLM are both preferable to SLM.
APA, Harvard, Vancouver, ISO, and other styles
30

Lunday, Brian Joseph. "Resource Allocation on Networks: Nested Event Tree Optimization, Network Interdiction, and Game Theoretic Methods." Diss., Virginia Tech, 2010. http://hdl.handle.net/10919/77323.

Full text
Abstract:
This dissertation addresses five fundamental resource allocation problems on networks, all of which have applications to support Homeland Security or industry challenges. In the first application, we model and solve the strategic problem of minimizing the expected loss inflicted by a hostile terrorist organization. An appropriate allocation of certain capability-related, intent-related, vulnerability-related, and consequence-related resources is used to reduce the probabilities of success in the respective attack-related actions, and to ameliorate losses in case of a successful attack. Given the disparate nature of prioritizing capital and material investments by federal, state, local, and private agencies to combat terrorism, our model and accompanying solution procedure represent an innovative, comprehensive, and quantitative approach to coordinate resource allocations from various agencies across the breadth of domains that deal with preventing attacks and mitigating their consequences. Adopting a nested event tree optimization framework, we present a novel formulation for the problem as a specially structured nonconvex factorable program, and develop two branch-and-bound schemes based respectively on utilizing a convex nonlinear relaxation and a linear outer-approximation, both of which are proven to converge to a global optimal solution. We also investigate a fundamental special-case variant for each of these schemes, and design an alternative direct mixed-integer programming model representation for this scenario. Several range reduction, partitioning, and branching strategies are proposed, and extensive computational results are presented to study the efficacy of different compositions of these algorithmic ingredients, including comparisons with the commercial software BARON. 
The developed set of algorithmic implementation strategies and enhancements are shown to outperform BARON over a set of simulated test instances, where the best proposed methodology produces an average optimality gap of 0.35% (compared to 4.29% for BARON) and reduces the required computational effort by a factor of 33. A sensitivity analysis is also conducted to explore the effect of certain key model parameters, whereupon we demonstrate that the prescribed algorithm can attain significantly tighter optimality gaps with only a near-linear corresponding increase in computational effort. In addition to enabling effective comprehensive resource allocations, this research permits coordinating agencies to conduct quantitative what-if studies on the impact of alternative resourcing priorities. The second application is motivated by the author's experience with the U.S. Army during a tour in Iraq, during which combined operations involving U.S. Army, Iraqi Army, and Iraqi Police forces sought to interdict the transport of selected materials used for the manufacture of specialized types of Improvised Explosive Devices, as well as to interdict the distribution of assembled devices to operatives in the field. In this application, we model and solve the problem of minimizing the maximum flow through a network from a given source node to a terminus node, integrating different forms of superadditive synergy with respect to the effect of resources applied to the arcs in the network. Herein, the superadditive synergy reflects the additional effectiveness of forces conducting combined operations, vis-à-vis unilateral efforts. We examine linear, concave, and general nonconcave superadditive synergistic relationships between resources, and accordingly develop and test effective solution procedures for the underlying nonlinear programs. 
For the linear case, we formulate an alternative model representation via Fourier-Motzkin elimination that reduces average computational effort by over 40% on a set of randomly generated test instances. This test is followed by extensive analyses of instance parameters to determine their effect on the levels of synergy attained using different specified metrics. For the case of concave synergy relationships, which yields a convex program, we design an inner-linearization procedure that attains solutions on average within 3% of optimality with a reduction in computational effort by a factor of 18 in comparison with the commercial codes SBB and BARON for small- and medium-sized problems; and outperforms these solvers on large-sized problems, where both failed to attain an optimal solution (and often failed to detect a feasible solution) within 1800 CPU seconds. Examining a general nonlinear synergy relationship, we develop solution methods based on outer-linearizations, inner-linearizations, and mixed-integer approximations, and compare these against the commercial software BARON. Considering increased granularities for the outer-linearization and mixed-integer approximations, as well as different implementation variants for both these approaches, we conduct extensive computational experiments to reveal that, whereas both these techniques perform comparably with respect to BARON on small-sized problems, they significantly improve upon its performance for medium- and large-sized problems. Our best-performing procedure reduces the computational effort by a factor of 461 for the subset of test problems for which the commercial global optimization software BARON could identify a feasible solution, while also achieving solutions of objective value 0.20% better than BARON.
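The quantity being minimized in this interdiction setting is the maximum source-to-terminus flow of the network after resources have reduced arc capacities. As a point of reference, that inner maximum-flow computation can be sketched with a standard Edmonds-Karp algorithm; this is a generic illustration, not the dissertation's solution procedure, and the capacity matrix in the usage is hypothetical.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly augment along shortest (fewest-arc) paths in
    the residual network until the sink is unreachable from the source."""
    n = len(capacity)
    residual = [row[:] for row in capacity]
    flow = 0
    while True:
        # breadth-first search for an augmenting path
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v in range(n):
                if v not in parent and residual[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # bottleneck capacity along the path found
        bottleneck, v = float("inf"), sink
        while parent[v] is not None:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        # push flow and update residual capacities (including reverse arcs)
        v = sink
        while parent[v] is not None:
            u = parent[v]
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
            v = u
        flow += bottleneck
```

An interdictor's problem is then the outer minimization: allocate limited resources to arcs (here, reductions of entries in the capacity matrix) so that this value is as small as possible.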
The third application is likewise motivated by the author's military experience in Iraq, both from several instances in which coalition forces attempted to interdict the transport of a kidnapping victim by a sectarian militia and, from the opposite perspective, from instances involving coalition forces transporting detainees between internment facilities. For this application, we examine the network interdiction problem of minimizing the maximum probability of evasion by an entity traversing a network from a given source to a designated terminus, while incorporating novel forms of superadditive synergy between resources applied to arcs in the network. Our formulations examine either linear or concave (nonlinear) synergy relationships. Conformant with military strategies that frequently involve a combination of overt and covert operations to achieve an operational objective, we also propose an alternative model for sequential overt and covert deployment of subsets of interdiction resources, and conduct theoretical as well as empirical comparative analyses between models for purely overt (with or without synergy) and composite overt-covert strategies to provide insights into absolute and relative threshold criteria for recommended resource utilization. In contrast to existing static models, in a fourth application, we present a novel dynamic network interdiction model that improves realism by accounting for interactions between an interdictor deploying resources on arcs in a digraph and an evader traversing the network from a designated source to a known terminus, wherein the agents may modify strategies in selected subsequent periods according to respective decision and implementation cycles.
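The evasion-interdiction game of the third application can be made concrete with a brute-force toy (the graph, probabilities, and the halving effect of a resource are all invented for illustration, and no synergy is modeled): the interdictor places a small budget of resources on arcs to minimize the evader's best path probability of evasion.

```python
from itertools import combinations_with_replacement

def best_evasion(arcs, probs, s, t):
    """Maximum evasion probability from s to t; probabilities multiply along a path."""
    best = {s: 1.0}
    for _ in arcs:  # Bellman-Ford-style passes; enough for small examples
        for (u, v) in arcs:
            if u in best and best[u] * probs[(u, v)] > best.get(v, 0.0):
                best[v] = best[u] * probs[(u, v)]
    return best.get(t, 0.0)

def interdict(arcs, base, s, t, budget, factor=0.5):
    """Enumerate every placement of `budget` identical resources; each resource
    on an arc multiplies its evasion probability by `factor` (no synergy here)."""
    best_val = None
    for placement in combinations_with_replacement(arcs, budget):
        probs = dict(base)
        for a in placement:
            probs[a] *= factor
        val = best_evasion(arcs, probs, s, t)
        if best_val is None or val < best_val:
            best_val = val
    return best_val

arcs = [("s", "a"), ("a", "t"), ("s", "t")]
base = {("s", "a"): 0.9, ("a", "t"): 0.9, ("s", "t"): 0.7}
v0 = best_evasion(arcs, base, "s", "t")         # evader's best path: s-a-t, 0.81
v2 = interdict(arcs, base, "s", "t", budget=2)  # best play covers both paths: 0.405
```

With two resources, splitting them across the two source-terminus paths beats doubling up on either one, which is the min-max logic the dissertation's models formalize.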
We further enhance the realism of our model by considering a multi-component objective function, wherein the interdictor seeks to minimize the maximum value of a regret function that consists of the evader's net flow from the source to the terminus; the interdictor's procurement, deployment, and redeployment costs; and penalties incurred by the evader for misperceptions as to the interdicted state of the network. For the resulting minimax model, we use duality to develop a reformulation that facilitates a direct solution procedure using the commercial software BARON, and examine certain related stability and convergence issues. We demonstrate cases for convergence to a stable equilibrium of strategies for problem structures having a unique solution to minimize the maximum evader flow, as well as convergence to a region of bounded oscillation for structures yielding alternative interdictor strategies that minimize the maximum evader flow. We also provide insights into the computational performance of BARON for these two problem structures, yielding useful guidelines for other research involving similar non-convex optimization problems. For the fifth application, we examine the problem of apportioning railcars to car manufacturers and railroads participating in a pooling agreement for shipping automobiles, given a dynamically determined total fleet size. This study is motivated by the existence of such a consortium of automobile manufacturers and railroads, for which the collaborative fleet sizing and efforts to equitably allocate railcars amongst the participants are currently orchestrated by the TTX Company in Chicago, Illinois. In our study, we first demonstrate potential inequities in the industry standard resulting either from failing to address disconnected transportation network components separately, or from utilizing the current manufacturer allocation technique that is based on average nodal empty transit time estimates.
We next propose and illustrate four alternative schemes to apportion railcars to manufacturers, respectively based on total transit time that accounts for queuing; two marginal cost-induced methods; and a Shapley value approach. We also provide a game-theoretic insight into the existing procedure for apportioning railcars to railroads, and develop an alternative railroad allocation scheme based on capital plus operating costs. Extensive computational results are presented for the ten combinations of current and proposed allocation techniques for automobile manufacturers and railroads, using realistic instances derived from representative data of the current business environment. We conclude with recommendations for adopting an appropriate apportionment methodology for implementation by the industry.
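The Shapley value approach mentioned for the fifth application can be sketched for a two-player toy consortium (the coalition values are invented; the dissertation uses realistic industry instances): each participant's allocation is its marginal contribution averaged over all join orders.

```python
from itertools import permutations
from math import factorial

def shapley(players, value):
    """Average each player's marginal contribution over all join orders."""
    n = len(players)
    alloc = dict.fromkeys(players, 0.0)
    for order in permutations(players):
        joined = frozenset()
        for p in order:
            alloc[p] += value(joined | {p}) - value(joined)
            joined = joined | {p}
    return {p: v / factorial(n) for p, v in alloc.items()}

# Hypothetical coalition values: railcars needed when manufacturers pool fleets,
# with a pooling benefit for the grand coalition (36 < 10 + 20 + 6 "synergy").
v = {frozenset(): 0, frozenset("A"): 10, frozenset("B"): 20, frozenset("AB"): 36}
alloc = shapley(["A", "B"], lambda s: v[frozenset(s)])   # {'A': 13.0, 'B': 23.0}
```

The allocation splits the joint value 36 so that each player's share reflects its average marginal contribution, which is the fairness notion behind a Shapley-based apportionment scheme.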
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
31

Li, Wanbo. "Rapid and low-cost mass fabrication of true three-dimensional hierarchical structures with dynamic soft molding and its application in affordable and scalable production of robust and durable whole-teflon superhydrophobic coating." HKBU Institutional Repository, 2019. https://repository.hkbu.edu.hk/etd_oa/611.

Abstract:
Superhydrophobic (SH) surfaces on the skins of living organisms give them advantages such as self-cleaning, anti-bacterial protection, water harvesting, and directional liquid transport that help them survive in harsh environments. Bioinspired SH surfaces have since been developed for many emerging functions, including self-cleaning, anti-bacterial protection, water harvesting, anti-icing, anti-corrosion, oil-water separation, and many others. However, real-world deployment of SH coatings is still in its infancy, owing to (i) poor performance in harsh real-world environments and industrial processes, where multi-level robustness (mechanical, chemical, and thermal, together with strong adhesion to substrates) is strictly required; and (ii) the lack of a technology for facile mass production. Since any non-perfluorinated component in an SH coating formulation inevitably creates points vulnerable to external attack, and since the functional applications of SH coatings require controlled surface topography, we here propose an SH coating made entirely of perfluorinated materials (referred to as Teflon). To achieve this goal, we developed a complete strategy involving material, fabrication, and applications. Firstly, we developed a feasible dynamic soft-molding method for the fabrication of three-dimensional (3D) structures. This method paves the way not only to the fabrication of whole-Teflon SH coatings but also to the practical adoption of many other important technologies based on 3D structures. Secondly, we generated whole-Teflon, multi-resistant SH coatings using this method and attached them tightly to different substrates, with adhesion strength surpassing that of conventional work.
Thirdly, we performed a proof-of-concept demonstration of a roll-to-roll (R2R) hot-molding process, which has the potential to translate lab-scale, plate-to-plate fabrication into industrial mass production. Finally, some fundamental mechanisms and open problems of the multifunctional applications in self-cleaning, anti-bacterial fouling, and anti-icing are studied. The outcomes are expected to provide insight into multifunctional SH coatings and to move SH coatings toward real-world application.
32

Sciauveau, Marion. "Asymptotiques de fonctionnelles d'arbres aléatoires et de graphes denses aléatoires." Thesis, Paris Est, 2018. http://www.theses.fr/2018PESC1127/document.

Abstract:
L'objectif de cette thèse est l'étude des approximations et des vitesses de convergence pour des fonctionnelles de grands graphes discrets vers leurs limites continues. Nous envisageons deux cas de graphes discrets: des arbres (i.e. des graphes connexes et sans cycles) et des graphes finis, simples et denses. Dans le premier cas, on considère des fonctionnelles additives sur deux modèles d'arbres aléatoires: le modèle de Catalan sur les arbres binaires (où un arbre est choisi avec probabilité uniforme sur l'ensemble des arbres binaires complets ayant un nombre de nœuds donné) et les arbres simplement générés (et plus particulièrement les arbres de Galton-Watson conditionnés par leur nombre de nœuds).Les résultats asymptotiques reposent sur les limites d'échelle d'arbres de Galton-Watson conditionnés. En effet, lorsque la loi de reproduction est critique et de variance finie (ce qui est le cas des arbres binaires de Catalan), les arbres de Galton-Watson conditionnés à avoir un grand nombre de nœuds convergent vers l'arbre brownien continu qui est un arbre réel continu qui peut être codé par l'excursion brownienne normalisée. Par ailleurs, les arbres binaires sous le modèle de Catalan peuvent être construits comme des sous arbres de l'arbre brownien continu. Ce plongement permet d'obtenir des convergences presque-sûres de fonctionnelles. Plus généralement, lorsque la loi de reproduction est critique et appartient au domaine d'attraction d'une loi stable, les arbres de Galton-Watson conditionnés à avoir un grand nombre de nœuds convergent vers des arbres de Lévy stables, ce qui permet d'obtenir le comportement asymptotique des fonctionnelles additives pour certains arbres simplement générés. Dans le second cas, on s'intéresse à la convergence de la fonction de répartition empirique des degrés ainsi qu'aux densités d'homomorphismes de suites de graphes finis, simples et denses. 
Une suite de graphes finis, simples, denses converge si la suite réelle des densités d'homomorphismes associées converge pour tout graphe fini simple. La limite d'une telle suite de graphes peut être décrite par une fonction symétrique mesurable appelée graphon. Etant donné un graphon, on peut construire par échantillonnage, une suite de graphes qui converge vers ce graphon. Nous avons étudié le comportement asymptotique de la fonction de répartition empirique des degrés et de mesures aléatoires construites à partir des densités d'homomorphismes associées à cette suite particulière de graphes denses
The aim of this thesis is the study of approximations and rates of convergence for functionals of large discrete graphs towards their limits. We consider two cases of discrete graphs: trees (i.e. connected graphs without cycles) and dense simple finite graphs. In the first case, we consider additive functionals for two models of random trees: the Catalan model for binary trees (where a tree is chosen uniformly at random from the set of full binary trees with a given number of nodes) and the simply generated trees (more particularly, Galton-Watson trees conditioned by their number of nodes). Asymptotic results are based on scaling limits of conditioned Galton-Watson trees. Indeed, when the offspring distribution is critical and has finite variance (which is the case for Catalan binary trees), Galton-Watson trees conditioned to have a large number of nodes converge towards the Brownian continuum tree, a real tree that can be coded by the normalized Brownian excursion. Furthermore, binary trees under the Catalan model can be built as sub-trees of the Brownian continuum tree. This embedding makes it possible to obtain almost sure convergence of functionals. More generally, when the offspring distribution is critical and belongs to the domain of attraction of a stable distribution, Galton-Watson trees conditioned to have a large number of nodes converge to stable Lévy trees, which gives the asymptotic behaviour of additive functionals for some simply generated trees. In the second case, we are interested in the convergence of the empirical cumulative distribution of degrees and of the homomorphism densities of sequences of dense simple finite graphs. A sequence of dense simple finite graphs converges if the real sequence of associated homomorphism densities converges for every simple finite graph.
The limit of such a sequence of dense graphs can be described by a symmetric measurable function called a graphon. Given a graphon, we can construct by sampling a sequence of graphs which converges towards this graphon. We have studied the asymptotic behaviour of the empirical cumulative distribution of degrees and of random measures built from the homomorphism densities associated with this particular sequence of dense graphs.
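The sampling construction described above can be sketched in a few lines (the product graphon and all parameters are illustrative choices): draw a W-random graph and check that its empirical edge density approaches the graphon's edge homomorphism density.

```python
import random

def sample_graphon(w, n, seed=0):
    """W-random graph: draw latent U_i ~ Uniform(0,1), connect i~j w.p. W(U_i, U_j)."""
    rng = random.Random(seed)
    u = [rng.random() for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < w(u[i], u[j]):
                edges.add((i, j))
    return n, edges

def edge_density(n, edges):
    # Homomorphism density of a single edge, t(K2, G) = 2|E| / n^2.
    return 2 * len(edges) / (n * n)

w = lambda x, y: x * y            # a simple product graphon; its edge density is 1/4
n, edges = sample_graphon(w, 400)
d = edge_density(n, edges)        # close to 0.25 for large n
```

The empirical degree CDF of the sampled graph likewise converges to the degree distribution induced by W, which is the kind of convergence (with rates) the thesis studies.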
33

Caralp, Mathieu. "Problèmes de bornes pour les automates et les transducteurs à pile visible." Thesis, Aix-Marseille, 2015. http://www.theses.fr/2015AIXM4118/document.

Abstract:
L’étude des automates est un sujet fondamental de l’informatique. Ce modèle apporte des solutions pratiques à divers problèmes en compilation et en vérification notamment. Dans ce travail nous proposons l'extension aux automates à pile visible de résultats existants pour les automates. Nous proposons une définition d'automate à pile visible émondé et donnons un algorithme s’exécutant en temps polynomial émondant un automate en préservant son langage. Nous donnons aussi un algorithme de complexité exponentielle qui, pour un automate à pile visible donné, construit un automate équivalent à la fois émondé et déterministe. Cette complexité exponentielle se révèle optimale. Étant donné un automate à pile visible, nous pouvons associer à ses transitions des coûts pris dans un semi-anneau S. L’automate associe ainsi un mot d’entrée à un élément de S. Le coût d’un automate est le supremum des coûts associés aux mots d'entrée. Pour les semi-anneaux des entiers naturels et Max-plus, nous donnons des caractérisations et des algorithmes polynomiaux pour décider si le coût d’un automate est fini. Puis, nous étudions pour les entiers naturels la complexité du problème de la majoration du coût par un entier k. Les transducteurs à pile visibles produisent des sorties sur chaque mot accepté. Un problème classique est de décider s'il existe une borne sur le nombre de sorties de chaque mot accepté. Pour une sous-classe des transducteurs à pile visible, nous proposons des propriétés caractérisant les instances positives de ce problème. Nous montrons leur nécessité et discutons d’approches possibles afin de montrer leur suffisance
The study of automata is a central subject of computer science. This model provides practical solutions to several problems, including compilation and verification. In this work we extend existing results on automata to visibly pushdown automata. We give a definition of trimmed visibly pushdown automata and a polynomial-time algorithm to trim an automaton while preserving its language. We also provide an exponential-time algorithm which, given a visibly pushdown automaton, produces an equivalent automaton that is both deterministic and trimmed. We prove the optimality of this complexity. Given a visibly pushdown automaton, we can equip its transitions with costs taken from a semiring S, and thus associate each input word with an element of S. The cost of the automaton is the supremum of the costs of the input words. For the semirings of natural integers and Max-plus, we give characterisations and polynomial-time algorithms to decide whether the cost of a visibly pushdown automaton is finite. Then, in the case of natural integers, we study the complexity of deciding whether the cost is bounded by a given integer k. Visibly pushdown transducers produce outputs on each accepted word. A classical problem is to decide whether there exists a bound on the number of outputs of each accepted word. For a subclass of visibly pushdown transducers, we give properties characterizing the positive instances of this problem. We show their necessity and discuss possible approaches to prove their sufficiency.
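The Max-plus cost semantics can be illustrated on a plain finite automaton (a simplification: no pushdown stack, and the tiny automaton below is invented for the example): the cost of a word is the maximum, over accepting runs, of the sum of transition weights.

```python
def maxplus_word_cost(delta, initial, final, word):
    """Cost of `word` in a Max-plus automaton: maximize summed transition
    weights over accepting runs; -inf means there is no accepting run."""
    NEG = float("-inf")
    best = {q: 0.0 for q in initial}
    for a in word:
        nxt = {}
        for q, c in best.items():
            for (q2, w) in delta.get((q, a), []):
                nxt[q2] = max(nxt.get(q2, NEG), c + w)
        best = nxt
    return max((c for q, c in best.items() if q in final), default=NEG)

# Toy automaton over {a, b}: reading `a` in state 0 costs 1, `b` loops at cost 0.
delta = {(0, "a"): [(0, 1.0)], (0, "b"): [(0, 0.0)]}
cost = maxplus_word_cost(delta, {0}, {0}, "aab")   # 1 + 1 + 0 = 2.0
```

The cost of this automaton (the supremum over all words) is infinite, since the word a^k costs k; deciding such finiteness in polynomial time, for the richer visibly pushdown model, is the boundedness question the thesis addresses.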
34

Oshiro, Marcio Takashi Iura. "k-árvores de custo mínimo." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-28052012-091652/.

Abstract:
Esta dissertação trata do problema da k-árvore de custo mínimo (kMST): dados um grafo conexo G, um custo não-negativo c_e para cada aresta e e um número inteiro positivo k, encontrar uma árvore com k vértices que tenha custo mínimo. O kMST é um problema NP-difícil e portanto não se conhece um algoritmo polinomial para resolvê-lo. Nesta dissertação discutimos alguns casos em que é possível resolver o problema em tempo polinomial. Também são estudados algoritmos de aproximação para o kMST. Entre os algoritmos de aproximação estudados, apresentamos a 2-aproximação desenvolvida por Naveen Garg, que atualmente é o algoritmo com melhor fator de aproximação.
This dissertation studies the minimum-cost k-tree problem (kMST): given a connected graph G, a nonnegative cost c_e for each edge e, and a positive integer k, find a minimum-cost tree with k vertices. The kMST problem is NP-hard, so no polynomial-time algorithm is known for it. In this dissertation we discuss some cases that can be solved in polynomial time. We also study approximation algorithms for the kMST. Among them, we present the 2-approximation developed by Naveen Garg, which is currently the algorithm with the best known approximation factor.
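Garg's 2-approximation is involved, but the problem statement itself can be made concrete with an exact brute force over k-vertex subsets, feasible only for tiny graphs (the instance below is invented for illustration):

```python
from itertools import combinations

def kmst_bruteforce(vertices, edge_cost, k):
    """Exact kMST for tiny graphs: for each k-subset, grow a minimum spanning
    tree (Prim-style) restricted to the subset; keep the cheapest tree found."""
    best = None
    for sub in combinations(vertices, k):
        in_tree, cost, ok = {sub[0]}, 0.0, True
        while len(in_tree) < k:
            cand = [(edge_cost[frozenset((u, v))], v)
                    for u in in_tree for v in sub
                    if v not in in_tree and frozenset((u, v)) in edge_cost]
            if not cand:            # subset not connected: no k-tree here
                ok = False
                break
            c, v = min(cand)
            cost += c
            in_tree.add(v)
        if ok and (best is None or cost < best):
            best = cost
    return best

# Path graph 0-1-2-3 with edge costs 1, 5, 2: the cheapest 2-vertex tree is {0,1}.
edges = {frozenset((0, 1)): 1.0, frozenset((1, 2)): 5.0, frozenset((2, 3)): 2.0}
best = kmst_bruteforce([0, 1, 2, 3], edges, 2)   # 1.0
```

The exponential number of subsets is exactly why polynomial special cases and approximation algorithms such as Garg's matter.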
35

González, Barrameda José Andrés. "Novel Application Models and Efficient Algorithms for Offloading to Clouds." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/36469.

Abstract:
The application offloading problem for Mobile Cloud Computing aims at improving the mobile user experience by leveraging the resources of the cloud. The execution of the mobile application is offloaded to the cloud, saving energy at the mobile device or speeding up the execution of the application. We improve the accuracy and performance of application offloading solutions in three main directions. First, we propose a novel fine-grained application model that supports complex module dependencies such as sequential, conditional, and parallel module executions. The model also allows for multiple offloading decisions that are tailored towards the current application, network, or user contexts. As a result, the model is more precise in capturing the structure of the application and supports more complex offloading solutions. Second, we propose three cost models, namely average-based, statistics-based, and interval-based cost models, defined for the proposed application model. The average-based approach models each module cost by its expected value, and the expected cost of the entire application is estimated considering each of the three module dependencies. The novel statistics-based cost model employs Cumulative Distribution Functions (CDFs) to represent the costs of the modules and of the mobile application, which are estimated considering the costs and dependencies of the modules. This cost model opens the door to new statistics-based optimization functions and constraints, whereas the state of the art only supports optimizations based on the average running cost of the application. Furthermore, this cost model can be used to perform statistical analysis of the performance of the application in different scenarios, such as varying network data rates.
The last cost model, the interval-based one, represents module costs via intervals in order to address cost uncertainty while having lower requirements and computational complexity than the statistics-based model. The cost of the application is estimated as an expected maximum cost via a linear optimization function. Finally, we present offloading decision algorithms for each cost model. For the average-based model, we present a fast optimal dynamic programming algorithm. For the statistics-based model, we present another fast optimal dynamic programming algorithm for the scenario where the optimization function meets specific properties. Finally, for the interval-based cost model, we present a robust formulation that solves a linear number of linear optimization problems. Our evaluations verify the accuracy of the models and show higher cost savings for our solutions when compared to the state of the art.
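A minimal sketch of how an average-based cost model might compose the three dependency types (the combination rules, especially the parallel one, are simplifying assumptions for illustration, not the thesis's exact formulas):

```python
def seq_cost(costs):
    # Sequential modules: expected costs add.
    return sum(costs)

def cond_cost(branches):
    # Conditional branches: weight each branch's cost by its probability.
    return sum(p * c for p, c in branches)

def par_cost(costs):
    # Parallel modules: completion is bounded by the slowest branch
    # (an approximation -- the expected maximum is not the maximum of the means).
    return max(costs)

# Hypothetical app: module A, then either B (70%) or C (30%), then D and E in parallel.
total = seq_cost([2.0,
                  cond_cost([(0.7, 3.0), (0.3, 5.0)]),
                  par_cost([1.0, 4.0])])   # 2.0 + 3.6 + 4.0 = 9.6
```

A statistics-based model would carry full CDFs per module through these same dependency structures instead of single expected values, which is what enables constraints on, say, the 95th-percentile running time.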
36

Roque, Roger Alonso Moya. "Variação da anatomia e da densidade básica da madeira de Gmelina arborea (Roxb.), em diferentes condições de clima e de manejo na Costa Rica." Universidade de São Paulo, 2005. http://www.teses.usp.br/teses/disponiveis/11/11150/tde-18082005-164402/.

Abstract:
A Gmelina arborea Roxb. (Verbenaceceae) tem sido plantadas em países de clima tropical devido a sua elevada taxa de crescimento, resistência ao ataque de pragas e doenças e aos curtos ciclos de rotação. Foi introduzida no Costa Rica no início dos anos de 1970 e rapidamente difundiu-se sua utilização nos programas de reflorestamento, compreendendo diferentes regiões ecológicas e intensidade de manejo para a produção de madeira para construção civil, movelaria, entre outros. Por outro lado, as características anatômicas e a densidade da madeira de árvores de gmelina procedentes das diferentes condições ecológicas na Costa Rica são pouco pesquisadas. Pela importância do conhecimento destas propriedades da madeira, a presente pesquisa teve como objetivos: descrever a estrutura anatômica do seu lenho, a variação radial dos elementos celulares (vasos, fibras e parênquimas radial e longitudinal) e a densidade da madeira (densitometria de raios-X) de duas diferentes regiões ecológicas (climas tropical úmido e seco) e três intensidades de manejo florestal. Os resultados da presente pesquisa são apresentados em 3 capítulos, a saber: No Capítulo 1 são descritas a estrutura anatômica macro e microscópica do lenho da gmelina procedente das duas regiões climáticas. Os resultados da análise multivariada de componentes principais mostraram que os vasos, parênquima paratraqueal e radial são os elementos anatômicos mais afectados pela variação das condições ecológicas e que as variações macro-microscópica do lenho podem ser explicadas por quatro componentes principais. No Capítulo 2 é apresentada a variação radial das dimensões das fibras, dos vasos, parênquimas longitudinal e radial. Os resultados representam uma amostra de trinta árvores de gmelina procedentes de duas regiões climáticas e três intensidades de manejo, evidenciando a ocorrência de variação radial de todos os parâmetros anatômicos, à excepção do diâmetro do lume das fibras e presença de vasos múltiplos. 
As variações das dimensões dos elementos anatômicos do lenho em relação ao clima (tropical úmido e seco), posição geográfica, nível de precipitação e das dimensões das árvores de gmelina, são também analisadas. No Capítulo 3 é apresentada a variação radial da densidade da madeira pela técnica de densitometria de raios-X para as trinta árvores de gmelina procedentes das duas regiões climáticas e três intensidades de manejo. A densidade média do lenho aumentou no sentido medula-casca para essas condições de clima e de manejo, com a posição geográfica, precipitação e dimensões das árvores de gmelina afetando esta propriedade. As densidades máxima e mínima da madeira não foram afetadas pela idade e dimensões das árvores de gmelina, clima, manejo e pelas coordenadas geográficas. A variação de densidade intra-anel de crescimento foi afetada, da mesma forma, observando-se uma diminuição com a idade das árvores. Os resultados do presente trabalho contribuem para um melhor conhecimento da estrutura anatômica da madeira das árvores de gmelina cultivadas na Costa Rica, proporcionando informações sobre a sua variabilidade e influência das condições de clima e de manejo florestal e possibilitando a prognose da qualidade da madeira a ser produzida, bem como a aplicação tecnicamente recomendada.
Gmelina arborea has been introduced throughout the tropics owing to its excellent growth rate, its resistance to pests and diseases, and the suitability of its fast-grown wood as pulp and as raw material for solid products. Gmelina was introduced in Costa Rica in 1970 as a raw material for lumber, furniture, and civil construction, and has been planted in different ecological zones and under different silvicultural management regimes. On the other hand, the anatomical features and wood density of gmelina grown in Costa Rica remain little researched. For this reason, this study presents the macro- and microscopic wood description, the variation of the wood cells, and the wood density obtained by the X-ray technique for trees from different ecological conditions (dry and wet tropical climates) under three silvicultural management regimes. The anatomical-technological description aims to increase knowledge of gmelina wood from fast-growing plantations under different ecological and management conditions. The results of this research are presented in 3 parts, detailed subsequently. In Part 1, the macro- and microscopic anatomy of gmelina wood is described for trees from the two climatic conditions; these descriptions agree with anatomical descriptions reported for other countries. Multivariate principal component analysis showed that the vessels, longitudinal parenchyma, and radial parenchyma were the anatomical features most affected by variation in ecological conditions, and that four principal components explained 91.74% of the total macro- and microscopic variation. In Part 2, the variation from pith to bark of the fibers, vessels, longitudinal parenchyma percentage, and rays is presented.
Thirty adult trees from the 2 climatic conditions and 3 silvicultural management regimes (5 for each management intensity) were sampled; fiber dimensions were measured in each growth ring, and the remaining wood cells were determined at 0, 25, 50, 75, and 100% of the distance between pith and bark. There was variation from pith to bark for all measured anatomical features, with the exception of fiber lumen diameter and the presence of multiple vessels. The anatomical features were affected by climatic conditions, geographical position, annual rainfall, and the dimensions of the sampled trees. In Part 3, the mean, minimum, and maximum wood densities and the intra-ring density variation were determined by the X-ray technique, together with their variation from pith to bark, for the same 30 adult trees sampled under the 2 climatic conditions and 3 silvicultural management regimes. Mean density increased with increasing tree age under both climatic conditions and all management regimes, and was affected by geographical position, rainfall, and the dimensions of the sampled trees. The maximum and minimum densities, however, were not affected by tree age, climate type, silvicultural management, tree dimensions, or ecological and geographical conditions. The intra-ring density variation decreased with increasing tree age and was produced by variation in vessel percentage, fiber length, lumen diameter, and cell-wall thickness across the growth rings. These results increase knowledge of the anatomical structure of gmelina wood from fast-growing plantations in Costa Rica, provide information about its variability and about the effects of climate and management on wood anatomy, and make it possible to predict the quality of the wood to be produced and its future end uses.
37

Esen, Derya. "Ecology and Control of Rhododendron (Rhododendron ponticum L.) in Turkish Eastern Beech (Fagus orientalis Lipsky) Forests." Diss., Virginia Tech, 2000. http://hdl.handle.net/10919/29098.

Abstract:
Purple-flowered rhododendron (Rhododendron ponticum L.) and yellow-flowered rhododendron (R. flavum Don.) are two dominant shrub species of the eastern beech (Fagus orientalis L.) understories in the eastern and western Black Sea Region (BSR), respectively. These invasive woody species significantly reduce beech growth and can preclude tree regeneration. The ecological consequence is an aging beech overstory with little or no regeneration to replace the mature trees. Great rhododendron (R. maximum L.) has been increasing in the forests of the Southern Appalachians of the United States, reducing tree regeneration and growth. The BSR and Southern Appalachians bear noteworthy similarities in climate, topography, and the forest flora. Purple-flowered and great rhododendrons also show important similarities in their ecology and the forest vegetation problems they can cause. Current rhododendron-dominated and threatened BSR forests may provide an advanced ecological picture of the forests of the Southern Appalachians in which great rhododendron now thrives. Therefore, new information gained on the ecology and effective and cost-efficient control of purple-flowered rhododendron may significantly improve forest management practices, not only for the current rhododendron-invaded BSR ecosystem, but also for other parts of the world. This dissertation consists of five separate yet related chapters. The first gives relevant literature reviewed for the dissertation. The second chapter focuses on various environmental and disturbance factors that may have shaped the current purple-flowered rhododendron-dominated beech forests of the BSR of Turkey. Chapter 3 assesses the effects of various manual and herbicidal woody control techniques on purple-flowered and yellow-flowered rhododendron in two field experiments in the BSR. The fourth chapter relates a study of uptake and translocation behavior of triclopyr ester and imazapyr in great rhododendron. 
This information is used to determine the optimum herbicide-surfactant combinations for the greatest active ingredient uptake and root translocation in great rhododendron. The last chapter is a synthesis of the information gained in all of these different experiments.
Ph. D.
38

Silva, Paulo Wagner Lopes da. "O papel da distância em projetos topológicos de redes de distribuição elétrica." Universidade Federal de Alagoas, 2015. http://www.repositorio.ufal.br/handle/riufal/1602.

Abstract:
This dissertation investigates the conditions under which the optimal configuration of an electric power distribution network is a minimum-length spanning tree and those under which it is a shortest-path tree. For this purpose, it applies computational mathematical optimization models for the optimal local access network design problem. The focus of the study is 13.8 kV spacer-cable primary radial networks. The applied models seek the balance between fixed costs and variable costs; the value saved by an optimal network configuration could be applied to extend the range of the network and the number of people reached. The bibliographic research comprises three parts: graph theory, local access network optimization models, and distribution network costs. The research methodology includes the choice of the distribution system, determination of fixed and variable costs, choice and implementation of the local access network optimization models, tests on hypothetical and realistic systems using the CPLEX solver, analysis of the resulting configurations, and construction of graphics to facilitate evaluation of the results. It was found that the relationship between fixed and variable costs decisively influences the optimal configuration of the distribution network: a low ratio of fixed costs to variable costs favors a shortest-path tree, whereas a high ratio favors a minimum-length spanning tree configuration. However, other parameters, such as network extension, node demands, and the number of possible arcs, must also be considered in determining the network configuration.
O presente trabalho visa investigar sob quais condições a configuração ótima de uma rede de distribuição elétrica é uma árvore geradora mínima (AGM) e sob quais é uma árvore de caminhos mínimos (ACM). Utilizando, para isso, modelos matemáticos computacionais de otimização topológica de redes de utilidade pública. As redes de distribuição estudadas foram do tipo aérea radial primária protegida (ARPP) com nível de tensão em 13,8 kV. Os modelos utilizados prezam pelo equilíbrio entre o custo de investimento inicial (fixo) e os custos decorrentes da transferência de energia elétrica (variável). Os valores economizados através de uma configuração ótima da rede podem ser convertidos em investimentos para aumentar o número de pessoas com acesso aos recursos energéticos com eficiência e qualidade. A revisão bibliográfica foi dividida em três partes: teoria dos grafos, modelos de otimização de redes de acesso local e custos de redes de distribuição. A metodologia utilizada compreendeu as seguintes etapas: escolha do tipo de sistema de distribuição, determinação dos custos fixo e variável, escolha e implementação (GAMS) dos modelos, testes com exemplos de redes usando o solver CPLEX, análise das configurações resultantes e elaboração de gráficos para facilitar a avaliação dos resultados. Os resultados mostraram que a relação entre o custo fixo β e o custo variável γ exerce influência determinante na configuração ótima de uma rede de distribuição ARPP. Um valor baixo de β/γ, favorece a ACM. Já valores elevados de β/γ, conduzem a solução para uma AGM. No entanto, essa relação não é o único fator que determina a configuração da rede, outros parâmetros como extensão, demanda dos nós e quantidade de possíveis arcos influenciam de forma significativa na solução apresentada.
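The trade-off between fixed and variable costs described above can be made concrete on a three-node toy network (all lengths and costs hypothetical): with a fixed cost β per unit of line built and a variable cost γ per unit of source-to-node distance served, the cheaper tree flips as the ratio β/γ changes.

```python
def network_cost(tree_edges, lengths, paths, beta, gamma):
    """beta: cost per unit length built; gamma: cost per unit of source-to-node
    distance served. `paths` gives each node's distance from the source within the tree."""
    build = beta * sum(lengths[e] for e in tree_edges)
    transport = gamma * sum(paths.values())
    return build + transport

lengths = {("s", "a"): 1.0, ("s", "b"): 2.0, ("a", "b"): 1.1}
# Minimum-length spanning tree: shorter total line, longer route to b.
mst = {"edges": [("s", "a"), ("a", "b")], "paths": {"a": 1.0, "b": 2.1}}
# Shortest-path tree: direct routes from the source, more line built.
spt = {"edges": [("s", "a"), ("s", "b")], "paths": {"a": 1.0, "b": 2.0}}

def cheaper(beta, gamma):
    c_m = network_cost(mst["edges"], lengths, mst["paths"], beta, gamma)
    c_s = network_cost(spt["edges"], lengths, spt["paths"], beta, gamma)
    return "MST" if c_m < c_s else "SPT"

# High beta/gamma: building line dominates, so the MST wins; low ratio: the SPT wins.
```

With beta=10, gamma=1 the MST costs 24.1 versus 33.0 for the SPT; with beta=1, gamma=10 the ordering reverses, mirroring the dissertation's finding about the β/γ ratio.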
APA, Harvard, Vancouver, ISO, and other styles
39

Rivière, Julie. "Evaluation du dispositif de surveillance de la tuberculose bovine dans la faune sauvage en France à l'aide de méthodes épidémiologique, économique et sociologique." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS115/document.

Full text
Abstract:
Emerging animal diseases, zoonotic diseases and the development of international trade have led to an increased need for high-performing animal health surveillance systems. However, the current economic context is leading to significant budget restrictions, resulting in a reduction of the resources allocated to surveillance. In this context, the regular evaluation of the surveillance systems on which health decisions are based is essential in order to verify their proper functioning and the quality of the collected data, and to enable their improvement. Our work concerned the evaluation of a complex surveillance system, Sylvatub, the system for the surveillance of Mycobacterium bovis infection in wildlife, which is composed of several surveillance components and targets several wild species. We applied four evaluation methods: (i) a quantitative method for estimating surveillance sensitivity using scenario trees, (ii) a quantitative method for estimating surveillance costs, allowing the calculation of a cost-effectiveness ratio, (iii) a semi-quantitative method for studying the overall functioning of the system, and (iv) a qualitative method for investigating the acceptability of the surveillance. This work made it possible to evaluate the Sylvatub system in its environmental and economic context, integrating behavioural and social factors, and led to recommendations for the evolution and improvement of the system. It also highlighted the methodological and operational advantages of the complementary use of several methods for the evaluation of complex surveillance systems, and proposes methodological perspectives to promote the integration of evaluation methods.
The evaluation of the Sylvatub system should be continued and complemented by that of the surveillance system in cattle herds, in order to study the interconnections between domestic and wild populations in this particular multi-host system.
Emerging animal diseases, zoonotic diseases and the development of international trade have led to an increase in the need for efficient animal health surveillance systems. However, the current economic environment has led to significant budget cuts, resulting in a reduction of the resources dedicated to surveillance. In this context, regular evaluation of the surveillance systems on which health decisions are based is essential to ensure their proper operation and the quality of the collected data, and to allow their improvement. This study focused on the evaluation of a complex surveillance system, the Sylvatub network for the surveillance of Mycobacterium bovis infection in wildlife, which consists of several surveillance components focusing on several wild species. We used four evaluation methods: (i) a quantitative method to estimate the surveillance sensitivity by scenario tree modelling, (ii) a quantitative method to estimate the surveillance costs, enabling the estimation of a cost-effectiveness ratio, (iii) a semi-quantitative method to assess the global operation of the system, and (iv) a qualitative method to investigate the acceptability of the surveillance. This study made it possible to assess the Sylvatub network in its environmental and economic context, with the integration of behavioural and social factors, and led to recommendations for the evolution and improvement of the surveillance system. It also highlighted the methodological and operational advantages of the complementary use of several methods for the evaluation of complex surveillance systems, and provides methodological perspectives to support the integration of evaluation methods. The assessment of the Sylvatub system should be deepened and complemented by the evaluation of the surveillance system in cattle to explore the interconnections between domestic and wild populations in this particular multi-host system.
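The scenario-tree idea behind method (i) can be sketched in a few lines. This is a deliberately simplified illustration, not the thesis's model: the prevalences, detection probabilities, sample sizes and the cost figure are all invented. A component's sensitivity is the probability that at least one of its sampled units is an infected unit it actually detects, components are combined assuming independent failures, and the cost-effectiveness ratio of method (ii) divides cost by the resulting system sensitivity:

```python
def component_sensitivity(p_infected, p_detect_given_infected, n_units):
    """Scenario-tree style sensitivity of one surveillance component:
    probability that at least one of n sampled units is an infected
    unit that the component detects."""
    p_unit = p_infected * p_detect_given_infected  # one unit triggers detection
    return 1 - (1 - p_unit) ** n_units

def system_sensitivity(component_sensitivities):
    """The system detects if any component does (independent misses)."""
    miss = 1.0
    for se in component_sensitivities:
        miss *= 1 - se
    return 1 - miss

# Two hypothetical components (e.g. passive carcass reporting vs. targeted culling).
se_a = component_sensitivity(0.02, 0.50, 200)
se_b = component_sensitivity(0.02, 0.80, 50)
sse = system_sensitivity([se_a, se_b])
cost_effectiveness = 120_000 / sse  # hypothetical annual cost per unit of sensitivity
```

Comparing this ratio across alternative designs is what turns the two quantitative evaluations into a single decision criterion.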
APA, Harvard, Vancouver, ISO, and other styles
40

Ng, Anthony Kwok-Lung. "Risk Assessment of Transformer Fire Protection in a Typical New Zealand High-Rise Building." Thesis, University of Canterbury. Civil Engineering, 2007. http://hdl.handle.net/10092/1223.

Full text
Abstract:
Prescriptively, the requirement for fire safety protection systems in distribution substations is not provided in the compliance document for fire safety to the New Zealand Building Code. The New Zealand Fire Service (NZFS) has therefore proposed a list of fire safety protection requirements for distribution substations in a letter dated 10th July 2002. A review by Nyman [1] considered the fire safety requirements proposed by the NZFS and discussed the issues with a number of fire engineers over the last three years. Nyman was concerned that one of the requirements, the four hour fire separation between the distribution substation and the interior spaces of the building, may not be necessary when considering the risk exposure to the building occupants in different situations, such as the involvement of sprinkler systems and the use of transformers with a lower fire hazard. Fire resistance rating (FRR) typically means the time duration for which a passive fire protection system, such as fire barriers, fire walls and other fire rated building elements, can maintain its integrity, insulation and stability in a standard fire endurance test. Based on the literature review and discussions with industry experts, it was found that failure of a passive fire protection system in a real fire exposure could occur earlier than the time indicated by the fire resistance rating derived from the standard test, depending on the characteristics of the actual fire (heat release rate, fire load density and fire location) and of the fire compartment (its geometry, ventilation conditions, opening definition, building services and equipment). Hence, a higher level of fire safety, such as 4 hour fire rated construction and the use of a sprinkler system, may significantly reduce the fire risk to the health and safety of occupants in the building; however, it can never eliminate the risk.
This report presents a fire engineering Quantitative Risk Assessment (QRA) of a transformer fire initiating in a distribution substation inside a high-rise residential and commercial mixed-use building. It compares the fire safety protection requirements for distribution substations from the NZFS with other relevant documents worldwide: the regulatory standards in New Zealand, Australia and the United States of America, as well as non-regulatory guidelines from other stakeholders, such as electrical engineering organisations, insurance companies and electricity providers. The report also examines the characteristics of historical data on transformer fires in distribution substations in both New Zealand and United States buildings. The reliability of active fire safety protection systems, such as smoke detection systems and sprinkler systems, is reviewed in this research. Based on the data analysis results, a fire risk estimate is determined using an Event Tree Analysis (ETA) for a total of 14 scenarios with different fire safety designs and transformer types for a distribution substation in a high-rise residential and commercial mixed-use building. In Scenarios 1 to 10, different combinations of fire safety systems are evaluated with the same type of transformer, a flammable liquid (mineral oil) insulated transformer. In Scenarios 11 to 14, two particular fire safety designs are selected as a baseline for the analysis of transformer types: two types of transformer with a low fire hazard, less flammable liquid (silicone oil) insulated transformers and dry type (dry air) transformers, are used to replace the flammable liquid (mineral oil) insulated transformer in the distribution substation. The entire fire risk estimate is determined using the software package @Risk 4.5. The results from the event tree analysis are used in the cost-benefit analysis.
The cost-benefit ratios are measured on the basis of the reduced fire risk exposure to the building occupants, with respect to the investment costs of the alternative cases, relative to their respective base cases. The outcomes of the assessment show that the proposed four hour fire separation between the distribution substations and the interior spaces of the building, when no sprinkler systems are provided, is not the most cost-effective alternative for the life safety of occupants: the cost-benefit ratio of this scenario ranks fifth. The most cost-effective alternative is found to be the scenario with 30 minute fire separation and a sprinkler system installed. In addition, replacing a flammable liquid insulated transformer with a less flammable liquid insulated transformer or a dry type transformer is generally an economical alternative. From the QRA analysis, it is concluded that 3 hour fire separation is appropriate for distribution substations containing a flammable liquid insulated transformer and associated equipment in non-sprinklered buildings. The fire rating of the separation construction can be reduced to 30 minute FRR if a sprinkler system is installed. This conclusion is also in agreement with the requirements of the National Fire Protection Association (NFPA).
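The event-tree logic behind the ETA can be sketched as follows. This is a minimal illustration only; the two protection-system branches, their reliabilities, and the consequence values are invented, not the report's @Risk figures. Each path through the tree multiplies the probabilities of each system working or failing, and the fire risk is the probability-weighted sum of the path consequences:

```python
from itertools import product

# Hypothetical branch reliabilities and path consequences.
p_detection_works = 0.90
p_sprinkler_works = 0.95

# consequence[(detection_ok, sprinkler_ok)] — e.g. relative life-safety loss.
consequence = {
    (True,  True):  0.01,
    (True,  False): 0.20,
    (False, True):  0.05,
    (False, False): 1.00,
}

def branch_prob(ok, p_works):
    """Probability of one branch of the event tree."""
    return p_works if ok else 1 - p_works

# Expected consequence over all four paths of the tree.
risk = sum(
    branch_prob(d, p_detection_works)
    * branch_prob(s, p_sprinkler_works)
    * consequence[(d, s)]
    for d, s in product([True, False], repeat=2)
)
```

Repeating this calculation for each of the 14 scenarios, and dividing the risk reduction relative to a base case by the extra investment cost, yields the cost-benefit ratios the report ranks.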
APA, Harvard, Vancouver, ISO, and other styles
41

Kitenge, Emile Museu. "Harvesting of invasive woody vegetation (Eucalyptus lehmanii, Leptospermum laevigatum, Acacia cyclops) as energy feedstock in the Cape Agulhas Plain of South Africa." Thesis, Stellenbosch : Stellenbosch University, 2011. http://hdl.handle.net/10019.1/17873.

Full text
Abstract:
Thesis (MFor)--Stellenbosch University, 2011.
ENGLISH ABSTRACT: This study is aimed at testing the possibility of using woody biomass from three invasive woody vegetation types (Spider Gum, Myrtle and Acacia) for production of bioenergy in the Cape Agulhas Plain. Physical recoverability of the woody biomass was studied by means of a semi-mechanized harvesting system to evaluate potential productivity, operational costs and the estimated yield energy gain. The system consisted of five components: manual harvesting, motor-manual harvesting, extraction, chipping and road transport. Data on the system productivity was obtained using activity sampling and time study techniques. Activity sampling was applied on manual and motor-manual harvesting in order to record harvesting time and standard time study techniques were used to obtain time data for extraction, chipping and road transport operations. Findings revealed benefits associated with the utilisation of invasive woody vegetation as energy feedstock. Therefore, the problem of exotic tree species can be dealt with by transforming them into energy feedstock, thus minimising the effect of invasive plants. At the same time essential biomass energy can be produced, while some of the cost of production could be offset by the benefits accruing from the biomass energy. The Acacia site, characterized by larger mature dense trees, had the highest amount of harvested biomass compared to the rest of the vegetation types (i.e. Myrtle and Spider Gum). The overall system productivity was found to be significantly influenced by a low equipment utilisation rate, estimated at 50%. This resulted in low production rates in general. The low supply rate of material to the chipper by the three-wheeled loader (1.5 – 5.3 oven-dry tonne per production machine hour) was found to be a major constraint in the chipping process, especially when considering that the chipper is potentially capable of chipping 4 – 9.4 ODT PMH-1 at the harvesting sites. 
This resulted in a significant energy balance of 463 GJ between the output and input energy of the system. The overall supply chain system costs, based on various road transport distances for the species, ranged from R 322.77 ODT-1 to R 689.76 ODT-1, with an average of R 509 ODT-1. This was found to be costly compared to the case where a high machine utilisation rate and optimal productivity are achieved (average of R 410 ODT-1); biomass recoverability in this field trial had a higher total system cost due to the low productivity resulting from the low equipment utilisation rate.
This study aimed to investigate the possibility of using woody biomass from invasive vegetation (Spider Gum, Myrtle and Acacia) on the Agulhas Plain for bioenergy. Potential productivity, operating costs and the estimated energy yield gain were used to evaluate the physical recovery of woody biomass by a semi-mechanised harvesting system. The system consisted of five components: manual harvesting, motor-manual harvesting, extraction, chipping and road transport. Data on system productivity were obtained from time studies and activity sampling. Activity sampling was applied to manual and motor-manual harvesting to obtain harvesting times, while standard time study techniques were used to obtain time data for the extraction, chipping and road transport operations. The findings confirmed the benefits of using invasive vegetation as an energy source. The challenge posed by the spread of invasive vegetation can thus be addressed by using it as an energy feedstock. The production cost of accessing the usable biomass could potentially be offset by the benefits of using the energy obtained from the biomass. The larger, more mature and dense Acacia stand yielded the most harvested biomass compared with the other stands in the study (i.e. Myrtle and Spider Gum). System productivity was significantly affected by the low equipment utilisation rate, which amounted to less than 50%. This also resulted in generally lower production rates. In the chipping operation, the low feed rate of the three-wheeled loader (1.5 – 5.3 oven-dry tonnes per productive machine hour) appeared to be the constraint on the process, especially considering that chipping could proceed at 4 – 9.4 ODT PMH-1. The result was a significant energy balance of 463 GJ between the output and input energy of the system.
The total supply chain costs, based on various road transport distances for the species, ranged from R 322.77 ODT-1 to R 689.76 ODT-1, with an average of around R 509 ODT-1. This was found to be expensive compared with cases where high machine utilisation and optimal productivity (average of R 410 ODT-1) were possible. The biomass recovery in the study had a higher total system cost caused by low productivity, which is related to the lower equipment utilisation rate achieved.
APA, Harvard, Vancouver, ISO, and other styles
42

Hůlová, Martina. "Porovnání cen okrasných rostlin zjištěných zjednodušeným a nákladovým způsobem s různou charakteristikou typu zeleně." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2013. http://www.nusl.cz/ntk/nusl-232711.

Full text
Abstract:
The aim of this diploma thesis is to compare prices of ornamental plants determined by the simplified and the cost methods of valuation. The comparison was made on a sample garden situated in a functional unit with a terraced house and the land built over by this building. Based on the obtained results, the influence of differing location and age of the trees on their price is evaluated. The thesis also defines basic terms and explains issues closely related to the valuation of ornamental plants.
APA, Harvard, Vancouver, ISO, and other styles
43

Jayaraman, Sambhavi. "A Structure based Methodology for Retrieving Similar Rasters and Images." University of Cincinnati / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1428048689.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Šoula, Michal. "Ocenění výše škody způsobené pádem stromu na rekreační chatu v Roudné u Nových Hradů." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2016. http://www.nusl.cz/ntk/nusl-241336.

Full text
Abstract:
The aim of this thesis is to determine the amount of indemnity for total damage caused by a tree falling on a cottage in Roudné near Nové Hrady during a gale. The theoretical part covers basic insurance terminology and the basic terminology and methods used in property valuation. The practical part describes the original condition of the property and its valuation before the insured event using a cost method in compliance with the valuation regulation. Furthermore, the thesis determines, through an itemized budget, the costs of restoring the property to its original state. The conclusion deals with the impact of the reconstruction on the extent of the indemnity.
APA, Harvard, Vancouver, ISO, and other styles
45

Baptiste, Julien. "Problèmes numériques en mathématiques financières et en stratégies de trading." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLED009.

Full text
Abstract:
The aim of this CIFRE thesis is to build a portfolio of intraday algorithmic trading strategies. Instead of considering prices as a function of time and of a random component generally modelled by Brownian motion, our approach consists in identifying the main signals to which order placers are sensitive when making their decisions, and then proposing a price model in order to build dynamic portfolio allocation strategies. In a second, more academic part, we present work on the pricing of European and Asian options.
The aim of this CIFRE thesis is to build a portfolio of intraday algorithmic trading strategies. Instead of considering stock prices as a function of time and Brownian motion, our approach is to identify the main signals affecting market participants when they operate on the market, so that we can set up a price model and then build dynamic strategies for portfolio allocation. In a second part, we introduce several works dealing with Asian and European option pricing.
APA, Harvard, Vancouver, ISO, and other styles
46

Ouali, Abdelkader. "Méthodes hybrides parallèles pour la résolution de problèmes d'optimisation combinatoire : application au clustering sous contraintes." Thesis, Normandie, 2017. http://www.theses.fr/2017NORMC215/document.

Full text
Abstract:
Combinatorial optimization problems have become the target of much scientific research because of their importance in solving academic problems and real-world problems encountered in engineering and industry. Solving these problems with exact methods is often out of reach because of the prohibitive processing time these methods would need to reach the optimal solution(s). In this thesis, we were interested in the algorithmic context of solving combinatorial problems and in the modelling context of these problems. At the algorithmic level, we studied hybrid methods, which excel through their ability to make exact and approximate methods cooperate in order to produce solutions quickly. At the modelling level, we worked on the specification and exact resolution of complex pattern set mining problems, paying particular attention to scalability on large databases. On the one hand, we proposed a first parallelization of the DGVNS algorithm, called CPDGVNS, which explores in parallel the different clusters provided by the tree decomposition, sharing the best solution found in a master-worker model. Two other strategies, called RADGVNS and RSDGVNS, were proposed that improve the frequency of exchange of intermediate solutions between the different processes. Experiments on hard combinatorial problems show the suitability and efficiency of our parallel methods. On the other hand, we proposed a hybrid approach combining integer linear programming (ILP) techniques with pattern mining.
Our approach is complete and takes advantage of the general ILP framework (providing a high level of flexibility and expressiveness) and of specialized heuristics for data exploration and mining (to improve computation time). Beyond the general framework of pattern set mining, we studied two problems in particular: conceptual clustering and the tiling problem. The experiments carried out showed the contribution of our proposal compared with constraint-based approaches and specialized heuristics.
Combinatorial optimization problems have become the target of much scientific research because of their importance in solving academic problems and real-world problems encountered in engineering and industry. Solving these problems by exact methods is often intractable because of the exorbitant processing time these methods would require to reach the optimal solution(s). In this thesis, we were interested in the algorithmic context of solving combinatorial problems, and in the modeling context of these problems. At the algorithmic level, we explored hybrid methods, which excel in their ability to make exact and approximate methods cooperate in order to quickly produce high-quality solutions. At the modeling level, we worked on the specification and exact resolution of complex problems in pattern set mining, in particular by studying scaling issues in large databases. On the one hand, we proposed a first parallelization of the DGVNS algorithm, called CPDGVNS, which explores in parallel the different clusters of the tree decomposition by sharing the best overall solution in a master-worker model. Two other strategies, called RADGVNS and RSDGVNS, were proposed which improve the frequency of exchange of intermediate solutions between the different processes. Experiments carried out on difficult combinatorial problems show the effectiveness of our parallel methods. On the other hand, we proposed a hybrid approach combining techniques of both Integer Linear Programming (ILP) and pattern mining. Our approach is comprehensive and takes advantage of the general ILP framework (which provides a high level of flexibility and expressiveness) and of specialized heuristics for data mining (which improve computing time). In addition to the general framework for pattern set mining, two problems were studied: conceptual clustering and the tiling problem.
The experiments carried out showed the advantage of our proposal over constraint-based approaches and specialized heuristics.
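The master-worker scheme described for CPDGVNS can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the "clusters" are just disjoint index sets rather than a real tree decomposition, the objective is a made-up separable function, and the workers run sequentially where the real algorithm runs them in parallel. Each worker improves its cluster's variables starting from the current incumbent, and the master keeps the best solution found:

```python
def local_search(solution, cluster, objective):
    """Greedy bit-flip search restricted to the variables of one cluster."""
    best = list(solution)
    improved = True
    while improved:
        improved = False
        for i in cluster:
            cand = list(best)
            cand[i] = 1 - cand[i]
            if objective(cand) < objective(best):
                best, improved = cand, True
    return best

def master_worker(clusters, objective, n_vars, rounds=3):
    incumbent = [0] * n_vars                 # shared best solution
    for _ in range(rounds):                  # workers would run in parallel
        for cluster in clusters:
            cand = local_search(incumbent, cluster, objective)
            if objective(cand) < objective(incumbent):
                incumbent = cand             # master updates the global best
    return incumbent

# Toy objective: minimise Hamming distance to a hidden target assignment.
target = [1, 0, 1, 1, 0, 1]
obj = lambda x: sum(a != b for a, b in zip(x, target))
best = master_worker([[0, 1, 2], [3, 4, 5]], obj, 6)
```

The exchange-frequency refinements of RADGVNS and RSDGVNS amount to tuning how often the `incumbent` update inside the loop is propagated back to the workers.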
APA, Harvard, Vancouver, ISO, and other styles
47

Greenberg, Anita Warner. "Financial adequacy and the true cost of curriculum in a central Texas school district." Thesis, 2006. http://hdl.handle.net/2152/2873.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Klaeboe, R., K. Veisten, Renterghem T. Van, Maercke D. Van, T. Leissing, and Hadj Benkreira. "Cost-benefit analysis of tree belt configurations." 2013. http://hdl.handle.net/10454/9677.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Tsou, Yu-Lin, and 鄒侑霖. "Annotation Cost-sensitive Active Learning by Tree Sampling." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/x87fq7.

Full text
Abstract:
Master's thesis
National Taiwan University
Graduate Institute of Computer Science and Information Engineering
105
Active learning is an important machine learning setup for reducing the labelling effort of humans. Most existing works are based on the simple assumption that each labelling query has the same annotation cost, but this assumption may not be realistic: annotation costs may vary between data instances, and the costs may be unknown before the query is made. Traditional active learning algorithms cannot deal with such a realistic scenario. In this work, we study annotation-cost-sensitive active learning algorithms, which need to estimate the utility and the cost of each query simultaneously. We propose a novel algorithm, the cost-sensitive tree sampling (CSTS) algorithm, which conducts the two estimation tasks together and solves them with a tree-structured model motivated by hierarchical sampling, a well-known algorithm for traditional active learning. By combining multiple tree-structured models, an extension of CSTS, the cost-sensitive forest sampling (CSFS) algorithm, is also proposed and discussed. Extensive experimental results on data sets with simulated and true annotation costs validate that the proposed methods are generally superior to other annotation-cost-sensitive algorithms.
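The core decision in annotation-cost-sensitive active learning, trading estimated utility against estimated cost, can be sketched as follows. This is a bare-bones illustration, not the CSTS/CSFS algorithms themselves: utility is taken to be simple uncertainty (closeness of a predicted probability to 0.5), and the pool instances and cost estimates are invented:

```python
def pick_query(pool):
    """pool: list of (id, predicted_prob, estimated_cost).
    Utility = uncertainty (closeness of the probability to 0.5);
    choose the instance with the best utility-per-cost ratio."""
    def score(item):
        _, prob, cost = item
        uncertainty = 1 - abs(prob - 0.5) * 2  # 1 at p=0.5, 0 at p=0 or 1
        return uncertainty / cost
    return max(pool, key=score)[0]

pool = [
    ("a", 0.50, 4.0),  # most uncertain but expensive: 1.0 / 4.0 = 0.25
    ("b", 0.60, 1.0),  # fairly uncertain and cheap:   0.8 / 1.0 = 0.80
    ("c", 0.95, 0.5),  # nearly certain:               0.1 / 0.5 = 0.20
]
query = pick_query(pool)  # "b" wins on utility per unit cost
```

A purely uncertainty-based learner would query "a"; accounting for cost redirects the budget to "b", which is the behaviour cost-sensitive methods aim to exploit when the costs themselves must also be estimated.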
APA, Harvard, Vancouver, ISO, and other styles
50

Barry, William Ryan. "The true impact of late deliverables at the construction site." Thesis, 2014. http://hdl.handle.net/2152/25835.

Full text
Abstract:
Given that a construction site is both temporary and unique, the outcome of every construction project is dependent upon having all of the proper resources delivered to the site at the appropriate time. Although this is common knowledge in the construction industry, late deliverables to the site continue to be a major impediment to project success. In order to better understand late deliverables and their impacts on performance, the Construction Industry Institute, in collaboration with the Construction Users Roundtable, commissioned Research Team (RT) 300 to investigate how various types of late deliverables affect the cost, schedule, quality, safety, and organizational performance of industrial construction projects. Using case studies, industry surveys and questionnaires, existing literature, and internal team expertise, RT 300 developed two research thrusts: investigate how the industry understands, manages, and is affected by late deliverables, and document and give visibility to the true risks and impacts associated with late deliverables. When examining how late deliverables affect the construction industry, RT 300 found that (1) there is limited understanding of the full range of late deliverables and their far-reaching impacts, (2) the most common late deliverables tend to have the most severe impacts on projects, (3) project teams are typically reactionary when managing late deliverables, (4) project stakeholders have varying perceptions of the risks and impacts associated with late deliverables, and (5) proactively managing late deliverables and impacts is key for improvement in the industry. With these findings and the second research thrust in mind, RT 300 created a database tool, the Late Deliverable Risk Catalog (LDRC), to document common types of late deliverables, give visibility to the full range of impacts, and help project teams recognize risks, improve alignment, and proactively manage late deliverables and mitigate the impacts. 
RT 300 has also developed implementation recommendations for the LDRC, prevention recommendations for the highest risk deliverables, and lessons learned in managing late deliverables. Altogether, this research can help improve the understanding of late deliverables and resulting impacts and risks in order to improve project delivery, productivity, and predictability as well as enhance safety, quality, and organizational and individual performance.
APA, Harvard, Vancouver, ISO, and other styles