
Dissertations / Theses on the topic 'Generation of decision'



Consult the top 50 dissertations / theses for your research on the topic 'Generation of decision.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Alpagut, Merih Ayse. "A Decision Support System For Electricity Generation Investment." PhD thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612211/index.pdf.

Full text
Abstract:
In recent years, ongoing debates in the mineral sector have shown that efficient use of natural resources is of vital importance, as the use of minerals is essential for modern living. Especially in the context of sustainable development, mineral resources should be exploited to maximize the contribution to the well-being of the current generation without depriving future generations of the potential to meet their own needs. The aim of this thesis is to develop a decision support system using system dynamics methodology where …
APA, Harvard, Vancouver, ISO, and other styles
2

Fleishman, Lauren Alyse. "Public Decision-making about Low-Carbon Electricity Generation." Research Showcase @ CMU, 2011. http://repository.cmu.edu/dissertations/75.

Full text
Abstract:
To mitigate the effects of climate change, the U.S. will need a widespread deployment of energy efficiency efforts and low-carbon electricity generating technologies including nuclear, wind, natural gas, and coal with carbon capture and sequestration (CCS), technologies that separate CO2 emissions from the flue-gas of fossil fuel power plants and sequester it in deep underground geological formations. The feasibility of this strategy will partially depend on public acceptance of these technologies as part of a national energy policy. To varying degrees, public misconceptions and knowledge gaps exist for each of these low-carbon technologies. Thus, people need balanced and comparative information to make informed decisions about which low-carbon electricity technologies and portfolios to support. In this thesis, we describe paper-based and computer-based communications presenting multi-attribute descriptions about the costs, benefits, risks and limitations of ten electricity technologies and low-carbon portfolios composed of those technologies. Participants are first asked to rank the technologies under a hypothetical scenario where future power plant construction in Pennsylvania must meet a CO2 emissions constraint. Next, participants attend small group meetings where they rank seven portfolios that meet a specific CO2 emission limit. In a subsequent study, participants instead construct their own low-carbon portfolio using a computer decision tool that restricts portfolio designs to realistic technology combinations. We find that our participants could understand and consistently use our communications to help inform their decisions about low-carbon technologies. We conclude that our informed participants preferred energy efficiency, nuclear, and coal (gasification) with CCS, as well as diverse portfolios including these technologies. The thesis continues with a retrospective view for the value of research that elicits general public opinions of CCS and that develops communications to educate people about low-carbon electricity generation. In the latter, we find that the knowledge of science teachers may be insufficient to correct common public misconceptions about low-carbon technologies among their students. Thus, the communications we developed for this thesis could also benefit science teachers. Overall, we conclude that the computer tool, supplemental materials and procedure developed for this thesis may be valuable for educating the general public about low-carbon electricity generation.
APA, Harvard, Vancouver, ISO, and other styles
3

Achrenius, William, and Martin Bergman Törnkvist. "GRAPH GENERATION ALGORITHMS FOR THE GRADE DECISION CANVAS." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-41200.

Full text
Abstract:
Development in the field of software architecture, from the early days in the mid-80s, has been significant. From purely technical descriptions to decision-based architectural knowledge, software architecture has seen fundamental changes to its methodologies and techniques. Architectural knowledge is a resource that is managed and stored by companies; it is valuable because it can be reused and analysed to improve future development. Companies today are interested in the reasoning behind the software architecture, and this reasoning is mainly formulated through the architectural decisions made during development. For architectural decisions to be easier to analyse, they need to be stored in a way that enables the use of common analytical tools, so that comparisons between decisions are consistent and relevant. It is also important to have enough data, which leads to the problem that, preferably, all the individual architectural knowledge cases must be structured and stored. To do this we present a tool that uses graph generation algorithms to generate architectural knowledge as graphs based on an architectural decision canvas called GRADE. This enables multiple decision cases to be encoded as graphs that can be used to analyse relationships and balances between different architectural knowledge elements, represented as nodes and edges within a graph.
APA, Harvard, Vancouver, ISO, and other styles
4

McKinley, Nathan D. "A Decision Theoretic Approach to Natural Language Generation." Case Western Reserve University School of Graduate Studies / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=case1386188714.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Arikenbi, Temitayo. "Decision Support for Multi-Criteria Energy Generation Problem." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-6073.

Full text
Abstract:
In this study, an attempt is made to apply Decision Support Systems (DSS) in planning for the expansion of energy generation infrastructure in Nigeria. There is an increasing demand for energy in that country, and the study will try to show that DSS modelling, using A Mathematical Programming Language (AMPL) as the modelling tool, can offer satisficing results which would be a good decision support resource for deciding how to allocate investment in energy generation.
APA, Harvard, Vancouver, ISO, and other styles
6

Wingfield, James. "Approaches to test set generation using binary decision diagrams." Thesis, Texas A&M University, 2003. http://hdl.handle.net/1969.1/20.

Full text
Abstract:
This research pursues the use of powerful BDD-based functional circuit analysis to evaluate some approaches to test set generation. Functional representations of the circuit allow the measurement of information about faults that is not directly available through circuit simulation methods, such as probability of random detection and test-space overlap between faults. I have created a software tool that performs experiments to make such measurements and augments existing test generation strategies with this new information. Using this tool, I explored the relationship of fault model difficulty to test set length through fortuitous detection, and I experimented with the application of function-based methods to help reconcile the traditionally opposed goals of making test sets that are both smaller and more effective.
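As a rough, stand-alone illustration of the "probability of random detection" measure mentioned in this abstract (this is not the author's tool): for a toy combinational circuit, a stuck-at fault is detected by exactly those input vectors on which the faulty circuit differs from the fault-free one, and the measure is the fraction of such vectors. The circuit, fault site, and function names below are invented for the example.

```python
from itertools import product

def good_circuit(a, b, c):
    # Toy example circuit: y = (a AND b) OR c
    return (a & b) | c

def faulty_circuit(a, b, c):
    # Same circuit with a hypothetical stuck-at-0 fault on the AND-gate output
    and_out = 0          # fault injected: this node is forced to 0
    return and_out | c

def random_detection_probability(good, faulty, num_inputs):
    """Fraction of all input vectors on which the good and faulty outputs differ."""
    detecting = sum(
        good(*vec) != faulty(*vec)
        for vec in product((0, 1), repeat=num_inputs)
    )
    return detecting / 2 ** num_inputs

if __name__ == "__main__":
    # The fault is detected only when a = b = 1 and c = 0, i.e. 1 of 8 vectors.
    print(random_detection_probability(good_circuit, faulty_circuit, 3))  # 0.125
```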
APA, Harvard, Vancouver, ISO, and other styles
7

Cobb, Bradley Douglas. "Using ordered partial decision diagrams for manufacture test generation." Thesis, Texas A&M University, 2003. http://hdl.handle.net/1969.1/498.

Full text
Abstract:
Because of limited tester time and memory, a primary goal of digital circuit manufacture test generation is to create compact test sets. Test generation programs that use Ordered Binary Decision Diagrams (OBDDs) as their primary functional representation excel at this task. Unfortunately, the use of OBDDs limits the application of these test generation programs to small circuits. This is because the size of the OBDD used to represent a function can be exponential in the number of the function's switching variables. Working with these functions can cause OBDD-based programs to exceed acceptable time and memory limits. This research proposes using Ordered Partial Decision Diagrams (OPDDs) instead as the primary functional representation for test generation systems. By limiting the number of vertices allowed in a single OPDD, complex functions can be partially represented in order to save time and memory. An OPDD-based test generation system is developed and techniques which improve its performance are evaluated on a small benchmark circuit. The new system is then demonstrated on larger and more complex circuits than its OBDD-based counterpart allows.
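To convey the core idea of a partial decision diagram, the following sketch (my own illustration, not the thesis software) builds a reduced decision diagram by Shannon expansion but stops expanding subfunctions once a vertex budget is reached, returning an "unknown" terminal for the unrepresented part of the function. The class, the budget policy, and the example function are assumptions made only for illustration.

```python
UNKNOWN = "?"   # terminal standing in for the unrepresented part of the function

class PartialDiagramBuilder:
    def __init__(self, num_vars, func, max_nodes):
        self.num_vars = num_vars
        self.func = func            # func(bits_tuple) -> 0 or 1
        self.max_nodes = max_nodes  # vertex budget, checked before expanding a subfunction
        self.unique = {}            # (var, lo, hi) -> node id (enforces sharing/reduction)
        self.nodes = {}             # node id -> (var, lo, hi)
        self.next_id = 2            # ids 0 and 1 are reserved for the terminals

    def build(self, var=0, assignment=()):
        if var == self.num_vars:
            return self.func(assignment)          # 0/1 terminal
        if len(self.nodes) >= self.max_nodes:
            return UNKNOWN                        # budget exhausted: truncate this subfunction
        lo = self.build(var + 1, assignment + (0,))
        hi = self.build(var + 1, assignment + (1,))
        if lo == hi:                              # redundant test: skip the node
            return lo
        key = (var, lo, hi)
        if key not in self.unique:                # merge isomorphic subgraphs
            self.unique[key] = self.next_id
            self.nodes[self.next_id] = key
            self.next_id += 1
        return self.unique[key]

def majority(bits):
    return int(sum(bits) >= 2)

full = PartialDiagramBuilder(3, majority, max_nodes=100)
print("root:", full.build(), "internal nodes:", len(full.nodes))    # complete diagram

capped = PartialDiagramBuilder(3, majority, max_nodes=2)
print("root:", capped.build(), "internal nodes:", len(capped.nodes))  # partial: some branches collapse to UNKNOWN
```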
APA, Harvard, Vancouver, ISO, and other styles
8

Grobys, Sebastian. "Random generation of Bayesian Networks and Markov Decision Processes." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81336.

Full text
Abstract:
Markov Decision Processes (MDPs) and Bayesian Networks (BNs) are two very different but equally prominent frameworks for modeling stochastic environments. As empirical testing and the use of randomized search algorithms gain increasing importance in machine learning and artificial intelligence, there exists a growing need for random models; in particular MDPs and BNs. This thesis aims at providing a focused review of the field of random model generation. We begin by motivating in detail the need for rigorous random MDP and BN generation and survey relevant past research efforts. We outline the different problems involved in generating the models and then present our main contribution, a random graph algorithm specifically designed for the uniform generation of graph structures underlying random MDPs and BNs.
APA, Harvard, Vancouver, ISO, and other styles
9

Gundavarapu, Madhavi. "RuleGen – A Rule Generation Application Using Multiset Decision Tables." University of Akron / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=akron1140111266.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Tan, Kian Moh Terence. "Tactical plan generation software for maritime interdiction using conceptual blending theory." Thesis, Monterey, Calif. : Naval Postgraduate School, 2007. http://bosun.nps.edu/uhtbin/hyperion-image.exe/07Dec%5FTan%5FKian.pdf.

Full text
Abstract:
Thesis (M.S. in Modeling, Virtual Environments, and Simulation (MOVES))--Naval Postgraduate School, December 2007. Thesis Advisor(s): Hiles, John. "December 2007." Description based on title screen as viewed on January 18, 2008. Includes bibliographical references (p. 87-91). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
11

Raik, Jaan. "Hierarchical test generation for digital circuits represented by decision diagrams /." Tallinn : TTU Press, 2001. http://www.loc.gov/catdir/toc/fy0611/2006530982.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Molise, Puseletso Bridget. "Consumer decision-making styles for Zambian generation X urban females." Thesis, Stellenbosch : Stellenbosch University, 2015. http://hdl.handle.net/10019.1/97348.

Full text
Abstract:
Thesis (MBA)--Stellenbosch University, 2015. The purpose of the research was to investigate the decision-making styles of urban Zambian Generation X females shopping for apparel products. The research made use of the Consumer Styles Inventory (CSI) scale developed by Sproles and Kendall (1986) to measure the characteristics of various shopping styles. Out of 300 self-administered questionnaires distributed, 180 were used for data analysis. The Cronbach Alpha coefficients confirmed the reliability of the CSI scale on 7 out of 8 decision-making styles that could be associated with the consumers under review. The study then used Analysis of Variance (ANOVA) to establish the variation between the different decision-making styles. The findings revealed that the decision-making styles of quality consciousness and a recreational shopping orientation are highly correlated. The research findings have policy implications, and recommendations for the development of marketing strategies and further research have been made.
APA, Harvard, Vancouver, ISO, and other styles
13

Marino, Carlos Antonio. "Optimization and decision making under uncertainty for distributed generation technologies." Thesis, Mississippi State University, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10242497.

Full text
Abstract:
This dissertation studies two important models in the field of distributed generation technologies to provide resiliency to the electric power distribution system. In the first part of the dissertation, we study the impact of assessing a Combined Cooling Heating Power (CCHP) system on the optimization and management of an on-site energy system under stochastic settings. We propose a scalable stochastic decision model for large-scale microgrid operation, formulated as a two-stage stochastic linear programming model. To solve the model, enhanced algorithmic strategies for Benders decomposition are introduced to find an optimal solution for larger instances efficiently. Some observations are made with different capacities of the power grid, dynamic pricing mechanisms with various levels of uncertainty, and sizes of power generation units. In the second part of the dissertation, we study a mathematical model that designs a Microgrid (MG) integrating conventional fuel-based generating (FBG) units, renewable sources of energy, distributed energy storage (DES) units, and electricity demand response. Curtailment of renewable resource generation during MG operation affects the expected long-term revenues and increases greenhouse gas emissions. Considering the variability of renewable resources, researchers should pay more attention to scalable stochastic models for MGs with multiple nodes. This study bridges the research gap by developing a scalable chance-constrained two-stage stochastic program to ensure that a significant portion of the renewable resource power output at each operating hour will be utilized. Finally, some managerial insights are drawn into the operational performance of the Combined Cooling Heating Power system and the Microgrid.
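For orientation, the generic shape of such a model (not the dissertation's exact formulation) is a two-stage stochastic linear program with a chance constraint of the kind used to enforce renewable utilization; here x is the first-stage decision, y the recourse decision, ξ the random data, and all symbols are placeholders rather than the author's notation.

```latex
\begin{aligned}
\min_{x \ge 0}\;& c^{\top}x + \mathbb{E}_{\xi}\big[\,Q(x,\xi)\,\big]
   \qquad \text{s.t. } Ax \le b,\\
Q(x,\xi) =\;& \min_{y \ge 0}\; q(\xi)^{\top}y
   \qquad\;\;\; \text{s.t. } Wy \le h(\xi) - T(\xi)\,x,\\
& \mathbb{P}\big[\, u(x,\xi) \ge \alpha\, r(\xi) \,\big] \;\ge\; 1-\varepsilon ,
\end{aligned}
```

where u(x, ξ) is the renewable power actually utilized, r(ξ) the available renewable output, α the required utilization fraction, and ε the allowed violation probability.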
APA, Harvard, Vancouver, ISO, and other styles
14

Ogunmakin, Gbolabo. "Automated vision-based generation of event statistics for decision support." Diss., Georgia Institute of Technology, 2016. http://hdl.handle.net/1853/55023.

Full text
Abstract:
Many tasks require surveillance and analysis in order to make decisions regarding the next course of action. The people responsible for these tasks are usually concerned with any event that affects their bottom-line. Traditionally, human operators have had to either actively man a set of video displays to determine if specific events were occurring or manually review hours of collected video data to see if a specific event occurred. Actively monitoring video stream or manually reviewing and analyzing the data collected, however, is a tedious and long process which is prone to errors due to biases and inattention. Automatically processing and analyzing the video provides an alternate way of getting more accurate results because it can reduce the likelihood of missing important events and the human factors that lead to decreased efficiency. The thesis aims to contribute to the area of using computer vision as a decision support tool by integrating detector, tracker, re-identification, activity status estimation, and event processor modules to generate the necessary event statistics needed by a human operator. The contribution of this thesis is a system that uses feedback from each of the modules to provide better target detection, and tracking results for event statistics generation over an extended period of time. To demonstrate the efficacy of the proposed system, it is first used to generate event statistics that measure productivity on multiple construction work sites. The versatility of the proposed system is also demonstrated in an indoor assisted living environment by using it to determine how much of an influence a technology intervention had on promoting interactions amongst older adults in a shared space.
APA, Harvard, Vancouver, ISO, and other styles
15

Nkansah-Gyekye, Yaw. "An intelligent vertical handoff decision algorithm in next generation wireless networks." Thesis, University of the Western Cape, 2010. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_2726_1307443785.

Full text
Abstract:
The objective of the thesis research is to design such vertical handoff decision algorithms in order for mobile field workers and other mobile users equipped with contemporary multimode mobile devices to communicate seamlessly in the NGWN. In order to tackle this research objective, we used fuzzy logic and fuzzy inference systems to design a suitable handoff initiation algorithm that can handle imprecision and uncertainties in data and process multiple vertical handoff initiation parameters (criteria); used the fuzzy multiple attributes decision making method and context awareness to design a suitable access network selection function that can handle a tradeoff among many handoff metrics, including quality of service requirements (such as network conditions and system performance), mobile terminal conditions, power requirements, application types, user preferences, and a price model; used genetic algorithms and simulated annealing to optimise the access network selection function in order to dynamically select the optimal available access network for handoff; and we focused in particular on an interesting use case: vertical handoff decision between mobile WiMAX and UMTS access networks. The implementation of our handoff decision algorithm will provide a network selection mechanism to help mobile users select the best wireless access network among all available wireless access networks, that is, one that provides always-best-connected services to users.
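As a rough illustration of the kind of access-network selection function described above, the sketch below scores candidate networks by simple additive weighting over normalized handoff metrics. This is only a minimal stand-in for the thesis's fuzzy MADM approach with genetic-algorithm and simulated-annealing optimisation; the metric names, values, and weights are invented.

```python
# Minimal sketch: score candidate networks by normalized, weighted handoff metrics.
# Benefit metrics (larger is better) are scaled up; cost/power metrics are inverted.

CANDIDATES = {
    # hypothetical measurements for each reachable access network
    "WiMAX": {"bandwidth_mbps": 20.0, "rss_dbm": -70.0, "cost": 0.6, "power": 0.7},
    "UMTS":  {"bandwidth_mbps":  2.0, "rss_dbm": -80.0, "cost": 0.4, "power": 0.5},
    "WiFi":  {"bandwidth_mbps": 30.0, "rss_dbm": -60.0, "cost": 0.1, "power": 0.3},
}

WEIGHTS = {"bandwidth_mbps": 0.4, "rss_dbm": 0.3, "cost": 0.2, "power": 0.1}
BENEFIT = {"bandwidth_mbps", "rss_dbm"}   # larger is better; the rest are treated as costs

def normalize(values):
    lo, hi = min(values), max(values)
    if hi == lo:
        return {v: 1.0 for v in values}
    return {v: (v - lo) / (hi - lo) for v in values}

def select_network(candidates, weights):
    scores = {name: 0.0 for name in candidates}
    for metric, w in weights.items():
        scaled = normalize([attrs[metric] for attrs in candidates.values()])
        for name, attrs in candidates.items():
            s = scaled[attrs[metric]]
            scores[name] += w * (s if metric in BENEFIT else 1.0 - s)
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    best, scores = select_network(CANDIDATES, WEIGHTS)
    print("selected network:", best, scores)
```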
APA, Harvard, Vancouver, ISO, and other styles
16

Suss, Joel. "Using a prediction and option generation paradigm to understand decision making." Thesis, Michigan Technological University, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=3565347.

Full text
Abstract:
In many complex and dynamic domains, the ability to generate and then select the appropriate course of action is based on the decision maker's "reading" of the situation--in other words, their ability to assess the situation and predict how it will evolve over the next few seconds. Current theories regarding option generation during the situation assessment and response phases of decision making offer contrasting views on the cognitive mechanisms that support superior performance. The Recognition-Primed Decision-making model (RPD; Klein, 1989) and Take-The-First heuristic (TTF; Johnson & Raab, 2003) suggest that superior decisions are made by generating few options, and then selecting the first option as the final one. Long-Term Working Memory theory (LTWM; Ericsson & Kintsch, 1995), on the other hand, posits that skilled decision makers construct rich, detailed situation models, and that as a result, skilled performers should have the ability to generate more of the available task-relevant options. The main goal of this dissertation was to use these theories about option generation as a way to further the understanding of how police officers anticipate a perpetrator's actions, and make decisions about how to respond, during dynamic law enforcement situations. An additional goal was to gather information that can be used, in the future, to design training based on the anticipation skills, decision strategies, and processes of experienced officers. Two studies were conducted to achieve these goals. Study 1 identified video-based law enforcement scenarios that could be used to discriminate between experienced and less-experienced police officers, in terms of their ability to anticipate the outcome. The discriminating scenarios were used as the stimuli in Study 2; 23 experienced and 26 less-experienced police officers observed temporally-occluded versions of the scenarios, and then completed assessment and response option-generation tasks. The results provided mixed support for the nature of option generation in these situations. Consistent with RPD and TTF, participants typically selected the first-generated option as their final one, and did so during both the assessment and response phases of decision making. Consistent with LTWM theory, participants--regardless of experience level--generated more task-relevant assessment options than task-irrelevant options. However, an expected interaction between experience level and option-relevance was not observed. Collectively, the two studies provide a deeper understanding of how police officers make decisions in dynamic situations. The methods developed and employed in the studies can be used to investigate anticipation and decision making in other critical domains (e.g., nursing, military). The results are discussed in relation to how they can inform future studies of option-generation performance, and how they could be applied to develop training for law enforcement officers.
APA, Harvard, Vancouver, ISO, and other styles
17

Tawil, Rami. "A distributed vertical handover decision for the fourth generation wireless networks." Paris 6, 2009. http://www.theses.fr/2009PA066113.

Full text
Abstract:
In this thesis we propose a distributed vertical handover decision (DVHD) scheme for the fourth generation of wireless networks; DVHD aims to provide continuous services to mobile nodes. The fourth-generation wireless network (FGWN) is composed of heterogeneous networks such as the Universal Mobile Telecommunication System (UMTS), WiFi, and WiMax. These technologies provide users with a range of services that can be obtained while moving between networks. Mobility is one of the major concerns in the FGWN: while moving, a mobile user may undergo handover events. One of the key issues in the FGWN is decision making while the mobile is in motion. Vertical handoff decision (VHD) is the task by which a handover decision maker chooses the network to which it will connect. The DVHD solution reduces the handover delay and the energy consumption of the mobile node. The main idea of the scheme is to group networks into virtual clusters; when the mobile node enters the coverage area of a cluster, it communicates with the first available network to retrieve information about its environment and make the right decision. Two further extensions, S-DVHD and T-DVHD, are added to secure the communications between the entities and to establish trust relationships between them.
APA, Harvard, Vancouver, ISO, and other styles
18

Fry, John, T. Galla, and J. M. Binner. "Quantitative decision-making rules for the next generation of smarter evacuations." Springer, 2015. http://hdl.handle.net/10454/17563.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Steyn, Grove. "Governance, finance and investment : decision making and risk in the electric power sector." Thesis, University of Sussex, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.390926.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Ganapathy, Subhashini. "HUMAN-CENTERED TIME-PRESSURED DECISION MAKING IN DYNAMIC COMPLEX SYSTEMS." Wright State University / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=wright1152229142.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Tandberg, Caroline, and Signy Elde Vefring. "The linear decision rule approach applied to the hydrothermal generation planning problem." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for industriell økonomi og teknologiledelse, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-20963.

Full text
Abstract:
We use the linear decision rule approach to develop a model for a stochastic multi-stage generation planning problem in the Nordic region. By developing both the primal and the dual versions of the program, the loss of optimality incurred by the linear decision rule approach can be estimated. Uncertain parameters take values in an uncertainty set defined by upper and lower bounds. Alternative modelling methods for stochastic problems of comparable size and structure either suffer from the curse of dimensionality, or have to rely on unrealistic simplifying assumptions to achieve tractability. We show that the linear decision rule approach gives a good trade-off between tractability and accuracy for a stochastic generation planning problem.
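For readers unfamiliar with the approach, the core of a linear decision rule is the restriction below, written generically rather than in the authors' notation: each stage-t decision is forced to be an affine function of the uncertainty observed up to stage t,

```latex
x_t(\xi) \;=\; x_t^{0} + \sum_{s \le t} X_{t,s}\,\xi_s ,
\qquad
\xi \in \Xi = \{\, \xi : \underline{\xi} \le \xi \le \overline{\xi} \,\},
```

so that the coefficients x_t^0 and X_{t,s} become ordinary finite-dimensional decision variables and the problem stays a (large but tractable) linear program. Applying the same restriction to the dual problem yields a second bound, and the gap between the primal and dual objective values estimates the loss of optimality incurred by the rule.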
APA, Harvard, Vancouver, ISO, and other styles
22

Di, Domenica Nico. "Stochastic programming and scenario generation : decision modelling simulation and information systems perspective." Thesis, Brunel University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.415936.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

LACHTERMACHER, LUANA. "USING DECISION TABLES TO AUTOMATE THE GENERATION AND EXECUTION OF TEST CASES." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=16193@1.

Full text
Abstract:
Testing is a very important stage in software development. However, this area still lacks tools that are more effective and offer a higher and more comprehensive degree of automation than what is currently available. Many test case generation techniques use decision tables, explicitly or implicitly, as an intermediate instrument for generating specific test cases. This dissertation aims to develop a semi-automatic process for generating test suites that starts from decision tables. The generated suites must be suitable for fully automatic test execution tools. To achieve this goal, the following were implemented: (i) a decision table editor, (ii) an automatic test case generator, and (iii) a test script generator for the FEST framework. The benefits that this set of tools can bring to test automation were then evaluated, both in planning (generation of valued test cases from semantic test cases) and in the execution of test cases. The evaluation was based on a series of examples involving specific elements of human interfaces, and also on application to a real software system.
APA, Harvard, Vancouver, ISO, and other styles
24

Nelson, Patricia Clendenning 1958. "THE IMPACT OF SOCIAL CUING ON IDEA GENERATION." Thesis, The University of Arizona, 1987. http://hdl.handle.net/10150/276416.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Kumar, Amit. "Generation of compact test sets and a design for the generation of tests with low switching activity." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/1476.

Full text
Abstract:
Test generation procedures for large VLSI designs are required to achieve close to 100% fault coverage using a small number of tests. They also must accommodate on-chip test compression circuits, which are widely used in modern designs. To obtain test sets with small sizes, one could use extra hardware such as test points or use software techniques. An important aspect impacting test generation is the number of specified positions, which facilitates the encoding of test cubes when using test compression logic. Fortuitous detection, or generation of tests such that they facilitate detection of faults not yet targeted, is also an important goal for test generation procedures. First, we consider the generation of compact test sets for designs using on-chip test compression logic. We introduce two new measures to guide automatic test pattern generation procedures (ATPGs) to balance between these two contradictory requirements of fortuitous detection and number of specifications. One of the new measures is meant to facilitate detection of yet undetected faults, and the value of the measure is periodically updated. The second measure reduces the number of specified positions, which is crucial when using high compression. Additionally, we introduce a way to randomly choose between the two measures. We also propose an ATPG methodology tailored for BIST-ready designs with X-bounding logic and test points. X-bounding and test points used to have a significant impact on test data compression by reducing the number of specified positions. We propose a new ATPG guidance mechanism that balances reduced specifications in BIST-ready designs against facilitating detection of undetected faults. We also found that compact test generation for BIST-ready designs is influenced by the order in which faults are targeted, and we proposed a new fault ordering technique based on fault location in an FFR. Transition faults are difficult to test and often result in longer test lengths; we propose a new fault ordering technique based on test enumeration, and this ordering technique and a new guidance approach were also proposed for transition faults. Test set sizes were reduced significantly for both stuck-at and transition fault models. In addition to reducing data volume, test time, and test pin counts, test compression schemes have been used successfully to limit test power dissipation. Indisputably, toggling of scan cells in the scan chains that are universally used to facilitate testing of industrial designs can consume much more power than a circuit is rated for. Balancing test set sizes against the power consumption of a given design is therefore a challenge. We propose a new Design for Test (DFT) scheme that deploys an on-chip power-aware test data decompressor, the corresponding test cube encoding method, and a compression-constrained ATPG that allows loading scan chains with patterns having low transition counts, while encoding a significant number of specified bits produced by ATPG in a compression-friendly manner. Moreover, the new scheme avoids periods of elevated toggling in scan chains and reduces scan unload switching activity due to the unique test stimuli produced by the new technique, leading to a significantly reduced power envelope for the entire circuit under test.
APA, Harvard, Vancouver, ISO, and other styles
26

Martinov, Sonja. "Multi-Actor Multi-Criteria Decision Analysis of EU Policies for First Generation Biofuels." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-254952.

Full text
Abstract:
In this thesis, a multi-actor multi-criteria decision aid methodology is developed to examine the impacts of EU policies related to first generation biofuels on identified key stakeholders. The thesis focuses on the integration of relevant qualitative and quantitative criteria defined by key stakeholders into one comprehensive evaluation process, to serve as a decision support tool for decision makers. Weight allocation for the defined criteria is assessed using the Analytic Hierarchy Process (AHP) framework, and the multi-criteria decision-aid method PROMETHEE II is used to rank relevant policy alternatives based on the information provided. In the end, the results help the decision makers identify the impacts of different EU policy alternatives on each stakeholder group.
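As a pointer to how criterion weights can be derived in the AHP step mentioned above, here is a small self-contained sketch using the common geometric-mean approximation of the priority vector and Saaty's consistency ratio. The pairwise comparison matrix is invented for illustration and is not data from the thesis.

```python
import numpy as np

# Hypothetical 4-criteria pairwise comparison matrix (Saaty 1-9 scale).
# Entry A[i, j] states how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0, 1.0],
    [1/3, 1.0, 3.0, 1/2],
    [1/5, 1/3, 1.0, 1/4],
    [1.0, 2.0, 4.0, 1.0],
])

n = A.shape[0]

# Geometric-mean approximation of the priority (weight) vector.
w = np.prod(A, axis=1) ** (1.0 / n)
w /= w.sum()

# Saaty's consistency check: a ratio below roughly 0.10 is usually considered acceptable.
lambda_max = float(np.mean((A @ w) / w))
CI = (lambda_max - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # random consistency index
CR = CI / RI

print("weights:", np.round(w, 3), "consistency ratio:", round(CR, 3))
```

The resulting weights would then feed the PROMETHEE II ranking of the policy alternatives.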
APA, Harvard, Vancouver, ISO, and other styles
27

Solomon, Jean-Paul. "Transitions into higher education: educational decision-making of coloured first-generation university students." Thesis, University of Cape Town, 2013. http://hdl.handle.net/11427/3852.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Mazzocco, Thomas. "Toward a novel predictive analysis framework for new-generation clinical decision support systems." Thesis, University of Stirling, 2014. http://hdl.handle.net/1893/21684.

Full text
Abstract:
The idea of developing automated tools able to deal with the complexity of clinical information processing dates back to the late 60s: since then, there has been scope for improving medical care due to the rapid growth of medical knowledge, and the need to explore new ways of delivering this due to the shortage of physicians. Clinical decision support systems (CDSS) are able to aid in the acquisition of patient data and to suggest appropriate decisions on the basis of the data thus acquired. Many improvements are envisaged due to the adoption of such systems, including: reduction of costs by faster diagnosis, reduction of unnecessary examinations, reduction of risk of adverse events and medication errors, increase in the available time for direct patient care, improved medication and examination prescriptions, improved patient satisfaction, and better compliance with gold-standard, up-to-date clinical pathways and guidelines. Logistic regression is a widely used algorithm which frequently appears in the medical literature for building clinical decision support systems; however, published studies frequently have not followed commonly recommended procedures for using logistic regression, and substantial shortcomings in the reporting of logistic regression results have been noted. Published literature has often accepted conclusions from studies which have not addressed the appropriateness and accuracy of the statistical analyses and other methodological issues, leading to design flaws in those models and to possible inconsistencies in the novel clinical knowledge based on such results. The main objective of this interdisciplinary work is to design a sound framework for the development of clinical decision support systems. We propose a framework that supports the proper development of such systems, and in particular the underlying predictive models, identifying best practices for each stage of the model's development. This framework is composed of a number of subsequent stages: 1) dataset preparation ensures that appropriate variables are presented to the model in a consistent format; 2) the model construction stage builds the actual regression (or logistic regression) model, determining its coefficients and selecting statistically significant variables (this phase is generally preceded by a pre-modelling stage during which model functional forms are hypothesized based on a priori knowledge); 3) the model validation stage investigates whether the model could suffer from overfitting, i.e., the model has good accuracy on training data but significantly lower accuracy on unseen data; 4) the evaluation stage gives a measure of the predictive power of the model (making use of the ROC curve, which allows the predictive power of the model to be evaluated without any assumptions on error costs, and possibly R2 from regressions); 5) misclassification analysis can suggest useful insights into determining where the model could be unreliable; 6) the implementation stage. The proposed framework has been applied to three applications in different domains, with a view to improving previous research studies. The first developed model predicts mortality within 28 days for patients suffering from acute alcoholic hepatitis. The aim of this application is to build a new predictive model that can be used in clinical practice to identify patients at greatest risk of mortality within 28 days, as they may benefit from aggressive intervention, and to monitor their progress while in hospital.
A comparison with state-of-the-art tools shows improved predictive power, demonstrating how appropriate variable inclusion may result in overall better accuracy of the model, which increased by 25% following an appropriate variable selection process. The second proposed predictive model is designed to aid the diagnosis of dementia, as clinicians often experience difficulties in the diagnosis of dementia due to the intrinsic complexity of the process and the lack of comprehensive diagnostic tools. The aim of this application is to improve on the performance of a recent application of Bayesian belief networks using an alternative approach based on logistic regression. The approach based on statistical variable selection outperformed the model which used variables selected by domain experts in previous studies. The obtained results outperform the considered benchmarks by 15%. The third model predicts the probability of experiencing a certain symptom among common side-effects in patients receiving chemotherapy. The newly developed model includes a pre-modelling stage (based on previous research studies) and a subsequent regression. The computed accuracy of the results (computed on a daily basis for each cycle of therapy) shows that the newly proposed approach has increased its predictive power by 19% when compared to the previously developed model: this has been obtained by appropriate usage of available a priori knowledge to pre-model the functional forms. As shown by the proposed applications, different aspects of CDSS development are subject to substantial improvements: the application of the proposed framework to different domains leads to more accurate models than the existing state-of-the-art proposals. The developed framework is capable of helping researchers to identify and overcome possible pitfalls in their ongoing research, by providing them with best practices for each step of the development process. An impact on the development of future clinical decision support systems is envisaged: the usage of an appropriate procedure in model development will produce more reliable and accurate systems, and will have a positive impact on the newly produced medical knowledge which may eventually be included in standard clinical practice.
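Several of the framework stages named above (model construction, a check against overfitting, and ROC-based evaluation with no assumption on error costs) can be mirrored in a few lines of standard tooling. The sketch below uses a synthetic dataset rather than any clinical data from the thesis, and the split sizes are arbitrary.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Stages 1-2: prepare a dataset and build the logistic regression model.
X, y = make_classification(n_samples=1000, n_features=8, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Stage 3: crude overfitting check - compare accuracy on training vs. unseen data.
print("train acc:", model.score(X_train, y_train), "test acc:", model.score(X_test, y_test))

# Stage 4: evaluate predictive power with the ROC AUC (no assumption on error costs).
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print("ROC AUC:", round(auc, 3))

# Stage 5: misclassification analysis - count (and later inspect) the cases the model gets wrong.
wrong = model.predict(X_test) != y_test
print("misclassified:", int(wrong.sum()), "of", len(y_test))
```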
APA, Harvard, Vancouver, ISO, and other styles
29

Jessup, Leonard Michael. "The deindividuating effects of anonymity on automated group idea generation." Diss., The University of Arizona, 1989. http://hdl.handle.net/10150/184806.

Full text
Abstract:
Recent developments in information systems technology have made it possible for individuals to work together anonymously using networked personal computers. In this dissertation, a theory of anonymous interaction is proposed. Evidence is provided to suggest that anonymity has deindividuating effects on group process and can, therefore, influence group outcomes in several ways. Two experiments on anonymity in idea generation are presented. In Study 1, where subjects could leave at their discretion, identification kept them longer and caused them to type more, though there were no differences in the quantity or quality of the ideas across experimental conditions. In Study 2, where subjects were forced to stay, identifiability lost importance. Responsibility, however, rose in importance. Subjects with sole responsibility for their task produced more output than did subjects who shared responsibility. Taken together, these results forced us to reject the hypothesis that anonymous subjects would produce more output than would identified subjects. These results show that we cannot speak simply of the effects of anonymity on idea generation in computer-supported groups. With a straightforward interpretation of previous experiments on GDSS anonymity, it was hypothesized that anonymous subjects would produce more than identified subjects. They did not. It is clear that anonymity will lead to deindividuation, enabling participants to engage in uninhibited behavior. However, whether their behavior is productive or unproductive is determined, at least in part, by task, interaction, and technology.
APA, Harvard, Vancouver, ISO, and other styles
30

CALDEIRA, LUIZ RODOLFO NEVES. "SEMI-AUTOMATIC GENERATION OF FUNCTIONAL TEST SCRIPTS BY COMPOSING USE CASES WITH DECISION TABLES." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=16806@1.

Full text
Abstract:
This work aims at developing a process and tools for the semi-automatic generation of functional test scripts for web-based systems, starting from use cases and decision tables, in order to produce high-quality automated tests while reducing the time spent generating them. The test specifications are provided by use cases written in semi-structured restricted Portuguese that obey a well-defined structure. With the aid of a tool, decision tables are manually built from these use case descriptions. Semantic test cases are then automatically generated from the decision tables, and another tool generates executable test scripts from these test cases. The generated test scripts must suit the tool used for automated test execution. In this work, the Selenium tool was used to automate the interaction with the browser. The efficacy of the process and tools was evaluated by applying them to a real system and comparing the results with traditional techniques of automated test generation applied to the same system.
APA, Harvard, Vancouver, ISO, and other styles
31

Kook, Joanna. "Second generation Korean daughters : decision making in regards to caring for their elderly parents." Thesis, University of British Columbia, 2014. http://hdl.handle.net/2429/50060.

Full text
Abstract:
The thesis presents the process of decision making for second generation Korean daughters in regards to care for their parents. Using Grounded Theory principles, I analyzed the interviews of ten second generation Korean women in Vancouver and the Greater Vancouver Area of British Columbia. The core variable identified in this study was: Reformulating generational care-giving relationships. The core variable incorporated three stages of reflection for the women: 1) Contemplating Commitment, 2) Envisioning Possibilities, and 3) Re-contemplating Commitment. The women's reformulation was affected by several factors, such as finances, parental expectations, sibling help, and the type of care that their parents would require. Re-contemplation of commitment was spurred by the women's concerns about the level of physical care that the older adult would need. The findings indicate that while all participants would like to provide care for their parents, they would re-contemplate their commitment and the need for long-term care if the physical needs of the parent were too great or if their parents' conditions compromised the safety of their nuclear family.
APA, Harvard, Vancouver, ISO, and other styles
32

Singh, Pavan Pratap. "An empirical study of the idea generation productivity of decision-making groups: implications for GDSS research, design, and practice /." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0015/NQ56268.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Dowd, Kim. "Teens, Behavior Change & the Environment." Research Showcase @ CMU, 2011. http://repository.cmu.edu/theses/9.

Full text
Abstract:
This thesis document presents the research, synthesis and design work completed for a system for object reuse. This work presents a user-centered process culminating in a service design (ReUseIt) and design guidelines to be employed when working with an audience of teenage girls and designing for behavior change with respect to the environment. This document includes a literature review covering environmental concerns, the relationship of design for behavior change, Generation Z, game design, and the historic value of objects. Research methods documented include journaling kits and designer-led research workshops embedded within middle school and high school art classes. ReUseIt supports improved behavior in relation to the environment through positive feedback around the reuse of objects and attachment of stories to objects. It is a service with touchpoints in shopping malls and a Facebook application. Reflections are offered on the design process undertaken and suggested best practices for creating embedded workshops within middle and high school classes.
APA, Harvard, Vancouver, ISO, and other styles
34

Stockwell, Kathryn S. "Automatic phased mission system reliability model generation." Thesis, Loughborough University, 2013. https://dspace.lboro.ac.uk/2134/13583.

Full text
Abstract:
There are many methods for modelling the reliability of systems based on component failure data. This task becomes more complex as systems increase in size, or undertake missions that comprise multiple discrete modes of operation, or phases. Existing techniques require certain levels of expertise in the model generation and calculation processes, meaning that risk and reliability assessments of systems can often be expensive and time-consuming. This is exacerbated as system complexity increases. This thesis presents a novel method which generates reliability models for phased mission systems, based on Petri nets, from simple input files. The process has been automated with a piece of software designed for engineers with little or no experience in the field of risk and reliability. The software can generate models for both repairable and non-repairable systems, allowing redundant components and maintenance cycles to be included in the model. Further, the software includes a simulator for the generated models. This allows a user with simple input files to perform automatic model generation and simulation with a single piece of software, yielding detailed failure data on components, phases, missions and the overall system. A system can also be simulated across multiple consecutive missions. To assess performance, the software is compared with an analytical approach and found to match within 5% in both the repairable and non-repairable cases. The software documented in this thesis could serve as an aid to engineers designing new systems to validate the reliability of the system. This would not require specialist consultants or additional software, ensuring that the analysis provides results in a timely and cost-effective manner.
APA, Harvard, Vancouver, ISO, and other styles
35

Pakdeejirakul, Warangkhana, and Micheal Agosi. "A comparative study of Swedish generation Y decision-making style between high involvement and low involvement products." Thesis, Mälardalens högskola, Akademin för ekonomi, samhälle och teknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-24113.

Full text
Abstract:
Title: A comparative study of Swedish generation Y decision-making style between high involvement and low involvement products.
Research questions: How does product involvement influence consumer decision-making styles in Generation Y of Swedish nationals for the two selected products? To what level does the model proposed by Sproles and Kendall in 1986 now apply to the modern-day Generation Y in Sweden as they decide on both of the selected products?
Purpose: The purpose of this research undertaking was to discover and investigate the Swedish generation Y decision-making style, examine whether there is a relation between product involvement and consumer decision-making style, and compare the extent to which the modern-day Generation Y in Sweden shows correspondences between age, location and product orientation not predicted by Sproles and Kendall in 1986.
Method: This comparison was conducted based on contemporary primary research versus what was proposed as ideal for the last three generations of consumer interest groups. A quantitative research approach was used to collect the primary data and answer our research questions.
Conclusion: Consumer buying behavior is influenced by the policy and the mental status of the buyers. According to the respondents, consumer selection can be said to depend on current needs and understanding of products. The study reveals that marketing needs to incorporate the realities of prevailing demographics. Consumers tend to have a decision-making process that has an emotional attachment to brand, effectiveness and the perceived outcomes.
APA, Harvard, Vancouver, ISO, and other styles
36

Hamukoma, Nchimunya. "Investing in new electricity generation in South Africa : what short-circuited decision-making, 1998-2014?" Master's thesis, University of Cape Town, 2014. http://hdl.handle.net/11427/13701.

Full text
Abstract:
At the beginning of 2008, South Africa faced its most severe electricity supply crisis to date. The crisis led to a severe contraction of mining industry output and had a knock-on effect on the rest of the economy. This dissertation aimed to explore how such a crisis could occur in South Africa, when in the years leading up to the crisis, the state-owned electricity utility, Eskom, had won awards as one of the lowest-cost, most efficient and technologically innovative electricity companies internationally. In order to explore this, the method of the analytic narrative was used, supported by process tracing that identified the key period of research as the years 1998-2004. The paper explored themes of administrative complexity, competing stakeholders and multiple objectives. It was found that the crisis could be credibly explained as having stemmed from the interaction of complex power relations across the public service in a climate of unresolved political conflict and time-sensitive decision making.
APA, Harvard, Vancouver, ISO, and other styles
37

Abraham, Johnson Anthony Raj. "Multi-objective information generation, extraction and presentation within interactive evolutionary design and decision making systems." Thesis, University of the West of England, Bristol, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.441823.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Valente, Christian. "Design and architecture of a stochastic programming modelling system." Thesis, Brunel University, 2011. http://bura.brunel.ac.uk/handle/2438/6249.

Full text
Abstract:
Decision making under uncertainty is an important yet challenging task; a number of alternative paradigms which address this problem have been proposed. Stochastic Programming (SP) and Robust Optimization (RO) are two such modelling approaches, which we consider; these are natural extensions of Mathematical Programming modelling. The process that goes from the conceptualization of an SP model to its solution and the use of the optimization results is complex in respect to its deterministic counterpart. Many factors contribute to this complexity: (i) the representation of the random behaviour of the model parameters, (ii) the interfacing of the decision model with the model of randomness, (iii) the difficulty in solving (very) large model instances, (iv) the requirements for result analysis and performance evaluation through simulation techniques. An overview of the software tools which support stochastic programming modelling is given, and a conceptual structure and the architecture of such tools are presented. This conceptualization is presented as various interacting modules, namely (i) scenario generators, (ii) model generators, (iii) solvers and (iv) performance evaluation. Reflecting this research, we have redesigned and extended an established modelling system to support modelling under uncertainty. The collective system which integrates these otherwise disparate set of model formulations within a common framework is innovative and makes the resulting system a powerful modelling tool. The introduction of scenario generation in the ex-ante decision model and the integration with simulation and evaluation for the purpose of ex-post analysis by the use of workflows is novel and makes a contribution to knowledge.
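The four interacting modules named in this abstract can be pictured as a thin pipeline of interchangeable components. The sketch below is only a structural illustration under invented names and a stub "solver"; it is not the architecture of the actual system described in the thesis.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Scenario:
    probability: float
    demand: float            # a single uncertain parameter, for illustration

class ScenarioGenerator:
    """(i) Produce a discrete set of scenarios for the uncertain parameters."""
    def generate(self, n: int) -> List[Scenario]:
        return [Scenario(probability=1.0 / n, demand=90.0 + 10.0 * i) for i in range(n)]

class ModelGenerator:
    """(ii) Instantiate a decision model (here just an expected-cost function)."""
    def build(self, scenarios: List[Scenario]) -> Callable[[float], float]:
        def expected_cost(capacity: float) -> float:
            shortfall_penalty = 5.0
            return sum(
                s.probability * (capacity + shortfall_penalty * max(0.0, s.demand - capacity))
                for s in scenarios
            )
        return expected_cost

class Solver:
    """(iii) Optimize the model; a crude grid search stands in for a real solver."""
    def solve(self, objective: Callable[[float], float]) -> float:
        return min((x / 10.0 for x in range(0, 2001)), key=objective)

class Evaluator:
    """(iv) Ex-post evaluation of the chosen decision on out-of-sample scenarios."""
    def evaluate(self, decision: float, scenarios: List[Scenario]) -> float:
        return ModelGenerator().build(scenarios)(decision)

if __name__ == "__main__":
    in_sample = ScenarioGenerator().generate(5)
    decision = Solver().solve(ModelGenerator().build(in_sample))
    out_of_sample = ScenarioGenerator().generate(7)
    print("capacity:", decision,
          "out-of-sample cost:", round(Evaluator().evaluate(decision, out_of_sample), 2))
```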
APA, Harvard, Vancouver, ISO, and other styles
39

Hill, Justin Mitchell. "Shaping the Next Generation Air Transportation System with an Airspace Planning and Collaborative Decision Making Model." Diss., Virginia Tech, 2009. http://hdl.handle.net/10919/39319.

Full text
Abstract:
This dissertation contributes to the ongoing national project concerning the \emph{Next Generation Air Transportation System} (NextGen) that endeavors, in particular, to reshape the management of air traffic in the continental United States. Our work is part of this effort and mainly concerns modeling and algorithmic enhancements to the Airspace Planning and Collaborative Decision-Making Model (APCDM). First, we augment the APCDM to study an \emph{Airspace Flow Program} (AFP) in the context of weather-related disruptions. The proposed model selects among alternative flight plans for the affected flights while simultaneously (a) integrating slot-exchange mechanisms induced by multiple Ground Delay Programs (GDPs) to permit airlines to improve flight efficiencies through a mediated bartering of assigned slots, and (b) considering issues related to sector workloads, airspace conflicts, as well as overall equity concerns among the involved airlines in regard to accepted slot trades and flight plans. More specifically, the APCDM is enhanced to include the following: \begin{enumerate}[a.] \item The revised model accommodates continuing flights, where some flight cannot depart until a prerequisite flight has arrived. Such a situation arises, for example, when the same aircraft will be used for the departing flight. \item We model a slot-exchange mechanism to accommodate flights being involved in multiple trade offers, and to permit slot trades at multiple GDP airports (whence the flight connection constraints become especially relevant). We also model flight cancelations whereby, if a flight assigned to a particular slot is canceled, the corresponding vacated slot would be made available for use in the slot-exchange process. \item Alternative equity concepts are presented, which more accurately reflect the measures used by the airlines. \item A reduced variant of the APCDM, referred to as \textbf{APCDM-Light}, is also developed. This model serves as a fast-running version of APCDM to be used for quick-turn analyses, where the level of modeling detail, as well as data requirements, are reduced to focus only on certain key elements of the problem. \item As an alternative for handling large-scale instances of APCDM more effectively, we present a \emph{sequential variable fixing heuristic} (SFH). The list of flights is first partitioned into suitable subsets. For the first subset, the corresponding decision variables are constrained to be binary-valued (which is the default for these decision variables), while the other variables are allowed to vary continuously between 0 and 1. If the resulting solution to this relaxed model is integral, the algorithm terminates. Otherwise, the binary variables are fixed to their currently prescribed values and another subset of variables is designated to be binary constrained. The process repeats until an integer solution is found or the heuristic encounters infeasibility. \item We experiment with using the APCDM model in a \emph{dynamic, rolling-horizon framework}, where we apply the model on some periodic basis (e.g., hourly), and where each sequential run of the model has certain flight plan selections that are fixed (such as flights that are already airborne), while we consider the selection among alternative flight plans for other imminent flights in a look-ahead horizon (e.g., two hours). \end{enumerate} These enhancements allow us to significantly expand the functionality of the original APCDM model. 
We test the revised model and its variants using realistic data derived from the Enhanced Traffic Management System (ETMS) provided by the Federal Aviation Administration (FAA). One of the new equity methods, which is based on average delay per passenger (or weighted average delay per flight), turns out to be a particularly robust way to model equity considerations in conjunction with sector workloads, conflict resolution, and slot exchanges. With this equity method, we were able to solve large problem instances (1,000 flights) within 30 seconds on average using a 1% optimality tolerance. The model also produced comparable solutions within about 20 seconds on average using the Sequential Fixing Heuristic (SFH). The actual solutions obtained for these largest problem instances were well within 1% of the best known solution. Furthermore, our computations revealed that APCDM-Light can be readily optimized to a 0.01% tolerance within about 5 seconds on average for the 1,000-flight problems. Thus, the augmented APCDM model offers a viable tool that can be used for tactical air traffic management purposes as an airspace flow program (particularly, APCDM-Light), as well as for strategic applications to study the impact of different types of trade restrictions, collaboration policies, equity concepts, and airspace sectorizations. The modeling of slot ownership in the APCDM motivates another problem: that of generating detoured flight plans that must arrive at a particular slot time under severe convective weather conditions. This leads to a particular class of network flow problems that seeks a shortest path, if it exists, between a source node and a destination node in a connected digraph G(N, A), such that we arrive at the destination at a specified time while leaving the source no earlier than a lower bounding time, and where the availability of each network link is time-dependent in the sense that it can be traversed only during specified intervals of time. We refer to this problem as the reverse time-restricted shortest path problem (RTSP). We show that RTSP is NP-hard in general and propose a dynamic programming algorithm for finding an optimal solution in pseudo-polynomial time. Moreover, under a special regularity condition, we prove that the problem is polynomially solvable with a complexity of order O(|N||A|). Computational results using real flight generation test cases as well as random simulated problems are presented to demonstrate the efficiency of the proposed solution procedures. The current airspace configuration consists of sectors that have evolved over time based on historical traffic flow patterns. Kopardekar et al. (2007) note that, given the current airspace configuration, some air traffic controller resources are likely under-utilized, and they also point out that the current configuration limits flexibility. Moreover, under the free-flight concept, which advocates a relaxation of waypoint traversals in favor of wind-optimized trajectories, the current airspace configuration will not likely be compatible with future air traffic flow patterns. Accordingly, one of the goals for the NextGen Air Transportation System includes redesigning the airspace to increase its capacity and flexibility. With this motivation, we present several methods for defining sectors within the National Airspace System (NAS) based on a measure of sector workload.
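For the RTSP defined in the preceding paragraph, a pseudo-polynomial dynamic program can be sketched on an integer time grid. The fragment below is a simplified illustration only: it assumes integer time steps, unlimited waiting at nodes, arc availability given as closed departure-time windows, and an additive arc cost; none of these assumptions is taken from the dissertation's exact formulation, and the tiny network at the end is invented.

```python
# Backward DP sketch for a reverse time-restricted shortest path on an integer time grid.
from math import inf

def rtsp_min_cost(arcs, source, dest, t_earliest, t_arrival):
    """arcs maps (u, v) -> (travel_time, cost, [(lo, hi), ...] departure windows)."""
    out, nodes = {}, set()
    for (u, v), (tau, cost, windows) in arcs.items():
        out.setdefault(u, []).append((v, tau, cost, windows))
        nodes.update((u, v))
    # best[(v, t)] = min cost to leave v at time t and reach dest exactly at t_arrival
    best = {(v, t): inf for v in nodes for t in range(t_earliest, t_arrival + 1)}
    best[(dest, t_arrival)] = 0.0
    for t in range(t_arrival - 1, t_earliest - 1, -1):
        for v in nodes:
            options = [best[(v, t + 1)]]                       # wait one time unit at v
            for w, tau, cost, windows in out.get(v, []):
                if t + tau <= t_arrival and any(lo <= t <= hi for lo, hi in windows):
                    options.append(cost + best[(w, t + tau)])  # traverse an open arc
            best[(v, t)] = min(options)
    return min(best[(source, t)] for t in range(t_earliest, t_arrival + 1))

# Tiny example: two detour routes to D arriving exactly at t = 6; the cheaper one
# requires holding at A until its second leg opens.
arcs = {
    ("S", "A"): (2, 10.0, [(0, 2)]),
    ("A", "D"): (2, 10.0, [(2, 4)]),
    ("S", "B"): (3, 25.0, [(0, 5)]),
    ("B", "D"): (3, 25.0, [(3, 3)]),
}
print(rtsp_min_cost(arcs, "S", "D", t_earliest=0, t_arrival=6))   # -> 20.0
```

The table best[(v, t)] holds the cheapest cost of leaving node v at time t and reaching the destination exactly at the required arrival time, so the recursion runs in time proportional to the number of time steps times the number of arcs, consistent with the pseudo-polynomial bound mentioned above.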
Specifically, given a convex polygon in two dimensions and a set of weighted grid points within the region encompassed by the polygon, we present several mixed-integer-programming-based algorithms to generate a plane (or line) bisecting the region such that the total weight distribution on either side of the plane is relatively balanced. This process generates two new polygons, which are in turn bisected until some target number of regions is reached. The motivation for these algorithms is to dynamically reconfigure airspace sectors to balance predicted air-traffic controller workload. We frame the problem in the context of airspace design, and then present and compare four algorithmic variants for solving these problems. We also discuss how to accommodate monitoring, conflict resolution, and inter-sector coordination workloads to appropriately define grid point weights and to conduct the partitioning process in this context. The proposed methodology is illustrated using a basic example to assess the overall effect of each algorithm and to provide insights into their relative computational efficiency and the quality of solutions produced. A particularly competitive algorithmic variant is then used to configure a region of airspace over the U.S. using realistic flight data. The development of the APCDM is part of an ongoing NextGen research project, which envisages the sequential use of a variety of models pertaining to three tiers. The Tier 1 models are conceived to be more strategic in scope and attempt to identify potential problematic areas, e.g., areas of congestion resulting from a severe convective weather system over a given time-frame, and provide aggregate measures of sector workloads and delays. The affected flow constrained areas (FCAs) highlighted by the results from these Tier 1 models would then be analyzed by more detailed Tier 2 models, such as APCDM, which consider more specific alternative flight plan trajectories through the different sectors along with related sector workload, aircraft conflict, and airline equity issues. Finally, Tier 3 models are being developed to dynamically examine smaller-scaled, localized fast-response readjustments in air traffic flows within the time-frame of about an hour prior to departure (e.g., to take advantage of a break in the convective weather system). The APCDM is flexible, and perhaps unique, in that it can be used effectively in all three tiers. Moreover, as a strategic tool, analysts could use the APCDM to evaluate the suitability of potential airspace sectorization strategies, for example, as well as identify potential capacity shortfalls under any given sector configuration.
Ph. D.
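The weight-balancing bisection step described at the start of the paragraph above can also be illustrated without the MIP machinery: the sketch below sweeps a set of candidate line orientations and, for each, splits the weighted grid points at the weighted median of their projections, keeping the orientation with the smallest imbalance. This is a deliberately simplified stand-in for the mixed-integer formulations in the dissertation, and the random point cloud is synthetic.

```python
# Simplified weight-balancing bisection of a set of weighted grid points (not the MIP).
import numpy as np

def bisect_workload(points, weights, n_angles=36):
    pts = np.asarray(points, dtype=float)
    w = np.asarray(weights, dtype=float)
    half = w.sum() / 2.0
    best = None
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        normal = np.array([np.cos(theta), np.sin(theta)])
        proj = pts @ normal                           # signed distance along the normal
        order = np.argsort(proj)
        cumw = np.cumsum(w[order])
        k = int(np.searchsorted(cumw, half))          # weighted-median split index
        offset = proj[order[min(k, len(proj) - 1)]]
        left = w[proj <= offset].sum()
        imbalance = abs(left - (w.sum() - left))
        if best is None or imbalance < best[0]:
            best = (imbalance, normal, offset)
    return best   # (imbalance, unit normal, offset): split line is normal . x = offset

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(200, 2))              # grid points inside the sector
wts = rng.uniform(1, 5, size=200)                     # predicted workload weights
imbalance, normal, offset = bisect_workload(pts, wts)
print(f"imbalance={imbalance:.2f}, line normal={normal.round(3)}, offset={offset:.2f}")
```

Applying the same routine recursively to each half would mimic the repeated bisection used to reach a target number of sectors.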
APA, Harvard, Vancouver, ISO, and other styles
40

Sampson, Adrienne V. "The Role of Supports, Barriers and Coping Efficacy in First-Generation College Students' Career Decision Outcomes." University of Akron / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=akron1479082516296368.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Plachkov, Alex. "Soft Data-Augmented Risk Assessment and Automated Course of Action Generation for Maritime Situational Awareness." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/35336.

Full text
Abstract:
This thesis presents a framework capable of integrating hard (physics-based) and soft (people-generated) data for the purpose of achieving increased situational assessment (SA) and effective course of action (CoA) generation upon risk identification. The proposed methodology is realized through the extension of an existing Risk Management Framework (RMF). In this work, the RMF’s SA capabilities are augmented via the injection of soft data features into its risk modeling; the performance of these capabilities is evaluated via a newly-proposed risk-centric information fusion effectiveness metric. The framework’s CoA generation capabilities are also extended through the inclusion of people-generated data, capturing important subject matter expertise and providing mission-specific requirements. Furthermore, this work introduces a variety of CoA-related performance measures, used to assess the fitness of each individual potential CoA, as well as to quantify the overall chance of mission success improvement brought about by the inclusion of soft data. This conceptualization is validated via experimental analysis performed on a combination of real-world and synthetically-generated maritime scenarios. It is envisioned that the capabilities put forth herein will take part in a greater system, capable of ingesting and seamlessly integrating vast amounts of heterogeneous data, with the intent of providing accurate and timely situational updates, as well as assisting in operational decision making.
APA, Harvard, Vancouver, ISO, and other styles
42

Prince, Bradley Justin Cegielski Casey. "An exploration of the impact of speech recognition technologies on group efficiency and effectiveness during an electronic idea generation scenario." Auburn, Ala., 2006. http://repo.lib.auburn.edu/2006%20Spring/doctoral/PRINCE_BRADLEY_15.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Valacich, Joseph S. "Group size and proximity effects on computer-mediated idea generation: A laboratory investigation." Diss., The University of Arizona, 1989. http://hdl.handle.net/10150/184829.

Full text
Abstract:
This dissertation investigated the effects of group size, group member proximity, and the interaction of these two variables on the performance of brainstorming groups in a synchronous, computer-mediated environment. A laboratory experiment was employed to manipulate the independent variables of group size (4- and 8-member) and group member proximity. Group member proximity was manipulated by allowing proximate groups to work in a single meeting room, while members of distributed groups worked in separate rooms. The subjects, upper-level undergraduate business students, were asked to identify and discuss all "people, groups and organizations" that would be affected by a proposed policy to require all undergraduate business students to have individual access to a personal computer. The computer-mediated brainstorming system allowed all group members to enter and share information simultaneously, as all communication was electronic. Group performance was assessed by counting the total number of unique solutions generated and by summing expert-rated quality scores for each unique solution. Groups in all conditions contributed approximately the same number of comments and felt equally satisfied. Contrary to an ample body of non-computer-mediated brainstorming research, large groups were more productive than small groups for both idea quantity and quality. Small groups were, however, more productive than large groups on a per-person basis, as increased group size yielded diminishing returns. Remote groups were more productive than proximate groups. Group researchers have found that group interaction produces productivity gains and losses, each of which increases in strength as the group size increases. This research found group productivity losses for computer-mediated brainstorming to be relatively constant, as the technology mitigated productivity inhibitors in conditions where prior non-computer-mediated research has found these losses to increase (i.e., larger groups).
APA, Harvard, Vancouver, ISO, and other styles
44

Pan, Jiuping. "MADM Framework for Strategic Resource Planning of Electric Utilities." Diss., Virginia Tech, 1999. http://hdl.handle.net/10919/30138.

Full text
Abstract:
This study presents a multi-attribute decision making (MADM) framework in support of strategic resource planning of electric utilities. Study efforts have focused on four technical issues identified as essential to the process of strategic resource development: decision data expansion, MADM analysis with imprecise information, MADM analysis under uncertainty, and screening applications. The main contributions from this study are summarized as follows. First, an automatic learning method is introduced for decision data expansion, aiming at reducing the amount of computation involved in the creation of the decision database. Test results have shown that the proposed method is feasible, easy to implement, and more accurate than the techniques available in the existing literature. Second, an interval-based MADM methodology is developed, which extends the traditional utility function model with a measure of composite utility variance, accounting for individual errors from inaccurate attribute measurements and inconsistent priority judgments. This enhanced decision approach would help the decision-maker (DM) gain insight into how the imprecise data may affect the choice toward the best solution and how a range of acceptable alternatives may be identified with certain confidence. Third, an integrated MADM framework is developed for multi-attribute planning under uncertainty, which combines attractive features of utility functions, tradeoff/risk analysis, and the analytical hierarchy process, and thus provides a structured decision analysis platform accommodating both a probabilistic evaluation approach and a risk evaluation approach. Fourth, the application of screening models is investigated in the context of integrated resource planning of electric utilities so as to identify cost-effective demand-side options and robust generation expansion planning schemes.
Ph. D.
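As a rough illustration of what a composite utility with a variance measure can look like, the snippet below uses a generic additive multi-attribute utility with first-order error propagation under an independence assumption. The functional form, weights, and numbers are invented for the example and are not taken from the dissertation's exact formulation.

```python
# Generic additive multi-attribute utility with a first-order variance estimate that
# combines attribute-measurement error and weight-judgment error (independence assumed).
import numpy as np

def composite_utility(u, w, sigma_u, sigma_w):
    """u: single-attribute utilities in [0, 1]; w: priority weights summing to 1."""
    u, w = np.asarray(u, float), np.asarray(w, float)
    sigma_u, sigma_w = np.asarray(sigma_u, float), np.asarray(sigma_w, float)
    mean = float(w @ u)
    var = float(np.sum((w * sigma_u) ** 2 + (u * sigma_w) ** 2))
    return mean, var

# Two hypothetical expansion plans scored on cost, reliability, and emissions.
for name, u in [("plan_A", [0.8, 0.6, 0.4]), ("plan_B", [0.5, 0.7, 0.9])]:
    mean, var = composite_utility(u, w=[0.5, 0.3, 0.2],
                                  sigma_u=[0.05, 0.10, 0.05], sigma_w=[0.04, 0.03, 0.02])
    lo, hi = mean - 2 * var ** 0.5, mean + 2 * var ** 0.5
    print(f"{name}: utility {mean:.3f}, approx. interval [{lo:.3f}, {hi:.3f}]")
```

Reporting each alternative as an interval rather than a point score is one simple way to see whether imprecise data could change the ranking of the candidate plans.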
APA, Harvard, Vancouver, ISO, and other styles
45

Melero, Nogués Maria Teresa. "Combining machine learning and rule-based approaches in Spanish syntactic generation." Doctoral thesis, Universitat Pompeu Fabra, 2006. http://hdl.handle.net/10803/7501.

Full text
Abstract:
This thesis describes a Spanish Generation grammar which combines hand-written rules and Machine Learning techniques. This grammar belongs to a full-scale, commercial-quality Machine Translation system developed at Microsoft Research. The first part presents the grammar and the linguistic strategies it embodies. The need for robustness in real-world situations in the everyday use of the MT system requires an extra effort from the Generator, which is resolved by adding a Pre-Generation layer that is able to fix the input to Generation without contaminating the grammar rules. In the second part we explore the use of Decision Tree (DT) classifiers for automatically learning one of the operations that take place in the Pre-Generation component, namely lexical selection of the Spanish copula (i.e., ser and estar). We show that it is possible to infer from examples the contexts for this non-trivial linguistic phenomenon with high accuracy.
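The copula-selection task described above is a standard classification setup, so a decision tree learner can be sketched directly. The three features (adjectival predicate, locative complement, stage-level reading) and the toy training rows below are invented for illustration and do not reflect the dissertation's actual feature set or corpus; scikit-learn is used only as a convenient off-the-shelf decision tree implementation.

```python
# Toy decision-tree setup for choosing between the Spanish copulas ser and estar.
from sklearn.tree import DecisionTreeClassifier, export_text

# Feature columns: [adjectival_predicate, locative_complement, stage_level_reading]
X = [
    [1, 0, 0], [1, 0, 1], [1, 0, 1], [0, 1, 1],
    [0, 1, 1], [1, 1, 1], [0, 0, 0], [1, 0, 0],
]
y = ["ser", "estar", "estar", "estar", "estar", "estar", "ser", "ser"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["adjectival", "locative", "stage_level"]))
print(clf.predict([[1, 0, 1]]))   # an adjectival, stage-level context -> "estar"
```

In a realistic setting the feature vectors would be extracted from parsed corpus sentences and the learned tree inspected for linguistically sensible splits.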
APA, Harvard, Vancouver, ISO, and other styles
46

Dacorso, Antonio Luis Rocha. "Análise experimental da geração de alternativas em decisões estratégicas não estruturadas." Universidade de São Paulo, 2005. http://www.teses.usp.br/teses/disponiveis/12/12139/tde-02062005-142151/.

Full text
Abstract:
Strategic decisions are typically unstructured, in the sense that no similar process exists in the organization's memory. Structuring such a decision would require the scenario to be laid out clearly, with its questions, hypotheses, and objectives. The quality of a strategic decision depends mainly on the process and on the competence of those who participate in it. Generating creative and viable alternatives is a fundamental step of the decision-making process and is largely responsible for the quality sought. However, research on alternative generation has consistently indicated that people are not efficient at this activity. The search for explanations of this fact revealed gaps in the literature that inspired the present study: what is the influence of heuristics, and of the isolation between the convergent and divergent phases, on the generation of alternatives? To explore these gaps and assess how well Brazilian managers generate alternatives, an experiment was conducted with 174 MBA students from four schools in the Greater São Paulo area. The results led to some interesting conclusions, including confirmation of the poor performance in generating alternatives. The challenge of filling the observed gaps remains, since the research hypotheses, which related heuristics and isolation to performance, were not supported. The study brings together experimental research rooted in the cognitive psychology of decision making and the perspective of organizational decision science, a line of research that has remained practically unexplored in management studies conducted in Brazil.
APA, Harvard, Vancouver, ISO, and other styles
47

Irwin, Mary A. "Towards Understanding the Negotiation and Decision-Making Process of Withdrawal from College: A Qualitative Approach." Diss., The University of Arizona, 2010. http://hdl.handle.net/10150/196144.

Full text
Abstract:
This qualitative research project focused on interviews with 27 students of low socio-economic status at a research university in the southwestern United States. The students had already withdrawn from the university or were in the process of withdrawing. The study seeks to provide an increased understanding of how students negotiate the decision-making process of withdrawing from the first university they attended after high school. The theoretical lenses of student departure theories (Astin, 1993; Bean, 1983; Tierney, 1992; Tinto, 1993) and decision-making theories (Becker, 1976; Frank, 1987; Kahneman, 2003; March, 1994; Scott, 2000) were combined. The Decision-Making Process Model of Student Departure is offered as a new theoretical framework that combines decision-making theories and student retention theories. This conceptualization is unlike other student departure models because it includes the proposition that forces push at students from within the institution while other forces pull them from outside the institution. In addition, it differs from other student departure models because it addresses how students think about their process of withdrawing; it is not meant to describe their behaviors. Financial, academic, and psychological stresses (from both within and outside the institution) influenced how the students negotiated the decision-making process of leaving the institution. The students did not seek out institutional agents (advisors or faculty members) for advice when they were struggling academically. They developed their own strategies or went to their family members for advice, many of whom had never been to college.
APA, Harvard, Vancouver, ISO, and other styles
48

Sui, Zhenhuan. "Hierarchical Text Topic Modeling with Applications in Social Media-Enabled Cyber Maintenance Decision Analysis and Quality Hypothesis Generation." The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1499446404436637.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Krishnankutty, Nair Rajamma Rajasree. "An empirical investigation of the salient dimensions of Baby Boomer and Generation Y consumers' health care decision choices." Thesis, University of North Texas, 2006. https://digital.library.unt.edu/ark:/67531/metadc5362/.

Full text
Abstract:
The purpose of this research is to empirically investigate consumers' health care decision choices in a dynamic market setting. The unprecedented demands on the U.S. health care system coupled with the mounting controversies surrounding health care reform suggest that consumers' health care decisions warrant empirical research attention. Toward this end, this dissertation empirically explored (1) the characteristics of consumers who possess a willingness to use non-conventional treatments over conventional treatments, (2) the characteristics of consumers who elect self-medication in lieu of health care practitioner-directed medication, and (3) the salient dimensions of consumers' channel choice for the procurement of health care products. Each of these decision choice factors were tested across two U.S. generational segments to assess whether differences existed across Baby Boomers' and Gen Yers' health care decision choices. The conceptual framework for empirical assessment is Bandura's (1986) social cognitive theory. From Bandura's social cognitive theory, a general model of healthcare decision choice is proposed to assess consumers' states of mind, states of being and states of action (decision choice). Results indicate that social cognitive factors (e.g., self-efficacy, objectivism) play an important role in each of the decision domains explored in this dissertation. Moreover, health value was found to be an important moderator between the social cognitive factors and health care decision choices. The predictors of the health care decision choices were found to vary across the Baby Boomers and Generation Yers on several dimensions, confirming the notion that generational differences may be a salient dimension of consumers' health care decision choice. The research offers several implications for practitioners, academicians and policy makers. Both descriptive and normative implications are gleaned from the research findings. Most notably, the results indicate that consumers' social cognitive factors and health value may be mechanisms for managing health care decisions.
APA, Harvard, Vancouver, ISO, and other styles
50

Doubleday, Kevin. "Generation of Individualized Treatment Decision Tree Algorithm with Application to Randomized Control Trials and Electronic Medical Record Data." Thesis, The University of Arizona, 2016. http://hdl.handle.net/10150/613559.

Full text
Abstract:
With new treatments and novel technology available, personalized medicine has become a key topic in the new era of healthcare. Traditional statistical methods for personalized medicine and subgroup identification primarily focus on single-treatment or two-arm randomized control trials (RCTs). With restricted inclusion and exclusion criteria, data from RCTs may not reflect real-world treatment effectiveness. However, electronic medical records (EMR) offer an alternative venue. In this paper, we propose a general framework to identify an individualized treatment rule (ITR), which connects subgroup identification methods and ITRs. It is applicable to both RCT and EMR data. Given the large scale of EMR datasets, we develop a recursive partitioning algorithm to solve the problem (ITR-Tree). A variable importance measure is also developed for personalized medicine using random forests. We demonstrate our method through simulations, and apply ITR-Tree to datasets from diabetes studies using both RCT and EMR data. A software package is available at https://github.com/jinjinzhou/ITR.Tree.
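To make the recursive-partitioning idea concrete, the sketch below greedily chooses, at each node, the covariate split that maximizes the difference in estimated treatment effect (treated-minus-control mean outcome) between the two children, and recommends treatment in any leaf with a positive estimated effect. This is a simplified stand-in, not the ITR-Tree algorithm or the linked software; the splitting rule, stopping rules, and simulated data are all assumptions made for the illustration.

```python
# Simplified recursive partitioning toward an individualized treatment rule.
import numpy as np

def treatment_effect(y, a):
    # Difference in mean outcome, treated minus control (0.0 if a group is empty).
    return y[a == 1].mean() - y[a == 0].mean() if 0 < a.mean() < 1 else 0.0

def itr_tree(X, y, a, depth=0, max_depth=2, min_node=20):
    effect = treatment_effect(y, a)
    if depth == max_depth or len(y) < 2 * min_node:
        return {"leaf": True, "treat": effect > 0, "effect": effect, "n": len(y)}
    best = None
    for j in range(X.shape[1]):
        for cut in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            left = X[:, j] <= cut
            if left.sum() < min_node or (~left).sum() < min_node:
                continue
            gap = abs(treatment_effect(y[left], a[left]) -
                      treatment_effect(y[~left], a[~left]))
            if best is None or gap > best[0]:
                best = (gap, j, cut, left)
    if best is None:
        return {"leaf": True, "treat": effect > 0, "effect": effect, "n": len(y)}
    gap, j, cut, left = best
    return {"leaf": False, "var": j, "cut": float(cut),
            "left": itr_tree(X[left], y[left], a[left], depth + 1, max_depth, min_node),
            "right": itr_tree(X[~left], y[~left], a[~left], depth + 1, max_depth, min_node)}

# Simulated trial: treatment helps only when the first covariate is high.
rng = np.random.default_rng(1)
X = rng.normal(size=(600, 3))
a = rng.integers(0, 2, 600)
y = 0.5 * X[:, 0] + a * np.where(X[:, 0] > 0, 1.0, -0.5) + rng.normal(scale=0.5, size=600)
print(itr_tree(X, y, a))
```

On this simulated example the first split should land near zero on the first covariate, with treatment recommended in the high-covariate leaves, which is the kind of subgroup-specific rule the abstract describes.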
APA, Harvard, Vancouver, ISO, and other styles