Dissertations / Theses on the topic 'Systematic generation'

Consult the top 50 dissertations / theses for your research on the topic 'Systematic generation.'

1

Seidler, Anna Lene Dora. "Next generation systematic review methodology." Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/24554.

Abstract:
Systematic reviews and meta-analyses are widely used to inform guidelines, policy, and practice. Yet, there are several limitations associated with traditional systematic reviews. Potential sources of bias, such as publication bias and selective outcome reporting, can produce misleading results, and when individual studies collect different outcomes or use different measures to assess the same outcomes, this can make them difficult and sometimes impossible to synthesise. Traditional aggregate data meta-analyses give estimates about average effects, but provide limited reliable information on whether intervention effects vary across different populations, or whether differences between intervention characteristics may lead to differential effects. This is particularly problematic in an era that is steering away from a one-size-fits-all approach and toward precision medicine. In addition, traditional meta-analyses only include head-to-head comparisons of two interventions at a time when in reality, there are often more than two options that practitioners need to choose between. To explore these limitations and propose solutions, this thesis presents a series of nine manuscripts.
2

Long, Suella. "Systematic generation of engineering line diagrams." Thesis, Loughborough University, 1999. https://dspace.lboro.ac.uk/2134/14135.

Abstract:
This thesis describes research into a methodology for the systematic development of engineering line diagrams (ELDs) from process flowsheets with a particular emphasis on safety, health and environmental (SHE) and operability issues. The current approach to the consideration of safety in design is largely reactive, relying on design reviews such as the HAZOP. If design safety is to be improved, then a comprehensive system, incorporating both proactive and reactive methods, must be adopted. The facility to develop proactive safety systems relies upon the presence of a systematic design procedure. Since design at this stage seems generally to be rather haphazard, there is a need to introduce structure to the design task before any progress can be made in the improvement of safety. Introducing structure to the design task not only provides a framework for the incorporation of SHE and operability issues, but should also improve the effectiveness of the overall design and the efficiency with which it is completed. More specifically, fewer good design opportunities should be lost due to poor information handling and the amount of rework arising from misunderstandings between different disciplines should be minimised. In addition, learning how to perform the design task should become easier for new recruits. Relevant work in the fields of process design, process safety, engineering drawings and ELD development is discussed. An analysis of perceptions of the design task within industry is presented. The generation of a systematic method by iterative case study work with designers is described. The structural features of this method are explained. Some examples of the application of the method are given and the results of a trial within industry are discussed. This research has shown that there is no existing work which captures the logic for the order in which decisions for developing a first ELD are made. Neither is there a complete analysis of the activities and issues contributing to ELD development. A novel method for the systematic generation of ELDs has been produced and used as a framework for the incorporation of SHE and operability issues into design. Trials of the method within industry have shown it to be successful.
3

Ylitalo, P. (Pekka). "Value creation metrics in systematic idea generation." Doctoral thesis, Oulun yliopisto, 2017. http://urn.fi/urn:isbn:9789526215334.

Abstract:
The ability to generate creative new ideas to develop innovative products and optimize processes has become crucially important for organizations’ survival in the competitive and turbulent market environment. The objective of this dissertation was to examine the value creation of systematic idea generation in a quantitative fashion by defining a new evaluation methodology. This called for designing a set of quantitative value creation metrics for creative idea generation. This study aims to enhance current knowledge of idea generation measurement, which has so far mainly focused on human judgment-based methods. The goals of this exploratory study were formulated into two research questions. The first addressed the subject from a conceptual viewpoint, while the second involved a single-case study in the automotive industry. Documentation-based approaches were used as the primary means of data collection to maximize the quantitative accuracy of the measurement. A single-case study was selected as the research method because of the deep-level data access required to thoroughly assess an idea generation process. Patent information-based measures were adopted in the case organization to gain insights into the effectiveness of its idea generation process. The research data was collected from the case organization’s management information systems. Data analysis allowed for a fact-based evaluation of the idea generation process, which would not have been possible with traditional qualitative measures. The results demonstrate that idea generation can be measured rigorously like most other processes when proper metrics are in place. Nevertheless, several limitations were identified that must be considered when discussing further applications of the proposed measures in other organizations. This thesis proposes a new analysis method for the creative phase of the product creation process. Future studies could build on this model, for example, by experimenting with the indicators proposed by this study in other contexts or designing a set of “rival” value creation metrics for idea generation.
4

Mahmood, Shahid. "A Systematic Review of Automated Test Data Generation Techniques." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4349.

Abstract:
Automated Test Data Generation (ATDG) is an activity that, in the course of software testing, automatically generates test data for the software under test (SUT). It usually makes testing more efficient and cost-effective. Test Data Generation (TDG) is crucial for software testing because test data is one of the key factors for determining the quality of any software test during its execution. The multi-phased activity of ATDG involves various techniques for each of its phases. This research field is not new by any means, albeit lately new techniques have been devised and a gradual increase in the level of maturity has brought some diversified trends into it. Several ATDG techniques are now available, but emerging trends in computing have raised the necessity to summarize and assess the current status of this area, particularly for practitioners, future researchers and students. Further, analysis of the ATDG techniques becomes even more important when Miller et al. [4] highlight the hardship these techniques face in gaining general acceptance. Under this scenario only a systematic review can address the issues, because systematic reviews provide evaluation and interpretation of all available research relevant to a particular research question, topic area, or phenomenon of interest. This thesis, by using a trustworthy, rigorous, and auditable methodology, provides a systematic review that is aimed at presenting a fair evaluation of research concerning ATDG techniques of the period 1997-2006. Moreover it also aims at identifying probable gaps in research about ATDG techniques of the defined period so as to suggest the scope for further research. This systematic review is presented on the pattern of [5 and 8] and follows the techniques suggested by [1]. The articles published in journals and conference proceedings during the defined period are of concern in this review. The motive behind this selection is that the techniques discussed in the literature of this period might reflect their suitability for the prevailing software environment of today and are believed to fulfill the needs of the foreseeable future. Furthermore only automated and/or semi-automated ATDG techniques have been chosen for consideration, while manual techniques are left out of scope. As a result of the preliminary study the review identifies ATDG techniques and relevant articles of the defined period, whereas the detailed study evaluates and interprets all available research relevant to ATDG techniques. For interpretation and elaboration of the discovered ATDG techniques a novel approach called ‘Natural Clustering’ is introduced. To accomplish the task of systematic review a comprehensive research method has been developed, and its practical application has yielded important results. These results are presented in statistical/numeric, diagrammatic, and descriptive forms. Additionally the thesis introduces various criteria for classification of the discovered ATDG techniques and presents a comprehensive analysis of the results of these techniques. Some interesting facts are also highlighted in the course of the discussion. Finally, the discussion culminates with inferences and recommendations emanating from this analysis. As the research work produced in the thesis is based on a rich amount of trustworthy information, it can also serve as an up-to-date guide to ATDG techniques.
5

Kabalan, Bilal. "Systematic methodology for generation and design of hybrid vehicle powertrains." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSE1048.

Abstract:
To meet the vehicle fleet-wide average CO2 targets, the stringent pollutant emissions standards, and clients’ new demands, automakers have realized the inevitable need to offer more hybrid and electric powertrains. Designing a hybrid powertrain remains, however, a complex task. It is an intricate system involving numerous variables that are spread over different levels: architecture, component technologies, sizing, and control. The industry lacks frameworks or tools that help in exploring the entire design space and in finding the global optimal solution on all these levels. This thesis proposes a systematic methodology that answers a part of this need. Starting from a set of chosen components, the methodology automatically generates all the possible graphs of architectures using constraint-programming techniques. A tailored representation is developed to picture these graphs. The gearbox elements (clutches, synchronizer units) are represented with a level of detail appropriate for generating the new-trend dedicated hybrid gearboxes, without making the problem too complex. The graphs are then transformed into other types of representation: 0ABC Table (describing the mechanical connections between the components), Modes Table (describing the available modes in the architectures) and Modes Table + (describing, for each available mode, the global efficiency and ratio of the power flow between all the components). Based on these representations, the architectures are filtered and the most promising ones are selected. They are automatically assessed and optimized using a general hybrid model specifically developed to calculate the performance and fuel consumption of all the generated architectures. This model is inserted inside a bi-level optimization process: a genetic algorithm (GA) is used on the sizing and components level, while dynamic programming (DP) is used on the control level. A case study is performed and the capability of the methodology is proven. It succeeded in automatically generating all the graphs of possible architectures, and in filtering out architectures that were then proven not efficient. It also selected the most promising architectures for optimization. The results show that the proposed methodology succeeded in finding an architecture better than the ones proposed without the methodology (consumption about 5% lower).
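
The bi-level optimization described above pairs an outer evolutionary search over component sizing with an inner optimal-control evaluation of each candidate. The following Python sketch shows that structure only; the demand profile, fuel map, sizing bounds and state discretization are invented placeholders, not the thesis's actual powertrain model.

```python
# Minimal sketch of a bi-level powertrain optimization: an outer
# steady-state GA sizes the components, an inner dynamic-programming
# pass finds the cheapest control (power split) for each sizing.
# All numbers are toy placeholders.
import random

DEMAND = [5.0, 12.0, 8.0, 15.0, 6.0]  # power demand per time step (kW)

def dp_fuel(engine_kw, motor_kw, battery_kwh):
    """Inner level: DP over a coarse battery state-of-charge grid."""
    socs = [round(0.1 * i, 1) for i in range(11)]   # 0.0 .. 1.0
    cost = {s: 0.0 for s in socs}                   # terminal cost-to-go
    for p in reversed(DEMAND):
        new_cost = {}
        for s in socs:
            best = float("inf")
            for split in (0.0, 0.5, 1.0):           # engine share of demand
                pe, pm = split * p, (1 - split) * p
                if pe > engine_kw or pm > motor_kw:
                    continue                        # mode infeasible
                s2 = round(s - pm / (10 * battery_kwh), 1)
                if s2 < 0 or s2 > 1:
                    continue                        # battery constraint
                fuel = 0.08 * pe                    # toy fuel map
                best = min(best, fuel + cost.get(s2, float("inf")))
            new_cost[s] = best
        cost = new_cost
    return cost[0.8]                                # start at 80% SOC

def fitness(ind):
    engine, motor, batt = ind
    return dp_fuel(engine, motor, batt) + 0.01 * (engine + motor + 5 * batt)

# Outer level: a tiny steady-state GA (crossover + mutation, replace worst).
pop = [[random.uniform(5, 30), random.uniform(5, 30), random.uniform(1, 5)]
       for _ in range(20)]
for _ in range(30):
    a, b = random.sample(pop, 2)
    child = [max((x + y) / 2 + random.gauss(0, 1), 0.5) for x, y in zip(a, b)]
    worst = max(range(len(pop)), key=lambda i: fitness(pop[i]))
    if fitness(child) < fitness(pop[worst]):
        pop[worst] = child

print("best sizing (engine kW, motor kW, battery kWh):", min(pop, key=fitness))
```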
6

Lee, William. "Next generation technologies for systematic analysis of DNA structure and repair." May be available electronically, 2008. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

7

Stürmer, Ingo. "Systematic testing of code generation tools: a test suite oriented approach for safeguarding model based code generation." Berlin: Pro Business, 2006. http://deposit.ddb.de/cgi-bin/dokserv?id=2788859&prov=M&dok_var=1&dok_ext=htm.

8

Dong, Yuanwei. "A systematic study of silicon germanium interdiffusion for next generation semiconductor devices." Thesis, University of British Columbia, 2014. http://hdl.handle.net/2429/47120.

Abstract:
SiGe heterostructures with higher Ge fractions and larger Ge modulations, and thus higher compressive stress, are key structures for next-generation electronic and optoelectronic devices. Si-Ge interdiffusion during high temperature growth or fabrication steps changes the distribution of Ge fraction and stress, and increases atomic intermixing, which degrades device performance. It is of technological importance to study Si-Ge interdiffusion behaviours and build accurate Si-Ge interdiffusivity models. In this work, three aspects of Si-Ge interdiffusion behaviours were investigated both by experiments and by theoretical analysis. 1) Based on the correlation between self-diffusivity, intrinsic diffusivity and interdiffusivity in binary alloy systems, a unified interdiffusivity model was built over the full Ge fraction range. It provides a zero-strain, no-dopant-effect, and low-dislocation-density reference for studies of more impacting factors. This model was then validated with literature data and our experimental data using different annealing techniques. Next, with the well-established reference, the impact of biaxial compressive strain on Si-Ge interdiffusion was further investigated under two specific strain scenarios: with full coherent strain and with partial strain. 2) Complete theoretical analysis was presented to address the compressive strain’s role in Si-Ge interdiffusion. The role of compressive strain was modeled in two aspects: a) strain energy contributes to the interdiffusion driving force; b) the strain derivative q' of interdiffusivity, reflecting the strain-induced changes of both prefactor and activation energy. For the temperature range (720 °C to 880 °C) and Ge fraction range (0.36 to 0.75), a temperature dependence of the strain derivative q', q'=-0.081T+110 eV/unit strain, was reported in Si-Ge interdiffusion. 3) For the case with partial strain, the apparent interdiffusivity model developed for the case with full coherent strain in 2) was modified to reflect strain change, and it was then validated with experimental data. In summary, a set of interdiffusivity models were established based on experimental data and theoretical analysis for three strain scenarios. These models can be employed to predict the thermal stability of SiGe heterostructures, and optimize the design of SiGe structures and of thermal budgets for next-generation SiGe based devices.
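
The abstract quotes an empirical fit for the strain derivative, q' = -0.081·T + 110 eV per unit strain, valid over 720 °C to 880 °C. Purely as a reading aid, a one-line helper evaluating that fit might look like the sketch below; taking T in degrees Celsius is an assumption suggested by the quoted range, and the thesis itself defines the exact convention.

```python
# Reading aid for the quoted fit q' = -0.081*T + 110 (eV per unit strain).
# Assumption: T in degrees Celsius, matching the stated 720-880 C range.
def strain_derivative(T):
    return -0.081 * T + 110.0

for T in (720, 800, 880):
    print(f"T = {T} C: q' = {strain_derivative(T):.1f} eV/unit strain")
```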
9

Malla, Prakash, and Bhupendra Gurung. "Adaptation of Software Testability Concept for Test Suite Generation: A systematic review." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4322.

Abstract:
Context: Software testability, which is the degree to which a software artifact facilitates the process of testing, is not only an indication of test process effectiveness but also gives a new perspective on code development. Since more than fifty percent of total software development cost is related to testing activities, software testability has always been an area of improvement, aiming to make the development process more effective with respect to test case writing and fault detection. Objectives: The research in this thesis proposes a conceptual framework that considers testability issues for simpler test suite generation, giving those concerned better testing effectiveness. We investigate testability factors and testability metrics mainly with the help of a systematic literature review, and the proposed framework’s feasibility is evaluated with a case study. Methods: Initially, we conducted a literature review to gain broad knowledge of the domain as well as to identify key documents. The study then proceeded with the systematic literature review process, guided by a review protocol, to collect testability factors and measurements. The framework was validated with a case study. Research documents were included from highly trusted e-databases including Compendex, Inspec, IEEE Xplore, ACM Digital Library, Springer Link and Scopus. Altogether 36 primary documents were included in the study and results were extracted. Results: From the systematic literature review, software testability factors and associated measurements were identified, and the framework constructed for simpler test generation was evaluated as guidelines with the case study. To make test suite generation simpler, we proposed a framework based on FTA concepts and the breakdown of high-level testability factors into simpler, measurable forms. Conclusions: Many different software testability factors are presented in the literature from different perspectives. We collected the important testability factors and associated measurement methods, and concluded on the effect of testability on simpler test suite generation with the help of the framework evaluated by the case study.
10

Heslop, Janelle Nicole. "A systematic approach for assessing next generation technologies and solutions in biomanufacturing." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122586.

Abstract:
Amgen is one of the world's leading independent biotechnology companies and competes globally to advance important medicines in a highly competitive marketplace. Biologics manufacturers such as Amgen have traditionally invested in costly, large-scale stainless steel infrastructure to support the production of biologic medication. However, more recently, changes in the economics, such as the need to deploy less capital-intensive biomanufacturing plants faster, and advances in the technology, such as process intensification (i.e., getting more protein from each cell), have created both incentives and pressures for smaller-scale, single-use, and modular production technologies. These incentives include greater flexibility, shorter timelines for construction and rapid deployment of new facilities, and reduced costs as well as physical and environmental footprint. To prepare for this changing business environment, Amgen must develop a manufacturing strategy that can enable the production of high-quality products with significant reductions in timelines, cost, and impact. To do so, Amgen is investigating a handful of these new production technologies, known as next generation manufacturing technologies, and attempting to understand their applicability in its future manufacturing model. There is a need for a transparent and standard methodology for evaluating and deploying new technologies in the manufacturing network. This study aims to address this issue and enable speed, rigor, and efficiency of decision-making through the use of a structured framework for selection and deployment of next generation technologies. Through literature review and engagement with Amgen experts, this study defines a next generation manufacturing technology evaluation framework. This framework involves a hybrid, multi-attribute set of metrics that are broadly categorized into economic, environmental, and operational assessment areas. The framework is then applied to assess the economic, operational, and environmental implications of deploying single-use technologies in drug substance manufacturing as a test of concept. An assessment along the three areas helps to identify that single-use technologies, namely single-use bags due to their cost and environmental footprint, may not always be the optimum substitute for all existing process technology. Instead, a hybrid approach, mixing new single-use technology with existing stainless steel infrastructure, may help to reduce the variable cost and carbon footprint of the process. When the framework and this proposed hybrid approach were applied at an Amgen site, potential savings of up to $1M per year were identified, as well as the elimination of up to thousands of liters in clean water losses and up to a 400x reduction in the carbon footprint of the process. Lastly, the assessment framework is applied as a management tool in the assessment of next generation drug product filling technology to demonstrate how the framework can be used to enable rapid decision-making related to future manufacturing scenarios.
11

Bari, Mohammed A. "A distributed conceptual model for stream salinity generation processes: a systematic data-based approach." University of Western Australia. School of Earth and Geographical Sciences, 2006. http://theses.library.uwa.edu.au/adt-WU2006.0058.

Abstract:
[Truncated abstract] During the last fifty years mathematical models of catchment hydrology have been widely developed and used for hydrologic forecasting, design and water resources management. Most of these models need large numbers of parameters to represent the flow generation process. The model parameters are estimated through calibration techniques and often lead to ‘unrealistic’ values due to structural error in the model formulations. This thesis presents a new strategy for developing catchment hydrology models for representing streamflow and salinity generation processes. The strategy seeks to ‘learn from data’ in order to specify a conceptual framework that is appropriate for the particular space and time scale under consideration. Initially, the conceptual framework is developed by considering large space and time scales. The space and time scales are then progressively reduced and conceptual model complexity systematically increased until ultimately, an adequate simulation of daily streamflow and salinity is achieved. This strategy leads to identification of a few key physically meaningful parameters, most of which can be estimated a priori and with minimal or no calibration. Initially, the annual streamflow data from ten experimental catchments (control and cleared for agriculture) were analysed. The streamflow increased in two phases: (i) immediately after clearing due to reduced evapotranspiration, and (ii) through an increase in stream zone saturated area. The annual evapotranspiration losses from native vegetation and pasture, the ‘excess’ water (resulting from reduced transpiration after land use change), runoff and deep storage were estimated by a simple water balance model. The model parameters are obtained a priori without calibration. The annual model was then elaborated by analysing the monthly rainfall-runoff, groundwater and soil moisture data from four experimental catchments. Ernies (control, fully forested) and Lemon (53% cleared) catchments are located in a zone with a mean annual rainfall of 725 mm. Salmon (control, fully forested) and Wights (100% cleared) are located in a zone with a mean annual rainfall of 1125 mm. Groundwater levels rose and the stream zone saturated area increased significantly after clearing. From analysis of this data it was evident that at a monthly time step the conceptual model framework needed to include a systematic gain/loss to storage component in order to adequately describe the observed lags between peak monthly rainfall and runoff.
12

Silzle, Andreas. "Generation of quality taxonomies for auditory virtual environments by means of systematic expert survey." Aachen: Shaker, 2007. http://d-nb.info/987833790/04.

13

Bari, Mohammed A. "A distributed conceptual model for stream salinity generation processes: a systematic data-based approach." Connect to this title, 2005. http://theses.library.uwa.edu.au/adt-WU2006.0058.

14

Fan, Hiu-yan, and 樊曉欣. "Economic evaluation of the second generation pneumococcal conjugate vaccine in children : a systematic review." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/206903.

Abstract:
Background: Pneumococcal disease, caused by Streptococcus pneumoniae (S. pneumoniae), leads to a great burden of morbidity and mortality globally, especially in developing countries. The World Health Organization (WHO) estimated that 476,000 of the 8.8 million global annual deaths in children under 5 years old in 2008 were due to pneumococcal infection. Currently there are two second-generation pneumococcal conjugate vaccines (PCVs) targeted at children on the market for the prevention of pneumococcal disease: the 10-valent pneumococcal conjugate vaccine (PCV-10) and the 13-valent pneumococcal conjugate vaccine (PCV-13). About half of all countries have already included PCVs in their National Immunization Programme (NIP) and around one-fourth are planning their introduction. The objective of this systematic review is to evaluate the cost-effectiveness of PCV-10 and PCV-13 so that the results can inform policy decisions on including PCVs in the NIP. Methods: A systematic review was conducted by searching two databases (PubMed and Medline) for economic evaluation studies of PCV-10 and PCV-13. Information on the design and characteristics of the studies, the assumed burden of pneumococcal disease, and baseline vaccine efficacy assumptions was extracted, and results were presented as incremental cost-effectiveness ratios (ICERs). Results: Eleven studies were included, with 4 done in Europe, 3 in South America, 2 in Africa, 1 in Asia and 1 across North America and Europe. The results varied greatly among studies: 5 reported PCV-10 to be more cost-effective and/or cost-saving, 4 reported PCV-13 to be more cost-effective and/or cost-saving, and 2 concluded in a different way: PCV-10 was more cost-effective and cost-saving, but PCV-13 would lead to more life-years gained (LYG) and/or disability-adjusted life years (DALYs) averted. Conclusion: Due to the uncertainties in the clinical and epidemiological parameters, the unavailability of data on local disease burden, and analytical choices about endpoints which could significantly affect the input data, the reviewed studies contradicted one another. There was therefore not enough evidence to show whether PCV-10 or PCV-13 is more cost-effective for inclusion in a childhood NIP. Further research should address the variables to which the cost-effectiveness ratio is sensitive, and the local serotype distribution and disease burden should be taken into account when planning the inclusion of PCVs in the NIP.
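
The review reports its results as incremental cost-effectiveness ratios (ICERs): the cost difference between two strategies divided by their effect difference. The sketch below shows the arithmetic with invented figures, not data from any reviewed study.

```python
# ICER = (cost_new - cost_old) / (effect_new - effect_old).
# All figures are invented for illustration only.
def icer(cost_new, cost_old, effect_new, effect_old):
    return (cost_new - cost_old) / (effect_new - effect_old)

# e.g. a hypothetical PCV programme versus no vaccination,
# with effects measured in DALYs averted:
ratio = icer(cost_new=5_200_000, cost_old=3_000_000,
             effect_new=12_000, effect_old=9_500)
print(f"ICER = {ratio:.0f} cost units per DALY averted")  # 880
```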
15

Kurmaku, Ted, and Musa Kumrija. "A systematic literature review and meta-analysis comparing automated test generation and manual testing." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-48815.

Abstract:
Software testing is among the most critical parts of the software development process. The creation of tests plays a substantial role in the evaluation of software quality, yet it is one of the most expensive tasks in software development, typically involving intensive manual effort. To reduce manual effort, automated test generation has been proposed as a method of creating tests more efficiently. In recent decades, several approaches and tools have been proposed in the scientific literature to automate test generation. Yet how these automated approaches and tools compare to or complement manually written tests is still an open research question that has been tackled by software researchers in different experiments. In the light of the potential benefits of automated test generation in practice, its long history, and the apparent lack of summative evidence supporting its use, the present study aimed to systematically review the current body of peer-reviewed publications on the comparison between automated test generation and manual test design. We conducted a systematic literature review and meta-analysis, collecting data from studies that compare manually written tests with automatically generated ones in terms of test efficiency and effectiveness metrics as they are reported. We used a set of primary studies to collect the necessary evidence for analyzing the gathered experimental data. The overall results of the literature review suggest that automated test generation outperforms manual testing in terms of testing time, test coverage, and the number of tests generated and executed. Nevertheless, manually written tests achieve a higher mutation score and prove to be highly effective in terms of fault detection. Moreover, manual tests are more readable than automatically generated ones and can cover special test scenarios that automated generation misses. Our results suggest that just a few studies report the specific statistics (e.g., effect sizes) needed for a proper meta-analysis. The results of this subset of studies point in a rather different direction than our literature review, with manual tests being better in terms of mutation score, branch coverage, and the number of tests executed. The results of this meta-analysis are inconclusive due to the lack of sufficient statistical data and power for this comparison. More primary studies are needed to bring more evidence on the advantages and disadvantages of using automated test generation over manual testing.
16

Spadola, Sara. "Systematic investigation of automated plan generation for breast cancer including beam angle and isocenter selection." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amslaurea.unibo.it/5601/.

Abstract:
Breast cancer is the most common cancer among women worldwide. Radiotherapy is commonly used after surgery to destroy any malignant cells remaining in the breast volume. Radiotherapy treatments must irradiate the target volume while limiting toxicity in the healthy tissues. In clinical practice, the parameters that define a radiotherapy treatment plan are selected manually using treatment simulation software. This trial-and-error process, in which the parameters are modified and the treatment is re-simulated and re-evaluated, can require many iterations and is therefore time-consuming. The study presented in this thesis focuses on the automatic generation of treatment plans that irradiate the whole breast volume using two approximately opposed beams tangential to the patient. In particular, we focused on the selection of the beam directions and the isocenter position. To this end, the effectiveness of a combinatorial approach was investigated, in which a large number of possible treatment plans were generated using different combinations of the two beam directions. The beam intensity profiles are optimized automatically by an algorithm, called iCycle, developed at the Erasmus MC hospital in Rotterdam. Initially, of all the generated treatment plans, only a subgroup with good characteristics regarding irradiation of the diseased breast volume is selected. After that, the plans showing optimal characteristics for sparing the organs at risk (heart, lungs and contralateral breast) are considered. These treatment plans are mathematically equivalent, so the best plan among them was selected using a weighted sum whose weights were tuned so that, on average, the selected plans have characteristics similar to clinically approved treatment plans. Compared with the manual process, this method not only reduces plan generation time considerably but also guarantees that the selected plans have optimal organ-sparing characteristics. Initially, the isocenter chosen clinically by the technician was used. In the final part of the study the importance of the isocenter was evaluated; it turned out that, at least for a subgroup of patients, the isocenter position can make an important contribution to treatment plan quality and could therefore be a further parameter to optimize.
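
The selection step described above reduces to ranking mathematically equivalent candidate plans by a weighted sum of dose metrics. A minimal sketch of that idea follows; the metric names, weights and numbers are invented placeholders, and neither iCycle nor the actual clinical criteria are reproduced here.

```python
# Weighted-sum selection among candidate plans (toy metrics and weights).
candidates = [
    # (plan id, mean heart dose, mean lung dose, contralateral-breast dose)
    ("beams_040_220", 1.2, 4.8, 0.9),
    ("beams_050_230", 1.0, 5.5, 1.1),
    ("beams_045_225", 1.4, 4.2, 0.8),
]
weights = (0.5, 0.3, 0.2)  # tuned so selections resemble clinical plans

def score(plan):
    _, heart, lung, contra = plan
    return weights[0] * heart + weights[1] * lung + weights[2] * contra

best = min(candidates, key=score)   # lower weighted dose is better
print("selected plan:", best[0], "score:", round(score(best), 3))
```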
17

Veras, Richard Michael. "A Systematic Approach for Obtaining Performance on Matrix-Like Operations." Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/1011.

Abstract:
Scientific Computation plays a critical role in the scientific process because it allows us to ask complex queries and test predictions that would otherwise be infeasible to perform experimentally. Because of its power, Scientific Computing has helped drive advances in many fields, ranging from Engineering and Physics to Biology and Sociology to Economics and Drug Development, and even to Machine Learning and Artificial Intelligence. Common among these domains is the desire for timely computational results, so a considerable amount of human expert effort is spent on obtaining performance for these scientific codes. However, this is no easy task, because each of these domains presents its own unique set of challenges to software developers, such as domain-specific operations, structurally complex data and ever-growing datasets. Compounding these problems are the myriad constantly changing, complex and unique hardware platforms that an expert must target. Unfortunately, an expert is typically forced to reproduce their effort across multiple problem domains and hardware platforms. In this thesis, we demonstrate the automatic generation of expert-level high-performance scientific codes for Dense Linear Algebra (DLA), Structured Mesh (Stencil), Sparse Linear Algebra and Graph Analytics. In particular, this thesis seeks to address the issue of obtaining performance on many complex platforms for a certain class of matrix-like operations that span many scientific, engineering and social fields. We do this by automating a method used for obtaining high performance in DLA and extending it to structured, sparse and scale-free domains. We argue that it is the underlying structure found in the data from these domains that enables this process. Thus, obtaining performance for most operations does not occur in isolation of the data being operated on, but instead depends significantly on the structure of the data.
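
As a small, generic illustration of that last point (performance, and even the shape of the algorithm, follows the structure of the data), compare a dense matrix-vector product with one over a compressed sparse row (CSR) layout, which touches only the stored nonzeros. This is a textbook example, not code produced by the thesis's generators.

```python
# Structure-aware computation: a CSR sparse matrix-vector product visits
# only stored nonzeros, while a dense loop pays for every zero.
dense = [
    [4.0, 0.0, 0.0, 1.0],
    [0.0, 3.0, 0.0, 0.0],
    [2.0, 0.0, 5.0, 0.0],
]
x = [1.0, 2.0, 3.0, 4.0]

# CSR form of the same matrix: values, their column indices, row offsets.
vals = [4.0, 1.0, 3.0, 2.0, 5.0]
cols = [0, 3, 1, 0, 2]
rowptr = [0, 2, 3, 5]

def spmv_csr(vals, cols, rowptr, x):
    y = []
    for r in range(len(rowptr) - 1):
        acc = 0.0
        for k in range(rowptr[r], rowptr[r + 1]):  # only stored entries
            acc += vals[k] * x[cols[k]]
        y.append(acc)
    return y

assert spmv_csr(vals, cols, rowptr, x) == [sum(a * b for a, b in zip(row, x))
                                           for row in dense]
print(spmv_csr(vals, cols, rowptr, x))  # [8.0, 6.0, 17.0]
```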
18

Sonenthal, Nechama. "Systematic Review and Meta‐Analysis of Clinical Outcomes of Fractures Fixed with the Surgical Implant Generation Network (SIGN) Intramedullary Nail." Thesis, The University of Arizona, 2017. http://hdl.handle.net/10150/623535.

Abstract:
The Surgical Implant Generation Network (SIGN) intramedullary (IM) nail is designed to fix long bone fractures without using a costly C-arm imaging device. It is distributed for free to countries in need, allowing for elevation of care from the standard, lengthy traction treatment in those countries to clinically superior IM nailing. This paper compares the clinical outcomes of the SIGN IM nail to those of the IM nails used in developed countries with use of a C-arm. The terms “Surgical Implant Generation Network” and “union” were searched in four databases. Primary studies of SIGN IM nails were included, and their outcomes, including union rate, time to union, and complications, were recorded and compared to historical data for IM nails used in developed countries. Overall, there is a similar union rate in bones fixed with SIGN IM nails (94.6%) versus bones fixed with IM nails in developed countries (92.3%) (p = 0.009, OR = 1.67), while some bone types (tibia and femur) demonstrated a lower union rate when individually stratified (p = 0.008, OR = 0.26 and p = 0.002, OR = 0.15, respectively). Mean time to union for all bone types combined showed no significant difference between SIGN IM nails and IM nails used in developed countries (p = 0.26). Complication rates were similar between SIGN IM nails and IM nails used in developed countries. It is possible for the SIGN IM nail to be used to fix long bone fractures in developing countries with outcomes comparable to the IM nail used in developed countries.
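
The comparison above is summarized with odds ratios. As a reminder of how such a figure is derived from a two-by-two table of union and nonunion counts, here is a sketch with invented counts; the review's actual tables are not given in this listing, so the printed value will not match its reported OR of 1.67.

```python
# Odds ratio from a 2x2 table: OR = (a/b) / (c/d), where a/b are the
# union/nonunion counts in one group and c/d in the other.
# Counts below are invented; the review's own tables are not reproduced.
import math

def odds_ratio(a, b, c, d):
    return (a / b) / (c / d)

a, b = 1890, 108   # hypothetical SIGN nail group: unions, nonunions
c, d = 1200, 100   # hypothetical comparison group
or_ = odds_ratio(a, b, c, d)
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
lo, hi = (math.exp(math.log(or_) + s * 1.96 * se_log) for s in (-1, 1))
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```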
19

Silzle, Andreas [Verfasser]. "Generation of Quality Taxonomies for Auditory Virtual Environments by Means of Systematic Expert Survey / Andreas Silzle." Aachen: Shaker, 2008. http://d-nb.info/116131265X/34.

20

Gustafsson, Daniel. "The Systematic Development Process Applied on a Cab Rotation Unit: Pre-study, concept generation, embodiment design, material selection and optimization." Thesis, Karlstads universitet, Fakulteten för hälsa, natur- och teknikvetenskap (from 2013), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-68651.

Abstract:
This master thesis studies and applies the systematic development process. The process is first described in general, creating a template for the process, and then applied to a real case to demonstrate its performance. Finally, advantages, drawbacks and suggestions for future improvements are given. The systematic development approach was carried out at Laxå Special Vehicles, which produces truck cabs and special truck chassis for Scania CV AB. The project focused on the cabs, i.e. the Crew Cabs and the Low Entry. Crew Cabs are extended normal truck cabs with four doors to accommodate additional passengers, suitable for fire trucks and similar applications. The Low Entry is a lowered normal truck cab with a reduced entry height, making this cab type suitable for city use where the driver or passengers enter and leave the cab frequently. The task was to develop the current cab rotation unit, which from the beginning could handle only the Crew Cabs (CC28 and CC31), so that it can handle both cab types. The major goal of this project was to enable rotation of the Low Entry too. Five phases – pre-study, concept generation, embodiment design, material selection and optimization – were carried out. The pre-study generated a fundamental base of knowledge about both the systematic development process and the tilt unit. The concept generation contained a problem breakdown, generation of possible solutions and finally an evaluation of these. During the embodiment design the best-suited concept was described and developed in detail to allow a suitable material to be selected during the material selection phase. The optimization process consisted of investigating properties with respect to mechanical strength and stiffness. Two construction solutions were developed to accommodate the height and length differences of the mounting points between the Crew Cab and the Low Entry: a covering plate, called K4, and a mounting plate, called K100, handling the length and height problems respectively. The development process is thus considered to work well. It generated a useful result, although possibilities for further improvement exist.
21

Buschmann, Tilo. "The Systematic Design and Application of Robust DNA Barcodes." Doctoral thesis, Universitätsbibliothek Leipzig, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-209812.

Abstract:
High-throughput sequencing technologies are improving in quality, capacity, and costs, providing versatile applications in DNA and RNA research. For small genomes or fractions of larger genomes, DNA samples can be mixed and loaded together on the same sequencing track. This so-called multiplexing approach relies on a specific DNA tag, index, or barcode that is attached to the sequencing or amplification primer and hence accompanies every read. After sequencing, each sample read is identified on the basis of the respective barcode sequence. Alterations of DNA barcodes during synthesis, primer ligation, DNA amplification, or sequencing may lead to incorrect sample identification unless the error is revealed and corrected. This can be accomplished by implementing error correcting algorithms and codes. This barcoding strategy increases the total number of correctly identified samples, thus improving overall sequencing efficiency. Two popular sets of error-correcting codes are Hamming codes and codes based on the Levenshtein distance. Levenshtein-based codes operate only on words of known length. Since a DNA sequence with an embedded barcode is essentially one continuous long word, application of the classical Levenshtein algorithm is problematic. In this thesis we demonstrate the decreased error correction capability of Levenshtein-based codes in a DNA context and suggest an adaptation of Levenshtein-based codes that is proven to efficiently correct nucleotide errors in DNA sequences. In our adaptation, we take any DNA context into account and impose stricter rules for the selection of barcode sets. In simulations we show the superior error correction capability of the new method compared to traditional Levenshtein and Hamming based codes in the presence of multiple errors. We present an adaptation of Levenshtein-based codes to DNA contexts capable of guaranteed correction of a pre-defined number of insertion, deletion, and substitution mutations. Our improved method is additionally capable of correcting on average more random mutations than traditional Levenshtein-based or Hamming codes. As part of this work we prepared software for the flexible generation of DNA codes based on our new approach. To adapt codes to specific experimental conditions, the user can customize sequence filtering, the number of correctable mutations and barcode length for highest performance. However, not every platform is susceptible to a large number of both indel and substitution errors. The Illumina “Sequencing by Synthesis” platform shows a very large number of substitution errors as well as a very specific shift of the read that results in inserted and deleted bases at the 5’-end and the 3’-end (which we call phaseshifts). We argue that in this scenario the application of Sequence-Levenshtein-based codes is not efficient, because it aims for a category of errors that barely occurs on this platform, which reduces the code size needlessly. As a solution, we propose the “Phaseshift distance” that exclusively supports the correction of substitutions and phaseshifts. Additionally, we enable the correction of arbitrary combinations of substitution and phaseshift errors. Thus, we address the lopsided number of substitutions compared to phaseshifts on the Illumina platform. To compare codes based on the Phaseshift distance to Hamming codes as well as codes based on the Sequence-Levenshtein distance, we simulated an experimental scenario based on the error pattern we identified on the Illumina platform. Furthermore, we generated a large number of different sets of DNA barcodes using the Phaseshift distance and compared codes of different lengths and error correction capabilities. We found that codes based on the Phaseshift distance can correct a number of errors comparable to codes based on the Sequence-Levenshtein distance while offering a number of DNA barcodes comparable to Hamming codes. Thus, codes based on the Phaseshift distance show a higher efficiency in the targeted scenario. In some cases (e.g., with PacBio SMRT in Continuous Long Read mode), the position of the barcode and DNA context is not well defined. Many reads start inside the genomic insert so that adjacent primers might be missed. The matter is further complicated by coincidental similarities between barcode sequences and reference DNA. Therefore, a robust strategy is required in order to detect barcoded reads and avoid a large number of false positives or negatives. For mass inference problems such as this one, false discovery rate (FDR) methods are powerful and balanced solutions. Since existing FDR methods cannot be applied to this particular problem, we present an adapted FDR method that is suitable for the detection of barcoded reads and suggest possible improvements.
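
Since the abstract's argument turns on distance-based decoding, a compact sketch of the baseline may help: the classical Levenshtein distance plus nearest-barcode assignment. This is the textbook algorithm only; the Sequence-Levenshtein and Phaseshift adaptations developed in the thesis impose additional constraints not shown here.

```python
# Classical Levenshtein distance and nearest-barcode decoding.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

BARCODES = ["ACGTAC", "TTGCAA", "GGATCC"]  # toy barcode set

def decode(read_prefix):
    """Assign a (possibly mutated) read prefix to the closest barcode."""
    return min(BARCODES, key=lambda bc: levenshtein(read_prefix, bc))

print(decode("ACGAAC"))  # one substitution away from ACGTAC
```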
22

Qiao, Junhua [Verfasser], and J. [Akademischer Betreuer] Bode. "Systematic Study on Generation of Mammalian Production Cell Lines by Targeted Integration (RMCE) / Junhua Qiao ; Betreuer: J. Bode." Braunschweig: Technische Universität Braunschweig, 2009. http://d-nb.info/1175828807/34.

23

De, Silva Jayasekera Varthula Janya. "Systematic Generation of Lack-of-Fusion Defects for Effects of Defects Studies in Laser Powder Bed Fusion AlSi10Mg." Youngstown State University / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1598531488781737.

24

Arthuis, Pierre. "Bogoliubov Many-Body Perturbation Theory for Nuclei : Systematic Generation and Evaluation of Diagrams and First ab initio Calculations." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS304/document.

Abstract:
The last few decades in nuclear structure theory have seen a rapid expansion of ab initio theories, aiming at describing the properties of nuclei starting from the inter-nucleonic interaction. Such an expansion relied on both the tremendous growth of computing power and novel formal developments. This work focuses on the development of the recently proposed Bogoliubov Many-Body Perturbation Theory, which relies on a particle-number-breaking reference state to tackle singly open-shell nuclei. The formalism is first described in detail, and diagrammatic and algebraic contributions are derived up to second order. Its link to standard Many-Body Perturbation Theory is made explicit, as well as its connection to Bogoliubov Coupled-Cluster theory. An automated extension to higher orders based on graph theory methods is then detailed, and the ADG numerical program generating and evaluating BMBPT diagrams at arbitrary order is introduced. Such a formal development carries implications that are not restricted to the present work, as the developed methods can be applied to other many-body methods. Finally, first numerical results obtained for oxygen, calcium and nickel isotopes are presented. They establish BMBPT as a method of interest for large-scale computations of isotopic or isotonic chains in the mid-mass sector of the nuclear chart.
APA, Harvard, Vancouver, ISO, and other styles
25

Bourque, François. "The risk for schizophrenia and related disorders among first- and second-generation migrants: a systematic review and meta-analysis." Thesis, McGill University, 2010. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=86802.

Full text
Abstract:
Background: Migration is known to be a risk factor for schizophrenia and related disorders, but the magnitude of the risk in second-generation migrants is unclear. This study aims to determine the risk of psychosis in first- and second-generation migrants and to explore sources of variation.
Methods: A systematic review of population-based incidence studies of psychosis among first- and second-generation migrants was conducted. Descriptive and meta-analytic syntheses of the identified studies were performed and sources of heterogeneity were examined.
Results: Nearly all migrant groups were at increased risk for psychotic disorders. The magnitude of the risk was similar in first- and second-generation migrants, but varied considerably according to ethno-racial status, social contexts and methodological variables.
Discussion: The risk clearly persists into the second generation, indicating that post-migration factors are more important than pre-migration factors or migration per se. The observed variability suggests that socio-environmental determinants contribute to the onset of psychotic disorders.
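To illustrate the meta-analytic synthesis step described above, here is a minimal fixed-effect, inverse-variance pooling sketch in Python; the relative risks and variances are hypothetical numbers for illustration, not data from the thesis:

import math

def pooled_log_rr(log_rrs, variances):
    # Fixed-effect inverse-variance pooling: each study's log relative
    # risk is weighted by the inverse of its variance, and the pooled
    # variance is the inverse of the summed weights.
    weights = [1.0 / v for v in variances]
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical relative risks of psychosis for three migrant cohorts
log_rrs = [math.log(2.1), math.log(3.0), math.log(1.8)]
variances = [0.04, 0.09, 0.02]

pooled, se = pooled_log_rr(log_rrs, variances)
low, high = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {low:.2f}-{high:.2f})")

A random-effects model, which the heterogeneity observed in the review would call for, additionally estimates a between-study variance and adds it to each study's variance before weighting.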
APA, Harvard, Vancouver, ISO, and other styles
26

Messer, Matthias. "A systematic approach for integrated product, materials, and design-process design." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/22608.

Full text
Abstract:
Thesis (Ph. D.)--Mechanical Engineering, Georgia Institute of Technology, 2008. Committee Chair: Allen, Janet K.; Committee Member: Aidun, Cyrus K.; Committee Member: Klein, Benjamin; Committee Member: McDowell, David L.; Committee Member: Mistree, Farrokh; Committee Member: Yoder, Douglas P.
APA, Harvard, Vancouver, ISO, and other styles
27

Mamun, Md Abdullah Al, and Aklima Khanam. "Concurrent Software Testing : A Systematic Review and an Evaluation of Static Analysis Tools." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4310.

Full text
Abstract:
Verification and validation is one of the most important concerns in software engineering for more reliable software development, hence it is important to overcome the challenges of testing concurrent programs. The extensive use of concurrent systems warrants more attention to concurrent software testing, and the development of automatic tools for this purpose is receiving increased focus. The first part of this study presents a systematic review that aims to explore the state of the art of concurrent software testing. The systematic review reports on several issues, such as concurrent software characteristics, bugs, testing techniques and tools, test case generation techniques and tools, and benchmarks developed for the tools. The second part presents the evaluation of four commercial and open-source static analysis tools that detect Java multithreaded bugs. An empirical evaluation of the tools would help industry as well as academia to learn more about the effectiveness of static analysis tools for concurrency bugs.
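As a concrete illustration of the class of bug such tools target, here is a minimal Python sketch of a data race on shared state; it is an editorial example, not code from the thesis. A static analysis tool for concurrency defects would flag the unsynchronised read-modify-write:

import threading

counter = 0  # shared mutable state

def unsafe_increment(n):
    global counter
    for _ in range(n):
        counter += 1  # read-modify-write without a lock: a data race

threads = [threading.Thread(target=unsafe_increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Lost updates are possible because "counter += 1" is not atomic;
# guarding the increment with a threading.Lock() removes the race.
print("total:", counter, "(expected 400000)")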
APA, Harvard, Vancouver, ISO, and other styles
28

Oliveira, André Luiz de. "A model-based approach to support the systematic reuse and generation of safety artefacts in safety-critical software product line engineering." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-22112016-161607/.

Full text
Abstract:
Software Product Line Engineering (SPLE) has been proven to reduce development and maintenance costs, improve time-to-market, and increase the quality of product variants developed from a product family via systematic reuse of its core assets. SPLE has been successfully used in the development of safety-critical systems, especially in the automotive and aerospace domains. Safety-critical systems have to be developed according to safety standards, which demand safety analysis, Fault Tree Analysis (FTA), and assurance case safety engineering artefacts. However, performing safety analysis, FTA, and assurance case construction from scratch and manually for each product variant is time-consuming and error-prone, whereas variability in safety engineering artefacts can be managed automatically with the support of variant management techniques. As safety is context-dependent, context and design variation directly impact the safety properties, changing hazards, their causes, the risks posed by these hazards to system safety, risk mitigation measures, and FTA results. Therefore, managing variability in safety artefacts from different levels of abstraction increases the complexity of the variability model, even with the support of variant management techniques. To achieve an effective balance between benefits and complexity in adopting an SPLE approach for safety-critical systems, it is necessary to distinguish between reusable safety artefacts, whose variability should be managed, and those that should be generated from the reused safety artefacts. On the other hand, both industry and safety standards have recognized the use of model-based techniques to support safety analysis and assurance cases. Compositional safety analysis, design optimization, and model-based assurance cases are examples of techniques that have been used to support the generation of safety artefacts required to achieve safety certification. This thesis proposes a model-based approach that integrates model-based development, compositional safety analysis, and variant management techniques to support the systematic reuse and generation of safety artefacts in safety-critical software product line engineering. The approach contributes to reducing the effort and costs of performing safety analysis and assessment for a particular product variant, since such analysis is performed from the reused safety artefacts. Thus, variant-specific fault trees, Failure Modes and Effects Analysis (FMEA), and assurance case artefacts required to achieve safety certification can be generated automatically with the support of model-based safety analysis and assurance case construction techniques.
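To make the generated fault-tree artefacts concrete, here is a minimal sketch of how a fault tree can be evaluated quantitatively once it has been derived for a variant; the gate structure and probabilities are hypothetical, and the calculation assumes independent basic events:

from math import prod

def and_gate(probs):
    # All input events must occur (independence assumed)
    return prod(probs)

def or_gate(probs):
    # At least one input event occurs: complement of none occurring
    return 1 - prod(1 - p for p in probs)

# Hypothetical top event: loss of a function guarded by two redundant
# channels, each failing if its sensor OR its controller fails
channel_1 = or_gate([1e-4, 5e-5])  # sensor, controller failure probs
channel_2 = or_gate([1e-4, 5e-5])
top_event = and_gate([channel_1, channel_2])
print(f"P(top event) = {top_event:.2e}")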
APA, Harvard, Vancouver, ISO, and other styles
29

Singh, Inderjeet. "A Mapping Study of Automation Support Tools for Unit Testing." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-15192.

Full text
Abstract:
Unit testing is defined as a test activity usually performed by a developer for the purpose of demonstrating program functionality and meeting the requirements specification of a module. Nowadays, unit testing is considered an integral part of the software development cycle. However, unit testing performed by developers is still a major concern because of the time and cost involved. Automation support for unit testing, in the form of various automation tools, could significantly lower the cost of the unit testing phase as well as decrease the time developers spend on actual testing. The problem is how to choose the most appropriate tool to suit developer requirements such as cost, effort needed, level of automation provided, and language support. This research work presents results from a systematic literature review with the aim of finding all unit testing tools with automation support. In the systematic literature review, we initially identified 1957 studies. After several removal stages, 112 primary studies were listed and 24 tools identified in total. Along with the list of tools, we also provide a categorization of all the tools found, based on programming language support, availability (licensed, open source, free), testing technique, level of effort required from the developer, and target domain, which we consider good properties for a developer deciding which tool to use. Additionally, we categorized the type of error(s) found by some tools, which could be beneficial for a developer when assessing a tool's effectiveness. The main intent of this report is to aid developers in choosing an appropriate unit testing tool; the categorization table of available tools with automated unit testing support eases this process significantly. This work could also benefit researchers who wish to evaluate the efficiency and effectiveness of each tool and use this information to eventually build a new tool with the same properties as several others.
APA, Harvard, Vancouver, ISO, and other styles
30

Silzle, Andreas. "Generation of quality taxonomies for auditory virtual environments by means of systematic expert survey = Erstellung von Qualitäts-Taxonomien für auditive virtuelle Umgebungen mit Hilfe systematischer Expertenbefragung /." Aachen : Shaker, 2008. http://d-nb.info/987833790/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Delhomme, Tiffany. "Using the systematic nature of errors in NGS data to efficiently detect mutations : computational methods and application to early cancer detection." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSE1098/document.

Full text
Abstract:
Comprehensive characterization of DNA variations can help progress in multiple fields of cancer genomics. Next Generation Sequencing (NGS) is currently the most efficient technique to determine a DNA sequence, owing to its low experimental cost and time compared to traditional Sanger sequencing. Nevertheless, the detection of mutations from NGS data is still a difficult problem, in particular for somatic mutations present in very low abundance, such as when trying to identify tumor subclonal mutations, tumor-derived mutations in cell-free DNA, or somatic mutations in histologically normal tissue. The main difficulty is to precisely distinguish true mutations from sequencing artifacts, as they reach similar levels. In this thesis we have studied the systematic nature of errors in NGS data in order to propose efficient methodologies for accurately identifying mutations potentially present in low proportion. In a first chapter, we describe needlestack, a new variant caller based on modelling systematic errors across multiple samples to extract candidate mutations. In a second chapter, we propose two post-calling variant filtering methods, based on new summary statistics and on machine learning, with the aim of boosting the precision of mutation detection through the identification of non-systematic errors. Finally, in a last chapter we apply these approaches to develop cancer early detection biomarkers using circulating tumor DNA.
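To give a flavour of the cross-sample idea behind needlestack, here is a toy Python sketch; the real tool fits a robust regression to the alternative-allele counts of all samples at each position, whereas this simplification merely flags allele fractions that are extreme outliers with respect to the cross-sample background. The counts are hypothetical:

import statistics

def candidate_mutations(alt_counts, depths, z_threshold=6.0):
    afs = [a / d for a, d in zip(alt_counts, depths)]
    med = statistics.median(afs)
    # median absolute deviation as a robust estimate of the error spread
    mad = statistics.median(abs(x - med) for x in afs) or 1e-6
    return [i for i, af in enumerate(afs)
            if (af - med) / (1.4826 * mad) > z_threshold]

# Hypothetical pileup at one position: sample 3 carries a low-abundance
# mutation sitting well above the background error level
alt = [2, 1, 3, 40, 2, 0, 1]
dep = [1000, 900, 1100, 1000, 950, 800, 1000]
print("candidate sample indices:", candidate_mutations(alt, dep))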
APA, Harvard, Vancouver, ISO, and other styles
32

Lima, Lisiane Pedroso. "Proposta de um modelo conceitual de referência para o uso integrado de evidências no processo de projeto de edificações." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/172294.

Full text
Abstract:
There is a need to change the design process due to the growing complexity of construction projects. A large number of stakeholders are involved, with a diversity of aims and goals, in addition to the broader scope of building projects. Moreover, the traditional design process is usually developed in a fragmented way, based mostly on the designers' experience, often disregarding knowledge from other stakeholders involved. Evidence-Based Design (EBD) is an emerging approach that aims to address this problem by supporting project decision-making with the best available evidence from research, in addition to professional experience and data on client requirements. This research work proposes a conceptual model to guide the use of EBD in the building design process. The study was divided into three stages. The first stage had an exploratory character, in which the focus was on understanding EBD. In the second stage, the aim was to study the application of EBD in the design process, focusing on value generation. The third stage consisted of devising forms of disseminating evidence-based results. Three systematic literature reviews and three empirical studies were developed along those three stages: two studies concerned affordable housing projects in Brazil, and one involved a care home project in the U.K. This work provides both practical and theoretical contributions. From a practical viewpoint, the model proposed herein integrates evidence into the building design process and proposes techniques to collect, process, and analyse such evidence. From a theoretical viewpoint, it introduces a new terminology and classification for evidence that can be incorporated in building design by using EBD, for increasing value generation. A new design process is proposed that improves the integration between professional practice and knowledge produced by academics, through a process of knowledge generation as a form of continuous learning.
APA, Harvard, Vancouver, ISO, and other styles
33

Haydock, Lawrence. "Systematic development of equivalent circuits for synchronous machines." Thesis, Imperial College London, 1985. http://hdl.handle.net/10044/1/8613.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Thalji, Abdullah Abdel-Majeed. "Systematic polysemy in Arabic : a generative lexicon-based account." Thesis, University of Essex, 2018. http://repository.essex.ac.uk/22121/.

Full text
Abstract:
This thesis is the first of its kind to study the linguistic phenomenon of systematic polysemy and examine its pervasiveness in Arabic (both Modern Standard Arabic (MSA) and Jordanian Arabic (JA)). Systematic polysemy in this study is defined as the case where a lexeme has more than one distinct sense and the relationship between the senses is predictable by rules in the language. In the narrow sense, however, this phenomenon refers only to the productive type of regular polysemy, which is defined vis-à-vis Apresjan's (1974) notion of totality of scope (e.g. the content/container type). The central aims of this research are to (i) identify the major (as well as the minor) patterns of regular polysemy in Arabic in the major lexical categories of nouns, verbs, and adjectives; (ii) determine the extent to which these patterns converge with or diverge from the already explored patterns, mainly in English; and (iii) test the applicability of Pustejovsky's (1995) Generative Lexicon (the GL) in accounting for the various Arabic data on polysemy. The study found that nearly every regular polysemous pattern observed in English was also present in Arabic, albeit with a few attested differences. For example, the regular pattern of the mass-to-count alternation (e.g. coffee vs. a coffee) is very rarely encountered in Arabic. In addition, the animal/meat alternation in English behaves rather differently in Arabic in the way the language elicits a non-countable (mass) meaning from a countable counterpart. With respect to lexicography, this study adds to the already studied patterns in Atkins and Rundell (2008). The dissertation also raises additional questions for the GL framework with respect to property nominalizations, nominalized adjectives, and generic collective nouns.
APA, Harvard, Vancouver, ISO, and other styles
35

Ring, Nicola A. "A critical analysis of evidence-based practice in healthcare : the case of asthma action plans." Thesis, University of Stirling, 2013. http://hdl.handle.net/1893/13061.

Full text
Abstract:
Evidence-based practice is an integral part of multi-disciplinary healthcare, but its routine clinical implementation remains a challenge internationally. Written asthma action plans are an example of sub-optimal evidence-based practice because, despite being recommended, these plans are under-issued by health professionals and under-used by patients/carers. This thesis is a critical analysis of the generation and implementation of evidence in this area and provides fresh insight into this specific theory/practice gap. This submission brings together, in five published papers, a body of work conducted by the candidate. Findings report that known barriers to action plan use (such as a lack of practitioner time) are symptomatic of deeper and more complex underlying factors. In particular, over-reliance on knowledge derived from randomised controlled trials and their systematic review, as the primary and sole source of evidence for healthcare practice, hindered the implementation of these plans. A lack of evidence reflecting the personal experience of using these plans in the real world, rather than in trial settings, contributed to a mismatch between what patients/carers want from asthma action plans and what they are currently being provided with by professionals. This submission illustrates the benefits of utilising a broader range of knowledge as a basis for clinical practice. The presented papers report how new and innovative research methodologies (including meta-ethnography and cross-study synthesis) can be used to synthesise individual studies reporting the personal experiences of patients and professionals and how such findings can then be used to better understand why interventions can be implemented in trial settings rather than everyday practice. Whilst these emerging approaches have great potential to contribute to evidence-based practice by, for example, strengthening the ‘weight’ of experiential knowledge, there are methodological challenges which, whilst acknowledged, have yet to be fully addressed.
APA, Harvard, Vancouver, ISO, and other styles
36

Marthaler, Florian [author], and A. [academic supervisor] Albers. "Zukunftsorientierte Produktentwicklung – Eine Systematik zur Ableitung von generationsübergreifenden Zielsystemen zukünftiger Produktgenerationen durch strategische Vorausschau = Future-Oriented Product Development – a Systematic Approach to Deriving Cross-Generational Systems of Objectives of Future Product Generations Through Strategic Foresight / Florian Marthaler ; supervisor: A. Albers." Karlsruhe : KIT-Bibliothek, 2021. http://d-nb.info/1238147992/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

BONFANTI, Silvia (ORCID:0000-0001-9679-4551). "Rigorous Model-based Development of Programmable Electronic Medical Systems (PEMS): from Requirements to Code." Doctoral thesis, Università degli studi di Bergamo, 2017. http://hdl.handle.net/10446/77230.

Full text
Abstract:
Programmable Electronic Medical Systems (PEMS) are safety-critical systems. They have effects on people's health and, in case of malfunction, they can seriously compromise human safety. For this reason, the software installed on these devices must be guaranteed through rigorous processes that assure safety and reliability. Moreover, the correct operation of a medical device depends upon the controlling software, whose development should adhere to certification standards. The rigorous process presented in this thesis is based on the Abstract State Machines (ASMs) formal method, a mathematically based technique for the specification, analysis and development of software systems. The ASM formal approach proposes an incremental life-cycle model for software development based on model refinement. It covers the main software engineering activities (specification, validation, verification, conformance checking), and it is supported by a wide range of tools which are part of the Asmeta (ASM mETAmodeling) framework. In this thesis, the ASM development approach and its supporting Asmeta framework are used to propose a rigorous development process for PEMS. The final goal is to provide a process able to guarantee the development of correct and controllable systems in a correct and controllable way. The definition of this process has led to some improvements of the method, mainly regarding the textual and graphical notations, and the automatic generation of code from models. A new rigorous notation, Unified Syntax for Abstract State Machine (UASM), has been defined to provide a stable language kernel for ASMs. Formal models are not widely used in practice, since they are considered difficult to develop and understand. For this reason, we propose a tool for a graphical representation of ASM models in order to increase their readability. Moreover, we have devised a methodology to generate the desired source code from ASM models. The tool automatically translates the formal specification into the target code (C++ for Arduino in the present case) while preserving the system behavior and the properties verified during validation and verification. A hemodialysis machine and a stereoacuity test are used as real case studies to show the applicability and effectiveness of the ASM-based development process in the area of PEMS.
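To illustrate the execution model that underlies ASM specifications, here is a minimal Python sketch: all rules read the same current state and their updates are fired simultaneously. It is an editorial simplification (real Asmeta models add guards, modules, and update-consistency checks, and conflicting updates would be reported rather than silently overwritten), and the controller shown is hypothetical:

def asm_step(state, rules):
    updates = {}
    for rule in rules:
        updates.update(rule(state))  # collect updates from all rules
    new_state = dict(state)
    new_state.update(updates)  # apply them simultaneously
    return new_state

# Hypothetical monitor: raise an alarm on overpressure, then bleed it off
def monitor(state):
    return {"alarm": True} if state["pressure"] > 120 else {}

def pump(state):
    return {"pressure": state["pressure"] - 10} if state["alarm"] else {}

state = {"pressure": 130, "alarm": False}
for _ in range(3):
    state = asm_step(state, [monitor, pump])
    print(state)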
APA, Harvard, Vancouver, ISO, and other styles
38

Wallace, John. "Generating and communicating the evidence : enhancing the uptake of systematic reviews." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:e2966148-851a-4f00-ab6a-2992fd21c3a3.

Full text
Abstract:
The theme of this project was synthesis and the thesis encompasses knowledge generation and knowledge translation. Systematic review methodology was employed. The initial two systematic reviews compared antidepressant medication and cognitive-behaviour therapy for the acute treatment of depression. A further comparison of a combination of the two interventions with each treatment on its own was also conducted, with the bulk of the evidence favouring the psychotherapy. Moving to the topic of knowledge translation, the main theme of the thesis, the barriers, facilitators, and interventions impacting on systematic review uptake were identified. The evidence from these three systematic reviews, using diverse methodologies, was then combined to identify the interventions that overcame specific obstacles and built on highlighted facilitators in order to improve the uptake of evidence from systematic reviews. Juxtaposing barriers and facilitators alongside effectiveness studies in this final, mixed-methods systematic review allowed a number of interventions to be recommended. The synthesis also allowed strategies to be highlighted that required further development. Interventions with a statistically significant effect such as educational visits, summaries of systematic reviews, and targeted messaging, addressed a wide range of the identified barriers and facilitators. These interventions were recommended. Promising uptake strategies requiring further development were also identified. Furthermore, large gaps in the evidence base regarding systematic review utilization were highlighted. Fewer of the facilitators identified as part of this project, such as the medico-legal protection provided by systematic reviews, appear to have been built on in order to increase review uptake. Finally, all the preceding evidence was drawn on in order to develop a proposal focused on improving the uptake of evidence from systematic reviews and meta-analyses. This doctoral project offers a menu or range of evidence-based factors that can be considered by organisations and researchers when planning strategies aimed at increasing the uptake of pre-appraised, synthesized evidence.
APA, Harvard, Vancouver, ISO, and other styles
39

Petersen, Henry. "Generating High Precision Classification Rules for Screening of Irrelevant Studies in Systematic Review Literature Searches." Thesis, The University of Sydney, 2016. http://hdl.handle.net/2123/15454.

Full text
Abstract:
Systematic reviews aim to produce repeatable, unbiased, and comprehensive answers to clinical questions. Systematic reviews are an essential component of modern evidence-based medicine; however, because of the risk of omitting relevant research, they are highly time-consuming to create and are largely conducted manually. This thesis presents a novel framework for the partial automation of systematic review literature searches. We exploit the ubiquitous multi-stage screening process by training the classifier using annotations made by reviewers in previous screening stages. Our approach has the benefit of integrating seamlessly with the existing screening process, minimising disruption to users. Ideally, classification models for systematic reviews should be easily interpretable by users. We propose a novel, rule-based algorithm for use with our framework. A new approach for identifying redundant associations when generating rules is also presented. The proposed approach to redundancy seeks both to exclude redundant specialisations of existing rules (those with additional terms in their antecedent) and redundant generalisations (those with fewer terms in their antecedent); a sketch of the specialisation case follows below. We demonstrate the ability of the proposed approach to improve the usability of the generated rules. The proposed rule-based algorithm is evaluated by simulated application to several existing systematic reviews. Workload savings of up to 10% are demonstrated. There is an increasing demand for systematic reviews in a variety of clinical disciplines, such as diagnosis. We examine reviews of diagnosis and contrast them with more traditional systematic reviews of treatment. We demonstrate that existing challenges such as target class heterogeneity and high data imbalance are even more pronounced for this class of reviews. The described algorithm accounts for this by seeking to label subsets of non-relevant studies with high precision, avoiding the need to generate a high-recall model of the minority class.
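In this toy version of the redundancy idea, a rule is a redundant specialisation if some rule with a strict subset of its antecedent terms already achieves at least the same precision, so the extra terms add nothing; the thesis also treats redundant generalisations, which this check omits. The rules and precisions are hypothetical:

def is_redundant_specialisation(rule, rules):
    antecedent, precision = rule
    for other_antecedent, other_precision in rules:
        if other_antecedent < antecedent and other_precision >= precision:
            return True  # a shorter rule is already at least as precise
    return False

# Hypothetical screening rules: antecedent term sets -> precision
rules = [
    (frozenset({"rat"}), 0.97),
    (frozenset({"rat", "in vitro"}), 0.96),  # redundant specialisation
    (frozenset({"case report"}), 0.92),
]
for rule in rules:
    print(set(rule[0]), "redundant:", is_redundant_specialisation(rule, rules))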
APA, Harvard, Vancouver, ISO, and other styles
40

Kazlauskas, Dainius. "Researches of H2S generation from municipal landfills and systematical evaluation of landfills pollution." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2005. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2005~D_20050614_082554-89323.

Full text
Abstract:
In Lithuania the amount of waste generated is increasing every year. According to the national strategy, all waste should be disposed of in new regional landfills. Landfills pollute the environment with leachate, landfill gas and odours. Landfill gas consists of odorous compounds, one of which is hydrogen sulphide (H2S). Hydrogen sulphide is highly toxic and affects the nervous system at a low threshold. As landfill gas and leachate generation had already been investigated worldwide before this work, no new research on them was necessary. Measurements of H2S generation were carried out at the Jerubaiciai landfill using an on-site measurement method with the GD/MG 7 instrument, at 51 measurement points and 2 monitoring wells, during different seasons of the year. The results show that the amount of H2S varies across different areas of the landfill and between seasons. Dispersion modelling with the AERMOD model, carried out both under calm weather conditions and under the wind speeds and directions dominant in each season, shows that H2S spreads farthest from the landfill section during summer (the H2S concentration exceeds the highest allowable concentration out to a distance of almost 2.5 km). In autumn and spring this distance is about 1.5 km, and in winter about 800 m.
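AERMOD itself relies on detailed boundary-layer parameterizations, but the idea behind such dispersion modelling can be conveyed by the classic Gaussian plume formula for the concentration downwind of a continuous point source; this is an editorial illustration, not a formula taken from the thesis:

C(x, y, z) = \frac{Q}{2\pi u\,\sigma_y \sigma_z}
\exp\left(-\frac{y^2}{2\sigma_y^2}\right)
\left[\exp\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
    + \exp\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right]

Here Q is the emission rate, u the wind speed, H the effective source height, and \sigma_y(x), \sigma_z(x) are dispersion coefficients that grow with downwind distance and atmospheric instability, which is why the distance at which a threshold concentration is reached varies so strongly between seasons.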
APA, Harvard, Vancouver, ISO, and other styles
41

Sproul, John S. "Stoneflies of Unusual Size: Population Genetics and Systematics Within Pteronarcyidae (Plecoptera)." BYU ScholarsArchive, 2012. https://scholarsarchive.byu.edu/etd/3351.

Full text
Abstract:
Chapter 1. The family Pteronarcyidae (Plecoptera) is a highly studied group of stoneflies and very important to a wide variety of aquatic studies. Several phylogenies have been proposed for this group in recent decades; however, there is little congruence between the various topologies. The present study revises the phylogeny of the group by combining molecular data from mitochondrial cytochrome oxidase subunit II, ribosomal subunit 12S, ribosomal subunit 16S, and the nuclear loci ribosomal subunit 18S and Histone H3, with published morphological data in a parsimony-based total evidence analysis. The analysis produced a well-supported phylogeny with novel relationships within the genus Pteronarcys. Maximum Likelihood and Bayesian analyses produced topologies congruent with the parsimony analysis. Character mapping revealed several homoplasious morphological characters that were previously thought to be homologous. Chapter 2. Phylogeographic studies in aquatic insects provide valuable insights into mechanisms that shape the genetic structure of aquatic communities. Yet studies that include broad geographic areas are uncommon for this group. We conducted a broad-scale phylogeographic analysis of P. badia across western North America. In order to generate a larger mitochondrial data set, we used 454 sequencing to reconstruct the complete mitochondrial genome in the early stages of the project. Our analysis reveals what appears to be a complex history of isolation and multiple invasions among some lineages. The study provides evidence of multiple glacial refugia and suggests that historical climatic isolations have been important mechanisms in determining the genetic structure of insects in western North America. Our ability to generate a large mitochondrial data set through mitochondrial genome reconstruction greatly improved the nodal support of our mitochondrial gene tree, and allowed us to make stronger inferences about relationships between lineages and the timing of divergence events.
APA, Harvard, Vancouver, ISO, and other styles
42

Hess, Paul William. "Improving the Limit on the Electron EDM: Data Acquisition and Systematics Studies in the ACME Experiment." Thesis, Harvard University, 2014. http://dissertations.umi.com/gsas.harvard:11679.

Full text
Abstract:
The ACME collaboration has completed a measurement setting a new upper limit on the size of the electron's permanent electric dipole moment (EDM). The existence of the EDM is well motivated by theories extending the standard model of particle physics, with predicted sizes very close to the current experimental limit. The new limit was set by measuring spin precession within the metastable H state of the polar molecule thorium monoxide (ThO). A particular focus here is on the automated data acquisition system developed to search for a precession phase odd under internal and external reversal of the electric field. Automated switching of many different experimental controls allowed a rapid diagnosis of major systematics, including the dominant systematic caused by non-reversing electric fields and laser polarization gradients. Polarimetry measurements made it possible to quantify and minimize the polarization gradients in our state preparation and probe lasers. Three separate measurements were used to determine the electric field that did not reverse when we tried to switch the field direction. The new bound of |de| < 8.7 × 10^-29 e cm is over an order of magnitude smaller than previous limits, and strongly limits T-violating physics at TeV energy scales.
Physics
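The connection between the measured precession phase and the EDM can be written schematically as follows, with \tau the spin-precession time and \mathcal{E}_{\mathrm{eff}} the effective intramolecular electric field of ThO; sign conventions vary between references, so this is an editorial sketch rather than the thesis's own expression:

\phi_{\mathrm{EDM}} = -\,\frac{d_e\,\mathcal{E}_{\mathrm{eff}}\,\tau}{\hbar}

Isolating the component of the accumulated phase that is odd under reversal of the electric field therefore gives direct access to d_e, which is why a non-reversing component of the field is such a dangerous systematic.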
APA, Harvard, Vancouver, ISO, and other styles
43

Ödkvist, Magnus. "Konceptstudie av kombinerad nödgenerator-trädgårdsredskap : Concept Study of a Combined Emergency Generator-Garden tool." Thesis, Linköping University, Department of Mechanical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-3510.

Full text
Abstract:
The present study concerns the concept development of a product idea that Bengt Magnuson, associate professor, had when a heavy storm caused a widespread power failure of long duration. With a basic power-producing unit one should be able to get through a power outage. The product could also be useful for other tasks if different modules are connected, e.g. a grass mower or snow-blower.
According to the product idea, the basic unit should be able to produce enough electric power for a fridge, freezer, TV, and a few electric lamps. In everyday use the basic unit could serve e.g. as a mower, when such a module is attached. An increased number of tasks would make the contraption useful and tempting for those who do not own a reserve electrical power set. The study was limited in that it did not cover the construction and characteristics of the power set itself, as such systems already exist; existing power sets are discussed with regard to size, weight, and output in order to estimate their suitability for the base unit. Nor was a consumer survey performed, as the study is intended to produce a basis for such an enquiry.
During the concept development, five ideas were compared in order to reach the best solution. This solution was compared to the alternative solutions which were brought forward during the critical scrutiny of the product idea.
The result was that the product would deter buyers due to its considerable weight. Even if the weight could be reduced by making the generator unit lighter, two alternative solutions were considered better. One of them would place an alternator on a modified mower with an extra output shaft. The other would add a generator to a system already on the market, in which the same motor can be moved between different products.
In conclusion: one should not continue further development of the original product idea but instead pursue one of the alternative solutions and perform a consumer survey on it. This is important, as a need for a reserve electricity supply seems to exist for many people outside the central urban areas.
APA, Harvard, Vancouver, ISO, and other styles
44

Ngo, Ho Anh Khoa. "Generative Probabilistic Alignment Models for Words and Subwords : a Systematic Exploration of the Limits and Potentials of Neural Parametrizations." Electronic Thesis or Diss., université Paris-Saclay, 2021. http://www.theses.fr/2021UPASG014.

Full text
Abstract:
Alignment consists of establishing a mapping between units in a bitext, combining a text in a source language and its translation in a target language. Alignments can be computed at several levels: between documents, between sentences, between phrases, between words, or even between smaller units when one of the languages is morphologically complex, which implies aligning fragments of words (morphemes). Alignments can also be considered between more complex linguistic structures such as trees or graphs. This is a complex, under-specified task that humans accomplish with difficulty. Its automation is a notoriously difficult problem in natural language processing, historically associated with the first probabilistic word-based translation models. The design of new models for natural language processing, based on distributed representations computed by neural networks, allows us to question and revisit the computation of these alignments. This research project, therefore, aims to comprehensively understand the limitations of existing statistical alignment models and to design neural models that can be learned without supervision to overcome these drawbacks and to improve the state of the art in terms of alignment accuracy.
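As an illustration of the generative probabilistic alignment models the thesis starts from, here is a minimal IBM Model 1 expectation-maximisation sketch in Python on a toy bitext; the corpus is hypothetical, and real alignment models add fertility, distortion, or neural parametrizations on top of this idea:

from collections import defaultdict

bitext = [(["the", "house"], ["la", "maison"]),
          (["the", "car"], ["la", "voiture"])]

t = defaultdict(lambda: 0.25)  # uniform init of t(f | e)

for _ in range(10):
    count = defaultdict(float)
    total = defaultdict(float)
    for src, tgt in bitext:
        for f in tgt:
            # E-step: each target word spreads its expected alignment
            # mass over the source words of its sentence
            norm = sum(t[(f, e)] for e in src)
            for e in src:
                c = t[(f, e)] / norm
                count[(f, e)] += c
                total[e] += c
    # M-step: renormalise translation probabilities per source word
    for (f, e), c in count.items():
        t[(f, e)] = c / total[e]

print(round(t[("maison", "house")], 3))  # converges toward 1.0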
APA, Harvard, Vancouver, ISO, and other styles
45

Önnered, Simon. "Generating innovative ideas through systematic literature review and research synthesis : A design of a practical methodological framework for literature review." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-54999.

Full text
Abstract:
This is an action-oriented study aimed at designing a practical methodology for generating evidence-backed solutions to practical problems by means of literature review. Three iterations of systematic review are applied, evaluating different search strategies and reporting structures, to provide a framework for an ideation technique. The result is an adaptation of a previously used framework that can be deployed to different extents and appears to yield design propositions alongside individual interventions.
APA, Harvard, Vancouver, ISO, and other styles
46

Miralles, Tena Ignacio. "Analysis and Development of a Platform for Generating Context-Aware Apps for Mental Health." Doctoral thesis, Universitat Jaume I, 2019. http://hdl.handle.net/10803/668340.

Full text
Abstract:
This work presents research carried out on the use of context-aware technologies and their application to the field of mental health. It starts by performing a review of mobile technologies used in psychological interventions and extracting the characteristics and disorders most discussed; it then proposes considerations to take into account in the development of this type of technology to increase its chances of success, focusing on three areas of study: context, mental health, and technology. The next contribution is the process of developing a platform that allows therapists to create their own mobile applications with geolocation in order to customize their own intervention tools; the development is based on the lessons learned in the first two contributions, and a hypothetical case of depression is described as an example intervention. Finally, the platform is validated with three patients suffering from two different disorders: panic disorder and agoraphobia, and gambling disorder.
APA, Harvard, Vancouver, ISO, and other styles
47

Ergun, Eser. "Rethinking The Architectural Design Process Through Its Computable Body Of Knowledge." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609577/index.pdf.

Full text
Abstract:
This thesis treats the architectural design process as a systematic study, in which knowledge is stored, organized, and operated on by computational methods. From this perspective, the study explores efforts to systemize the architectural design process. Firstly, the focus is on the early approaches to systemizing design in the Design Methods Movement. The thesis identifies and evaluates the use of a number of critical concepts in this movement and in recent architectural practice, in order to trace the development and transformation of design methods in terms of computing knowledge in a systematic way. The thesis evaluates the features that make design systematic within the Design Methods Movement and inquires whether features such as complexity, hierarchy, feedback loops, and selection are influential in recent computational design methods in architecture. The thesis looks into two generative design methods, namely evolutionary design and shape grammars, which have been studied by designers since the 1960s, the start of the Design Methods Movement. These two methods exemplify current systematic approaches to design, and, according to the thesis, they are instances of how recent architecture employs the features discussed as characteristic of the Design Methods Movement.
APA, Harvard, Vancouver, ISO, and other styles
48

Peglow, Natalie Marion Elisabeth [author]. "Systematik zur Bewertung von Varianten in der Angebotsphase von Common-Rail Pumpen der automobilen Zulieferindustrie auf Basis des Modells der PGE - Produktgenerationsentwicklung = Systematics for Evaluation of Variants in the Quotation Phase of Common-Rail Pumps of the Automotive Supplier Industry on the Basis on the Model of PGE - Product Generation Engineering / Natalie Marion Elisabeth Peglow." Karlsruhe : KIT-Bibliothek, 2021. http://d-nb.info/1235072363/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Sen, Isabelle. "Stillasittande i tid hos barn och ungdomar : ”Today’s generation of children will be the first for over a century of whom life expectancy falls” : en systematisk litteraturstudie." Thesis, Gymnastik- och idrottshögskolan, GIH, Institutionen för idrotts- och hälsovetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:gih:diva-4103.

Full text
Abstract:
Aim: The aim of this systematic literature review was to determine whether physical inactivity/sedentary time among children and adolescents has changed over the past 15 years. Drawing on a set of studies, the review examined sedentary time among children and adolescents from different parts of the world. Research question: What has the trend in sedentary time among children and adolescents looked like between 1997 and 2012? Method: The literature search was carried out in the PubMed database during the spring term of 2015. The searches were based on detailed search terms in various combinations, and inclusion and exclusion criteria were formulated to identify relevant studies. A total of eight studies were included and then underwent a quality assessment using a modified STROBE model, after which the studies were ranked by the quality they exhibited. Results: According to the modified STROBE model, more than half of the articles were of high quality, while the rest were considered low. The studies assessed as high quality had a mean sedentary time of 672 minutes/day, while those of low quality averaged 553 minutes/day. The results also showed that between 1997 and 2012, sedentary behaviour had increased by two hours and 40 minutes; this difference is judged to be significantly positive. Conclusion: The evidence that sedentary time has increased among children and adolescents over the years is weak. The result of this literature review was that sedentary time had increased to some extent, with a positive significant difference between 1997 and 2012, when the studies are compared with one another. More research in this area is needed, as there are too few studies using objective measurement that extend further back in time than 1997.
APA, Harvard, Vancouver, ISO, and other styles
50

Ridout, Kate E. "Genome-wide analysis of selection in mammals, insects and fungi." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:5a894760-9240-4e79-a50f-37547f108a00.

Full text
Abstract:
Characterising and understanding factors that affect the rate of molecular evolution in proteins has played a major part in the development of evolutionary theory. The early analyses of amino acid substitutions stimulated the development of the neutral theory of molecular evolution, which later evolved into the nearly neutral theory. More recent work has led to a better understanding of the role selection plays at the molecular level, but there is still limited understanding of how higher levels of protein organisation affect the way natural selection acts. The investigation of this question is the central aim of this thesis, which is addressed via the analysis of selective pressures in secondary protein structures in insects, mammals and fungi. The analyses for the first two groups were conducted using publicly available datasets. To conduct the analyses in fungi, genome sequence data from the fungal genus Microbotryum (sequenced in our laboratory) was assembled and annotated, resulting in the development of a number of bioinformatics tools which are described here. The fungal, insect and mammalian datasets were interrogated with regard to a number of structural features, such as protein secondary structure, position of a site with regard to adaptively evolving sites, hydropathy and solvent accessibility. These features were correlated with the signals of positive and purifying selection detected using phylogenetic maximum likelihood and Bayesian approaches. I conclude that all of the factors examined can have an effect on the rate of molecular evolution. In particular, disordered and hydrophilic regions of the protein are found to experience fewer physicochemical constraints and contain a higher proportion of adaptively evolving sites. It is also revealed that positively selected residues are 'clustered' together spatially, and these trends persist in the three taxa. Finally, I show that this variation in adaptive evolution is a result of both selective events and physicochemical constraint.
APA, Harvard, Vancouver, ISO, and other styles