
Dissertations / Theses on the topic 'Predictive risk modeling'


Consult the top 50 dissertations / theses for your research on the topic 'Predictive risk modeling.'


1

Vanichbuncha, Tita. "Risk Factors and Predictive Modeling for Aortic Aneurysm." Thesis, Linköpings universitet, Statistik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-80391.

Full text
Abstract:
In 1963–1965, a large-scale health screening survey was undertaken in Sweden, and this data set was linked to data from the national cause of death register. The data set involved more than 60,000 participants whose age at death was less than 80 years. During the follow-up period until 2007, a total of 437 participants (338 males and 99 females) died from aortic aneurysm. Survival analysis, the continuation ratio model, and logistic regression were applied in order to identify significant risk factors. The Cox regression, after stratification for AGE, revealed that SEX, Blood Diastolic Pressure (BDP), and Beta-lipoprotein (BLP) were the most significant risk factors, followed by Cholesterol (KOL), Sialic Acid (SIA), height, Glutamic Oxaloacetic Transaminase, Urinary glucose (URIN_SOC), and Blood Systolic Pressure (BSP). Moreover, SEX and BDP were found to be risk factors in almost every age group, and BDP was strongly significant in both the male and female subgroups. The data set was divided into two parts, 70 percent for the training set and 30 percent for the test set, in order to find the best technique for predicting aortic aneurysm. Five techniques were implemented: Cox regression, the continuation ratio model, logistic regression, a back-propagated artificial neural network, and a decision tree. The performance of each technique was evaluated using the area under the receiver operating characteristic curve. In this study, the continuation ratio model and logistic regression outperformed the other techniques.
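To make the evaluation pipeline concrete, here is a minimal sketch of the 70/30 split and AUC comparison for one of the five techniques (logistic regression) on simulated data; the column names and effect sizes are illustrative stand-ins, not the Swedish screening data.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the screening data: two of the reported risk factors.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "SEX": rng.integers(0, 2, n),   # 1 = male (illustrative coding)
    "BDP": rng.normal(85, 12, n),   # blood diastolic pressure
})
logit = -8 + 1.2 * df["SEX"] + 0.05 * df["BDP"]
df["aneurysm_death"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# 70/30 split and AUC evaluation, as described in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(
    df[["SEX", "BDP"]], df["aneurysm_death"], test_size=0.30, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```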
APA, Harvard, Vancouver, ISO, and other styles
2

Ding, Xiuhua. "MODELING DEMENTIA RISK, COGNITIVE CHANGE, PREDICTIVE RULES IN LONGITUDINAL STUDIES." UKnowledge, 2016. http://uknowledge.uky.edu/epb_etds/9.

Full text
Abstract:
Dementia is increasingly recognized as a major public health problem worldwide, and prevention and treatment strategies are in critical need. Research on dementia today usually features complex longitudinal studies, which provide extensive information but also pose challenges for statistical methodology. The purpose of this dissertation research was to apply statistical methodology in the field of dementia to strengthen the understanding of dementia from three perspectives: 1) application of statistical methodology to investigate the association between potential risk factors and incident dementia; 2) application of statistical methodology to analyze changes over time, or trajectories, in cognitive tests and symptoms; and 3) application of statistical learning methods to predict future development of dementia. The Prevention of Alzheimer's disease with Vitamin E and Selenium trial (PREADViSE; 7,547 subjects included) and the Alzheimer's Disease Neuroimaging Initiative (ADNI; 591 participants included) were used in this dissertation. The first study, "Self-reported sleep apnea and dementia risk: Findings from the PREADViSE Alzheimer's disease prevention trial," shows that a self-reported baseline history of sleep apnea was borderline significantly associated with risk of dementia after adjustment for confounding. Stratified analysis by APOE ε4 carrier status showed that baseline history of sleep apnea was associated with significantly increased risk of dementia in APOE ε4 non-carriers. The second study, "Comparison of trajectories of episodic memory for over 10 years between baseline normal and MCI ADNI subjects," shows that an estimated 30% of subjects who were normal at baseline and assigned to Groups 3 and 6 stayed stable for over 9 years, while normal subjects assigned to Group 1 (18.18%) and Group 5 (16.67%) were more likely to progress to dementia. In contrast to the groups identified for normal subjects, all trajectory groups for subjects with MCI at baseline showed a tendency to decline. The third study, "Comparison between neural network and logistic regression in PREADViSE trial," demonstrates that a neural network has slightly better predictive performance than logistic regression and can also reveal complex relationships among covariates: the effect of years of education on the response variable depends on age, APOE ε4 allele status, and memory change.
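As a concrete illustration of the third study's comparison, the sketch below fits a logistic regression and a small multilayer perceptron to the same data and compares test-set AUCs; the data are synthetic, not the PREADViSE cohort.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic cohort standing in for the trial data.
X, y = make_classification(n_samples=2000, n_features=6, n_informative=4,
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=1),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    print(name, round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
```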
APA, Harvard, Vancouver, ISO, and other styles
3

Villa, Zapata Lorenzo Andrés. "Predictive Modeling Using a Nationally Representative Database to Identify Patients at Risk of Developing Microalbuminuria." Diss., The University of Arizona, 2014. http://hdl.handle.net/10150/333040.

Full text
Abstract:
Background: Predictive models allow clinicians to more accurately identify higher- and lower-risk patients and make more targeted treatment decisions, which can help improve efficiency in health systems. Microalbuminuria (MA) is a condition characterized by the presence of albumin in the urine below the threshold detectable by a standard dipstick. Its presence is understood to be an early marker for cardiovascular disease. Therefore, identifying patients at risk for MA and intervening to treat or prevent conditions associated with MA, such as high blood pressure or high blood glucose, may support cost-effective treatment. Methods: The National Health and Nutrition Examination Survey (NHANES) was utilized to create predictive models for MA. This database includes clinical, medical and laboratory data. The dataset was split into thirds; one-third was used to develop the model, while the other two-thirds were utilized to validate the model. Univariate logistic regression was performed to identify variables related to MA. Stepwise multivariate logistic regression was performed to create the models. Model performance was evaluated using three criteria: 1) receiver operating characteristic (ROC) curves; 2) pseudo R-squared; and 3) goodness of fit (Hosmer-Lemeshow). The predictive models were then used to develop risk scores. Results: Two models were developed using variables that had significant correlations in the univariate analysis (p-value<0.05). For Model A, variables included in the final model were: systolic blood pressure (SBP); fasting glucose; C-reactive protein; blood urea nitrogen (BUN); and alcohol consumption. For Model B, the variables were: SBP; glycohemoglobin; BUN; smoking status; and alcohol consumption. Both models performed well in the creation dataset and no significant difference between the models was found when they were evaluated in the validation set. A 0-18 risk score was developed utilizing Model A, and the predictive probability of developing MA was calculated. Conclusion: The predictive models developed provide new evidence about which variables are related to MA and may be used by clinicians to identify at-risk patients and to tailor treatment. Furthermore, the risk score developed using Model A may allow clinicians to more easily measure patient risk. Both predictive models will require external validation before they can be applied to other populations.
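One common way to turn a fitted logistic model into an integer risk score of the kind described here is to scale and round the coefficients. The sketch below shows that heuristic on simulated data; the variables, effect sizes, and scaling constant are illustrative assumptions, not the dissertation's Model A.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated patients with three of Model A's predictor types.
rng = np.random.default_rng(2)
n = 3000
X = np.column_stack([
    rng.normal(125, 15, n),   # systolic blood pressure
    rng.normal(100, 20, n),   # fasting glucose
    rng.normal(14, 5, n),     # blood urea nitrogen
])
logit = -9 + 0.03 * X[:, 0] + 0.02 * X[:, 1] + 0.05 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
# Heuristic scoring: coefficient per 1-SD increase, divided by an arbitrary
# scaling constant (0.3 here) and rounded to integer points.
points = np.round(model.coef_[0] * X.std(axis=0) / 0.3).astype(int)
print("points per 1-SD increase:", dict(zip(["SBP", "glucose", "BUN"], points)))
```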
APA, Harvard, Vancouver, ISO, and other styles
4

Villaume, Erik. "Predicting customer level risk patterns in non-life insurance." Thesis, KTH, Matematisk statistik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-103590.

Full text
Abstract:
Several models for predicting future customer profitability early in the customer life-cycle in the property and casualty business are constructed and studied. The objective is to model risk at the customer level using input data available early in a private consumer's lifespan. Two retained models, one using a generalized linear model and the other a multilayer perceptron (a special form of artificial neural network), are evaluated using actual data. Numerical results show that differentiation on estimated future risk is most effective for customers with the highest claim frequencies.
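The GLM side of such a comparison is typically a Poisson frequency model with a log link and an exposure offset. A minimal sketch on simulated policies follows; the single predictor and effect size are assumptions for illustration, not the thesis's specification.

```python
import numpy as np
import statsmodels.api as sm

# Simulated policies: claim counts depend on age, with partial-year exposure.
rng = np.random.default_rng(3)
n = 4000
age = rng.uniform(18, 80, n)
exposure = rng.uniform(0.1, 1.0, n)           # policy-years observed
rate = np.exp(-2.0 + 0.015 * (60 - age))      # younger drivers claim more here
claims = rng.poisson(rate * exposure)

# Poisson GLM with log link; log(exposure) enters as an offset.
X = sm.add_constant(age)
glm = sm.GLM(claims, X, family=sm.families.Poisson(),
             offset=np.log(exposure)).fit()
print(glm.summary())
```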
APA, Harvard, Vancouver, ISO, and other styles
5

Fonti, Mary L. "A Predictive Modeling System: Early identification of students at-risk enrolled in online learning programs." NSUWorks, 2015. http://nsuworks.nova.edu/gscis_etd/367.

Full text
Abstract:
Predictive statistical modeling shows promise in accurately predicting academic performance for students enrolled in online programs. This approach has proven effective in accurately identifying students who are at risk, enabling instructors to provide instructional intervention. While the potential benefits of statistical modeling are significant, implementations have proven to be complex, costly, and difficult to maintain. To address these issues, the purpose of this study is to develop a fully integrated, automated predictive modeling system (PMS) that is flexible, easy to use, and portable, to identify students who are potentially at risk of not succeeding in a course they are currently enrolled in. Dynamic and static variables from a student system (edX) will be analyzed to predict the academic performance of an individual student or an entire class. The PMS framework will include development of an open-source Web application, an application programming interface (API), and SQL Server Reporting Services (SSRS). The model is based on a knowledge discovery in databases (KDD) approach utilizing inductive logic programming (ILP) to analyze student data. This alternative approach to predicting academic performance has several unique advantages over predictive modeling techniques currently in use and is a promising new direction in educational research.
APA, Harvard, Vancouver, ISO, and other styles
6

Rosile, Paul A. "Modeling Biotic and Abiotic Drivers of Public Health Risk from West Nile Virus in Ohio, 2002-2006." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1405380213.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ebrahimvandi, Alireza. "Three Essays on Analysis of U.S. Infant Mortality Using Systems and Data Science Approaches." Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/96266.

Full text
Abstract:
High infant mortality (IM) rates in the U.S. have been a major public health concern for decades. Many studies have focused on understanding causes, risk factors, and interventions that can reduce IM. However, the death of an infant is the result of the interplay between many risk factors, which in some cases can be traced back to the infancy of its parents. Consequently, these complex interactions challenge the effectiveness of many interventions. The long-term goal of this study is to advance the common understanding of effective interventions for improving health outcomes and, in particular, infant mortality. To achieve this goal, I implemented systems and data science methods in three essays to contribute to the understanding of IM causes and risk factors. In the first study, the goal was to identify patterns in the leading causes of infant mortality across states that successfully reduced their IM rates. I explore the trends at the state level between 2000 and 2015 to identify patterns in the leading causes of IM. This study shows that the main driver of IM rate reduction is the preterm-related mortality rate. The second study builds on these findings and investigates the risk factors of preterm birth (PTB) in the largest obstetric population that has ever been studied in this field. By applying the latest statistical and machine learning techniques, I study the PTB risk factors that are both generalizable and identifiable during the early stages of pregnancy. A major finding of this study is that socioeconomic factors such as parent education are more important than generally known factors such as race in the prediction of PTB. This finding is significant evidence for theories like Lifecourse, which postulate that the main determinants of a health trajectory are the social scaffolding that addresses the upstream roots of health. These results point to the need for more comprehensive approaches that change the focus from medical interventions during pregnancy to the time when mothers become vulnerable to the risk factors of PTB. Therefore, in the third study, I take an aggregate approach to study the dynamics of population health that result in undesirable outcomes in major indicators like infant mortality. Based on these new explanations, I offer a systematic approach that can help in addressing adverse birth outcomes, including high infant mortality and preterm birth rates, which is the central contribution of this dissertation. In conclusion, this dissertation contributes to a better understanding of the complexities in infant mortality and health-related policies. This work contributes to the body of literature both in terms of the application of statistical and machine learning techniques and in advancing health-related theories.

The U.S. infant mortality rate (IMR) is 71% higher than the average rate for comparable countries in the Organization for Economic Co-operation and Development (OECD). High infant mortality and preterm birth rates (PBR) are major public health concerns in the U.S. A wide range of studies have focused on understanding the causes and risk factors of infant mortality and interventions that can reduce it. However, infant mortality is a complex phenomenon that challenges the effectiveness of interventions, and the IMR and PBR in the U.S. are still higher than in any other advanced OECD nation.

I believe that systems and data science methods can help in enhancing our understanding of infant mortality causes, risk factors, and effective interventions. There are more than 130 diagnoses, or causes, for infant mortality, so tracking the causes of infant mortality trends across 50 states over a long time period is very challenging. In the first essay, I focus on the medical aspects of infant mortality to find the causes that helped the reduction of infant mortality rates in certain states from 2000 to 2015. In addition, I investigate the relationship between different risk factors and infant mortality in a regression model to find significant correlations. This study provides critical recommendations to policymakers in states with high infant mortality rates and guides them on leveraging appropriate interventions. Preterm birth (PTB) is the most significant contributor to the IMR. The first study showed that a reduction in infant mortality happened in states that reduced their preterm birth rates. There exists a considerable body of literature on identifying PTB risk factors in order to find possible explanations for consistently high rates of PTB and IMR in the U.S. However, these studies have fallen short in two key areas: generalizability and the ability to detect PTB in early pregnancy. In the second essay, I investigate a wide range of risk factors in the largest obstetric population that has ever been studied in PTB research. The predictors in this study consist of a wide range of variables, from environmental (e.g., air pollution) to medical (e.g., history of hypertension) factors. Our objective is to increase the understanding of factors that are both generalizable and identifiable during the early stage of pregnancy. I implemented state-of-the-art statistical and machine learning techniques and improved the performance measures compared to previous studies. The results of this study reveal the importance of socioeconomic factors such as parent education, which can be as important as biomedical indicators like the mother's body mass index in predicting preterm delivery. The second study showed an important relationship between socioeconomic factors such as education and major health outcomes such as preterm birth. Short-term interventions that focus on improving the socioeconomic status of a mother during pregnancy have limited to no effect on birth outcomes. Therefore, we need to implement more comprehensive approaches and change the focus from medical interventions during pregnancy to the time when mothers become vulnerable to the risk factors of PTB. Hence, we use a systematic approach in the third study to explore the dynamics of health over time. This is a novel study, which enhances our understanding of the complex interactions between health and socioeconomic factors over time. I explore why some communities experience a downward spiral of health deterioration, how resources are generated and allocated, how the generation and allocation mechanisms are interconnected, and why we can see significantly different health outcomes across otherwise similar states. I use Ohio as the case study because it suffers from poor health outcomes despite having one of the best healthcare systems in the nation. The results identify the trap of health expenditure and how an external financial shock can exacerbate health and socioeconomic factors in such a community.
I demonstrate how overspending or underspending in healthcare can affect health outcomes in a society in the long term. Overall, this dissertation contributes to a better understanding of the complexities associated with major health issues in the U.S. I provide health professionals with theoretical and empirical foundations of risk assessment for reducing infant mortality and preterm birth. In addition, this study provides a systematic perspective on the issue of health deterioration that many communities in the US are experiencing, and I hope that this perspective improves policymakers' decision-making.
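The second essay's headline finding, that socioeconomic predictors can outrank familiar demographic ones, is the kind of result a feature-importance ranking makes visible. The sketch below shows that style of analysis on simulated data; the features, effect sizes, and outcome are invented for illustration and are not the study's obstetric cohort.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Simulated cohort: a binary preterm-birth outcome driven mostly by education.
rng = np.random.default_rng(4)
n = 5000
education = rng.integers(8, 21, n)          # years of parental education
bmi = rng.normal(26, 5, n)                  # mother's body mass index
race = rng.integers(0, 3, n)                # coded category, no effect here
logit = 2.0 - 0.15 * education + 0.04 * bmi
ptb = rng.random(n) < 1 / (1 + np.exp(-logit))

# Rank the predictors by random-forest importance.
X = np.column_stack([education, bmi, race])
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, ptb)
for name, imp in zip(["education", "bmi", "race"], rf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```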
APA, Harvard, Vancouver, ISO, and other styles
8

Zhai, Jian. "Modeling Firms’ Productivity and Borrowers’ Default Risk." Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/25393.

Full text
Abstract:
In this thesis, we study improvements in the precision of inefficiency estimation by considering the dependence information among the error terms in models of production. We propose that production systems that account for technical and allocative inefficiencies offer a natural way to model dependence using vine copulas. We construct such vine copulas using a recently proposed family of bivariate copulas (the APS-2 copula), which permits dependence between the magnitude (but not the sign) of the allocative inefficiency and the magnitude of the technical inefficiency, together with a Gaussian copula. We show how to estimate such models and argue that they better reflect dependencies that arise in practice. Following that, we study around 200 features of personal loan customers and apply machine learning models to see how these characteristics can help with risk measurement. For personal loan providers, the costs of different misclassification errors differ, and they also differ from one example to another. We therefore propose an example-dependent cost-sensitive gradient boosting model and apply it to a personal loan dataset.
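One simple approximation of example-dependent cost sensitivity, not necessarily the thesis's exact method (which modifies the boosting loss itself), is to weight each training example by its own misclassification cost. A sketch on synthetic loans, with invented cost assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Imbalanced synthetic loan book: y == 1 marks a default.
X, y = make_classification(n_samples=3000, weights=[0.9], random_state=5)
loan_amount = np.random.default_rng(5).uniform(1_000, 50_000, len(y))

# Assumed per-example costs: missing a default costs the exposure;
# rejecting a good loan costs (say) 10% of it in lost business.
cost = np.where(y == 1, loan_amount, 0.1 * loan_amount)

gbm = GradientBoostingClassifier(random_state=5)
gbm.fit(X, y, sample_weight=cost)   # costs enter as sample weights
print("mean predicted default prob:", gbm.predict_proba(X)[:, 1].mean())
```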
APA, Harvard, Vancouver, ISO, and other styles
9

Mesgarpour, Mohsen. "Predictive risk modelling of hospital emergency readmission, and temporal comorbidity index modelling using machine learning methods." Thesis, University of Westminster, 2017. https://westminsterresearch.westminster.ac.uk/item/q3031/predictive-risk-modelling-of-hospital-emergency-readmission-and-temporal-comorbidity-index-modelling-using-machine-learning-methods.

Full text
Abstract:
This thesis considers applications of machine learning techniques to hospital emergency readmission and comorbidity risk problems, using healthcare administrative data. The aim is to introduce generic and robust solution approaches that can be applied to different healthcare settings. Existing solution methods and techniques for predictive risk modelling of hospital emergency readmission and comorbidity risk are reviewed. Several modelling approaches, including logistic regression, the Bayes Point Machine, random forests and deep neural networks, are considered. Firstly, a framework is proposed for pre-processing hospital administrative data, including data preparation, feature generation and feature selection. Then, the Ensemble Risk Modelling of Hospital Readmission (ERMER) is presented, a generative ensemble model of hospital readmission risk. After that, the Temporal-Comorbidity Adjusted Risk of Emergency Readmission (T-CARER) is presented for identifying very sick comorbid patients. A random forest and a deep neural network are used to model risks of temporal comorbidity, operations and complications of patients using the T-CARER. The computational results and benchmarking are presented using real data from Hospital Episode Statistics (HES) with several samples across a ten-year period. The models select features from a large pool of generated features, add temporal dimensions into the models, and provide highly accurate and precise models of problems with complex structures. The performance of all the models has been evaluated across different timeframes, sub-populations and samples, as well as against previous models.
APA, Harvard, Vancouver, ISO, and other styles
10

Atan, Ismail Bin. "Stochastic modelling of streamflow for predicting seasonal flood risk." Thesis, University of Newcastle Upon Tyne, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.242379.

Full text
Abstract:
Hydrological time series are often asymmetric in time, insomuch as rises are more rapid than recessions, as well as having highly skewed marginal distributions. A two-stage transformation is proposed for deseasonalised daily average flow series. Rises are stretched, and recessions are squashed, until the series is symmetric over time. An autoregressive moving average (ARMA) model is then fitted to the natural logarithms of this new series. The residuals from the ARMA model are represented by Weibull distributions. Seasonal flood risks, as daily average flows, are estimated by simulation. However, floods are often measured as peak flows rather than daily average flows, although both measures are relevant, and the use of growth factors to allow for this is demonstrated. The method is demonstrated with 24 years of daily flows from the River Cherwell in the south of England, a 40-year record from the upper reaches of the Thames, and a 21-year record from the River Coquet in the north-east of England. Seasonal estimates of flood risk are given, and these can be conditioned on catchment wetness at the time of prediction. Comparisons with other methods which allow for time irreversibility are also made. One is ARMA models with exogenous input, in this case rainfall, which will, because of its intermittent nature, impart a natural time irreversibility to the streamflow series. A disadvantage of these models is that they require rainfall data in addition to the streamflow record. A second is the development of a class of shot noise models, which naturally generate highly time-irreversible series; here the Neyman-Scott model is used. However, despite its attractive physical interpretation, it is inevitably less flexible than the two-stage transformation because it has fewer parameters. Although it was found to provide a good fit to daily data, it is less convincing for the extremes. Overall, the two-stage transformation (TST) compared favourably with both models.
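A minimal sketch of the deseasonalise/log/ARMA chain described above, on simulated daily flows; the thesis's two-stage time-symmetry transformation is specific to the work and omitted here, and the ARMA(1,1) order is an assumption.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Simulated daily flows with an annual cycle and multiplicative noise.
rng = np.random.default_rng(6)
days = pd.date_range("2000-01-01", periods=3 * 365, freq="D")
season = 10 + 5 * np.sin(2 * np.pi * days.dayofyear / 365.25)
series = pd.Series(season * np.exp(rng.normal(0, 0.3, len(days))), index=days)

# Log-transform and remove the seasonal (day-of-year) mean, then fit ARMA(1,1).
log_flow = np.log(series)
deseason = log_flow - log_flow.groupby(series.index.dayofyear).transform("mean")
res = ARIMA(deseason, order=(1, 0, 1)).fit()
print(res.forecast(steps=30).head())   # 30-day-ahead log-anomaly forecast
```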
APA, Harvard, Vancouver, ISO, and other styles
11

Obajemu, Olusayo. "Predictive dynamic risk mapping and modelling of patients diagnosed with bladder cancer." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/14368/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Lindgren, Cory John. "Addressing the risks of invasive plants through spatial predictive modelling." Canadian Journal of Plant Science, 2010. http://hdl.handle.net/1993/18344.

Full text
Abstract:
The objective of this dissertation is to extend spatial predictive modelling for use by biosecurity agencies to help prevent the introduction of new and emerging invasive plants (i.e., pests). A critical review of international and national policy instruments found that they did not effectively articulate how spatial predictive modelling could be incorporated into the biosecurity toolbox. To determine how spatial predictive modelling could be extended, I modelled the potential distribution of Tamarix and Lythrum salicaria in Prairie Canada using a genetic algorithm. New seasonal growth data were used to interpolate a growing degree-days risk surface for L. salicaria. Models were developed using suites of predictive variables as well as different data partitioning methods, and were evaluated using different performance measures. Expert evaluation was found to be important in final model selection. The results indicated that both invasive plants have yet to reach their potential distribution in Prairie Canada. The spatial models can be used to direct risk-based surveillance efforts and to support biosecurity policy decisions. This dissertation concludes that spatial predictive modelling is an informative tool that should be incorporated into the biosecurity toolbox. A phytosanitary standard is proposed to guide toolbox development.
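Growing degree-days, the quantity interpolated into the L. salicaria risk surface, is a standard agro-climatic measure of accumulated heat. A sketch follows; the 10 °C base temperature is a common convention rather than the dissertation's value, and the season data are invented.

```python
import numpy as np

def growing_degree_days(tmax, tmin, t_base=10.0):
    """Sum of daily mean temperature excess over t_base (degrees Celsius)."""
    daily_mean = (np.asarray(tmax) + np.asarray(tmin)) / 2.0
    return float(np.sum(np.clip(daily_mean - t_base, 0.0, None)))

# Illustrative 150-day growing season.
rng = np.random.default_rng(7)
tmax = rng.normal(22, 5, 150)
tmin = tmax - rng.uniform(5, 12, 150)
print("season GDD:", round(growing_degree_days(tmax, tmin), 1))
```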
APA, Harvard, Vancouver, ISO, and other styles
13

Zhang, Jinghan. "The invasion of Smooth cordgrass (Spartina alterniflora) in China : risk assessment using spatial modeling." Thesis, Uppsala universitet, Växtekologi och evolution, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-176877.

Full text
Abstract:
Smooth cordgrass (Spartina alterniflora) is one of the harmful quarantine weeds in China. Since its first introduction to China in 1979, this alien species has spread rapidly and damaged local ecological environments. Predicting suitable new areas is an important step in managing the species and preventing further spread. In this study, Spartina alterniflora's ecological niche was modeled using the application MAXENT, with the analysis based on the species' current distribution. The investigation was two-fold. First, a large-scale global analysis (outside China) was conducted to predict suitable areas in China by comparing global and Chinese records of the species. Second, the combined records were used to predict suitable areas in Jiangsu Province. The model's accuracy was evaluated by the receiver operating characteristic (ROC) curve. The areas under the ROC curve (AUC values) were all over 0.95, which indicated high predictive ability. In the large-scale prediction, Shanghai, Zhejiang, Fujian, Guangzhou, Guangxi, and the southern parts of Wuhan, Jiangsu, and Anhui were all potentially endangered by S. alterniflora invasion. On the smaller scale, the areas prone to invasion were mostly concentrated in the southern part and some coastal areas of Jiangsu Province, where precipitation and temperature are appropriate for this grass. Because S. alterniflora has high dispersal ability and a history of human-mediated introduction, the potential distribution areas in China are considerable, and the species may invade more areas and spread faster in the future. To prevent further invasion and spread, an early eradication program should be adopted in newly invaded areas. Meanwhile, monitoring programs should also be applied in potential survival areas, especially coastal harbors, airports, and tourism areas, which are highly vulnerable to S. alterniflora invasion.
APA, Harvard, Vancouver, ISO, and other styles
14

Shaikhina, Torgyn. "Machine learning with limited information : risk stratification and predictive modelling for clinical applications." Thesis, University of Warwick, 2017. http://wrap.warwick.ac.uk/99640/.

Full text
Abstract:
The high cost, complexity and multimodality of clinical data collection restrain the datasets available for predictive modelling using machine learning (ML), thus necessitating new data-efficient approaches specifically for limited datasets. This interdisciplinary thesis focuses on clinical outcome modelling using a range of ML techniques, including artificial neural networks (NNs) and their ensembles, decision trees (DTs) and random forests (RFs), as well as classical logistic regression (LR) and Cox proportional hazards (Cox PH) models. The utility of ML for data-efficient regression, classification and survival analyses was investigated in three clinical applications, thereby exposing the common limitations inherent in patient data, such as class imbalance, incomplete samples, and, in particular, limited dataset size. The latter problem was addressed by developing a methodological framework for learning from datasets with fewer than 10 observations per predictor variable. A novel method of multiple runs overcame the volatility of NN and DT models due to limited training samples, while a surrogate data test allowed for regression model evaluation in the presence of noise due to limited dataset size. When applied to hard tissue engineering for predicting femoral fracture risk, the framework resulted in a 98.3% accurate regression NN. The framework was used to detect early rejection in antibody-incompatible kidney transplantation, achieving an 85% accurate classification DT. The third clinical task, predicting 10-year incidence of type 2 diabetes in the UK population, resulted in 70-85% accurate classification and survival models, whilst highlighting the challenges of learning with the limited information characteristic of routinely collected data. By discovering unintuitive patterns, supporting existing hypotheses and generating novel insight, the ML models developed in this research contributed meaningfully to clinical research and paved the way for data-efficient applications of ML in engineering and clinical practice.
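The "multiple runs" idea can be illustrated simply: refit the same model over many random splits of a small dataset and report the spread, rather than trusting any single volatile fit. A sketch on toy data, not the thesis's clinical sets:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A deliberately small dataset, where single fits are volatile.
X, y = make_classification(n_samples=80, n_features=10, random_state=8)

scores = []
for seed in range(100):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=seed)
    tree = DecisionTreeClassifier(random_state=seed).fit(X_tr, y_tr)
    scores.append(tree.score(X_te, y_te))

print(f"accuracy {np.mean(scores):.2f} +/- {np.std(scores):.2f} over 100 runs")
```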
APA, Harvard, Vancouver, ISO, and other styles
15

Jiang, Peng. "A Hybrid Risk Model for Hip Fracture Prediction." Diss., The University of Arizona, 2015. http://hdl.handle.net/10150/579112.

Full text
Abstract:
Hip fracture has long been considered the most serious consequence of osteoporosis, leading to chronic pain, disability, and even death. In the elderly population, femur fractures are very common: it is estimated that 50% of women aged 50 or older may experience a hip fracture in their remaining life. Hip fracture is among the most common injuries and can lead to substantial morbidity and mortality. In the US alone, over 250,000 hip fractures occur each year, and this number is expected to double by the year 2040. Statistics indicate that over 20% of people who experience a hip fracture die within one year and only 25% recover fully. Femur fractures are now becoming a major social and economic burden on the health care system. In practice, it is very difficult to predict femur fracture risk. One of the main reasons is that there is no robust and easy-to-obtain measure to quantify the strength of the bone. Clinicians use bone mineral density (BMD) as an indicator of osteoporosis and fracture risk, but several studies showed that BMD cannot be used alone to identify bone strength; in fact, the majority of patients who suffer from fractures have normal or even higher BMD scores. There are also a large number of risk factors that contribute to the occurrence of femur fracture and should be involved in predicting hip fracture risk (for example, age, weight, height, and ethnicity), and some of these factors might not have been identified yet. Thus, there is a high level of uncertainty in the clinical dataset, which makes it difficult to construct and validate a hip fracture risk prediction model. The objective of the dissertation is to construct an improved hip fracture risk prediction model. Due to the difficulty of obtaining experimental or clinical data, computational simulations might help increase the predictive ability of the risk model. In this research, the hip fracture risk model is based on a support vector machine (SVM) trained using a clinical dataset from the Women's Health Initiative (WHI). In order to improve the SVM-based hip fracture risk model, data from a fully parameterized finite element (FE) model are used to supplement the clinical dataset. This FE model allows one to simulate a wide range of geometries and material properties in the hip region, and provides a measure of risk based on mechanical quantities (e.g., strain). This dissertation presents new approaches to fusing the clinical data with the FE data in order to improve the predictive capability of the hip fracture risk prediction model. Two approaches are introduced: an "augmented space" approach and a "computational patients" approach. This work has led to the construction of a new online hip fracture risk calculator with free access.
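One simple reading of the "augmented space" idea, sketched below under stated assumptions rather than as the dissertation's actual method, is to pool clinical records with simulation outputs and add a source indicator so the SVM can weigh the two kinds of evidence. All data here are invented toys.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(9)
# Toy clinical records (e.g. BMD, age, weight) with a fracture indicator.
clinical_X = rng.normal(0, 1, (200, 3))
clinical_y = (clinical_X[:, 0] < -0.5).astype(int)
# Toy finite-element outputs labelled by a strain-based criterion.
fe_X = rng.normal(0, 1, (1000, 3))
fe_y = (fe_X @ np.array([0.8, 0.3, 0.1]) < -0.4).astype(int)

# Augmented space: pooled features plus a clinical-vs-simulated indicator.
X = np.vstack([clinical_X, fe_X])
source = np.r_[np.zeros(len(clinical_X)), np.ones(len(fe_X))]
X_aug = np.column_stack([X, source])
y = np.r_[clinical_y, fe_y]

svm = SVC(probability=True).fit(X_aug, y)
new_patient = np.array([[0.2, -1.0, 0.5, 0.0]])   # source=0 marks clinical
print("predicted fracture risk:", svm.predict_proba(new_patient)[0, 1])
```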
APA, Harvard, Vancouver, ISO, and other styles
16

Koechlin, Kathleen Marie. "Modeling childhood agricultural injury risk with composite measurement scales." Columbus, Ohio : Ohio State University, 2003. http://rave.ohiolink.edu/etdc/view?acc_num=osu1064287970.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2003. Title from first page of PDF file. Document formatted into pages; contains xxi, 308 p. : ill. (some col.). Advisor: J.R. Wilkins III, School of Public Health. Includes bibliographical references (p. 210-220).
APA, Harvard, Vancouver, ISO, and other styles
17

Xu, Chang. "Index-Based Insurance, Informal Risk Sharing, and Agricultural Yields Prediction." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1529794733186832.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Dalla Fontana, Silvia <1991>. "Credit risk modelling and valuation: testing credit rating accuracy in default prediction." Master's Degree Thesis, Università Ca' Foscari Venezia, 2017. http://hdl.handle.net/10579/9894.

Full text
Abstract:
Credit risk is a forward-looking concept, focusing on the probability of facing credit difficulties in the future. Credit difficulties are represented by the risk of not being paid for goods or services sold to customers. This kind of risk involves all companies, from the financial services industry to consumer goods. Credit risk has acquired growing importance in recent years, which have been characterized by a negative economic situation that started with the US subprime mortgage crisis and the collapse of Lehman Brothers in 2008. The financial crisis intervened before Basel II could become fully effective, and unveiled the fragilities of the financial system in general, but it also emphasised the inadequacy of both credit risk management and the connected credit rating system carried out by ECAIs. In Chapter I, starting from a historical excursus, the study deals with credit risk methods and the capability of ratings to predict firms' probability of default, taking into account both quantitative and qualitative methods and the consequent credit rating assessment. In Chapter II we focus on the trade credit insurance case. Credit insurance allows companies of any size to protect against the risk of not being paid, which consequently increases a firm's profitability thanks to higher client portfolio quality. This means that the analysis of creditworthiness covers a wide population, from SMEs to large corporates. In Chapter III we provide an empirical analysis of the accuracy of the rating system: we start by examining the distribution of the probability of default (PD) and firms' allocation in PD classes, we analyse the Gini coefficient's adequacy in measuring rating accuracy, and we present a multiple regression model based on financial indicators. Finally, we conclude with reflections and final comments.
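The Gini coefficient used here to assess rating accuracy relates to the AUC by Gini = 2·AUC − 1, so it can be computed directly from predicted default probabilities and realised defaults. A short sketch on simulated ratings:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Simulated predicted PDs and realised defaults (not real rating data).
rng = np.random.default_rng(10)
pd_scores = rng.uniform(0, 0.2, 1000)        # predicted probability of default
defaults = rng.random(1000) < pd_scores      # realised defaults

auc = roc_auc_score(defaults, pd_scores)
print(f"AUC = {auc:.3f}, Gini = {2 * auc - 1:.3f}")
```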
APA, Harvard, Vancouver, ISO, and other styles
19

Ruddy, Joshua D. "Predictive modelling of self-reported wellness and the risk of injury in elite Australian footballers." Phd thesis, Australian Catholic University, 2020. https://acuresearchbank.acu.edu.au/download/9fc2c9d8146dd478d146ed145468baa828d7ec88a9f41a7ff2222306d7768403/4540065/Ruddy_2020_Predictive_modelling_self-reportred_wellness_Australian.pdf.

Full text
Abstract:
Injuries are a common occurrence in team sports and can have significant financial, physical and psychological consequences for athletes and their sporting organisations. As such, an abundance of research has attempted to identify factors associated with the risk of injury, which is important when developing injury prevention and risk mitigation strategies. Traditionally, research has implemented reductionist approaches to identify injury risk factors. These reductionist methodologies assume that all the parts of a system (in this case, injury aetiology) can be broken down and examined individually and then summed together to represent the system as a whole. Reductionist approaches are useful in establishing associations between specific factors and the risk of injury. However, in order to predict the occurrence of injuries at an individual level, complex approaches should be implemented. In light of this, machine learning has been suggested as an appropriate method of applying complex approaches to the prediction of injuries in sport. Machine learning is a field of computer science which involves building algorithms to learn from data and make predictions without being programmed what to look for or where to look for it. Whilst machine learning cannot be used to establish causal relationships between specific factors and the occurrence of injuries, it differs from reductionist methodologies in that it has the ability to identify the complex, non-linear interactions that occur amongst risk factors. Study 1 (Chapter 4) aimed to utilise machine learning methods to predict the occurrence of hamstring strain injuries (HSIs) in elite Australian footballers. Hamstring strain injury is the most common injury in elite Australian football and three of the most consistently identified risk factors for HSI are increasing age, prior HSI and low levels of eccentric knee flexor strength. While some iterations of the predictive models achieved near perfect performance (maximum area under the curve [AUC] = 0.92), others performed worse than random chance (minimum AUC = 0.24). It was concluded that age, previous HSI and eccentric knee flexor strength data could not be used to identify Australian footballers at an increased risk of HSI with any consistency, despite these factors being highly associated with the risk of HSI. It is suggested that more observed injuries, in addition to more frequent measures of the variables included in the models, may have improved the performance of the predictive models in Study 1. To overcome the limitations acknowledged in Study 1, Study 2 (Chapter 5) investigated whether more frequent measures of the impact of prior injury (in the form of session availability), in addition to a greater number of observed injuries (albeit non-specific pathologies), improved the ability to identify injury risk. It was observed that greater session availability in the previous 7 days increased injury probabilities by up to 4.4%. Additionally, lesser session availability in the previous 84 days increased injury probabilities by up to 14.1%, only when coupled with greater availability in the previous 7 days. It was concluded that session availability may provide an informative marker of the impact of prior injury on subsequent injury risk and can be used by practitioners to guide the progression of training, particularly for athletes that are returning from long periods of injury. 
Study 1 and Study 2 implemented complex approaches in an attempt to improve injury risk identification at an individual level. Despite the findings of Study 1 and Study 2, quantifying injury risk on a daily basis remains a complex and challenging task for practitioners working in Australian football. Commonly implemented tools such as self-reported wellness questionnaires provide a much more accessible measure of athletes’ wellbeing and how they are responding to the demands of training/competition. Whilst improving the ability to estimate injury risk at an individual level is an important focus area, it may also be important to determine the level of information that more accessible and more frequently measured variables (such as self-reported wellness) provide regarding injury risk. To make this determination, however, it is also necessary to understand the factors that directly influence self-reported wellness. Accordingly, Study 3 (Chapter 6) aimed to investigate the factors that impact wellness in elite Australian footballers. Measures of external load examined on their own were able to explain changes in wellness to a large degree (root mean square error = 1.55, 95% confidence intervals = 1.52 to 1.57). However, there was a proportion of wellness that could not be explained by external loads. It is suggested that examining the interaction between external training loads and self-reported wellness may assist practitioners in their load management strategies. However, there is limited research investigating the interaction between external loads and wellness and the impact this information may have on subsequent injury risk. Accordingly, Study 4 (Chapter 7) aimed to investigate the ability of external load data, session availability data and self-reported wellness data, as well as the interaction between the three, to identify the risk of lower limb non-contact injuries in elite Australian footballers. The model with the least input variables (athlete ID and session type) displayed the highest predictive ability (AUC = 0.76, Akaike information criterion [AIC] = 479, Brier score = 0.009). The models built using external load, session availability and wellness data all displayed similar predictive ability (AUCs = 0.72 to 0.75, AICs = 477 to 478, Brier scores = 0.009 to 0.009). Despite observing higher predictive performance compared to previous research, the addition of external load, session availability and wellness data, as well as demographic and pre-season external load data, did not improve the ability to predict lower limb non-contact injuries in Study 4. Overall, this program of research displayed a limited ability to predict injuries in elite Australian football. The findings of this thesis highlight a need for a larger number of observed injuries when implementing predictive modelling strategies to identify injury risk at an individual level. Despite this, the predictive modelling strategies implemented in this thesis may assist researchers and practitioners in better understanding the relationships that exist between variables that are commonly collected, analysed and interpreted. Whilst the efficacy of complex approaches and their application in sports research may warrant further investigation, researchers and practitioners alike need to strongly consider the limitations of input data and the predictive modelling strategies used to analyse these data when conducting (as well as interpreting) future research.
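The two headline metrics in Study 4, AUC (discrimination) and the Brier score (accuracy of predicted probabilities), are computed as below; the per-session risks are simulated, and the sketch also illustrates why, with injuries this rare, a very low Brier score can coexist with modest discrimination.

```python
import numpy as np
from sklearn.metrics import brier_score_loss, roc_auc_score

# Simulated per-session injury probabilities and outcomes.
rng = np.random.default_rng(11)
p_injury = rng.uniform(0, 0.03, 10_000)      # rare-event risk per session
injured = rng.random(10_000) < p_injury

print("AUC:  ", round(roc_auc_score(injured, p_injury), 3))
print("Brier:", round(brier_score_loss(injured, p_injury), 4))
```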
APA, Harvard, Vancouver, ISO, and other styles
20

Gephart, Sheila Maria. "Validating a Neonatal Risk Index to Predict Necrotizing Enterocolitis." Diss., The University of Arizona, 2012. http://hdl.handle.net/10150/228155.

Full text
Abstract:
Necrotizing enterocolitis (NEC) is a costly and deadly disease in neonates. Composite risk for NEC is poorly understood and consensus has not been established on the relevance of risk factors. This two-phase study attempted to validate and test a neonatal NEC risk index, GutCheck(NEC). Phase I used an E-Delphi methodology in which experts (n=35) rated the relevance of 64 potential NEC risk factors. Items were retained if they achieved predefined levels of expert consensus or stability. After three rounds, 43 items were retained (CVI=.77). Qualitative analysis revealed two broad themes: individual characteristics of vulnerability and the impact of contextual variation within the NICU on NEC risk. In Phase II, the predictive validity of GutCheck(NEC) was evaluated using a sample from the Pediatrix BabySteps Clinical Data Warehouse (CDW). The sample included infants born <1500 grams, before 36 weeks, and without congenital anomalies or spontaneous intestinal perforation (N=58,818, of which n=35,005 for empiric derivation and n=23,813 for empiric validation). Backward stepwise likelihood-ratio regression was used to reduce the number of predictive factors in GutCheck(NEC) to 11 and to derive empiric weights. Items in the final GutCheck(NEC) were gestational age, history of a transfusion, NICU-specific NEC risk, late onset sepsis, multiple infections, hypotension treated with inotropic medications, Black or Hispanic race, outborn status, metabolic acidosis, human milk feeding on both day 7 and day 14 (reduces risk), and probiotics (reduces risk). Discrimination was fair in the case-control sample (AUC=.67, 95% CI .61-.73) but better in the validation set (AUC=.76, 95% CI .75-.78) and best for surgical NEC (AUC=.84, 95% CI .82-.84) and infants who died from NEC (AUC=.83, 95% CI .81-.85). A GutCheck(NEC) score of 33 (range 0-58) yielded a sensitivity of .78 and a specificity of .74 in the validation set. Intra-individual reliability was acceptable (ICC(19) = .97, p < .001). Future research is needed to repeat this procedure in infants between 1500 and 2500 grams, complete psychometric testing, and explore unit variation in NEC rates using a comprehensive approach.
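Reading off sensitivity and specificity at a score cutoff, as done here for the threshold of 33, works as in the sketch below; the score distribution and risk curve are simulated, not the GutCheck(NEC) data.

```python
import numpy as np

# Simulated 0-58 composite scores with risk rising in the score.
rng = np.random.default_rng(12)
scores = rng.integers(0, 59, 5000)
risk = 0.10 / (1 + np.exp(-(scores - 40) / 5.0))
nec = rng.random(5000) < risk

# Threshold rule: flag everyone scoring at or above the cutoff.
flagged = scores >= 33
sensitivity = (flagged & nec).sum() / nec.sum()
specificity = (~flagged & ~nec).sum() / (~nec).sum()
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")
```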
APA, Harvard, Vancouver, ISO, and other styles
21

Hartley, Stephen William. "Bayesian methods for multivariate modeling of pleiotropic single-nucleotide polymorphisms and genetic risk prediction." Thesis, Boston University, 2012. https://hdl.handle.net/2144/12416.

Full text
Abstract:
Genome-wide association studies (GWAS) have identified numerous associations between genetic loci and individual phenotypes; however, relatively few GWAS have attempted to detect pleiotropic associations, in which loci are simultaneously associated with multiple distinct phenotypes. In this thesis, we show that pleiotropic single nucleotide polymorphism (SNP) associations can be directly modeled via the construction of simple Bayesian networks, and that these models can be applied to produce Bayesian classifiers that leverage pleiotropy to improve genetic risk prediction. We then demonstrate the effectiveness of these methods in both simulated and real data. The proposed method includes two phases: first, SNPs are fitted to models and ranked by the strength of evidence of association; second, the final feature set and classification rule are selected using cross-validation prediction. The final classifiers can then be used to test the validity of the candidate genes as well as for diagnostic and prognostic purposes. Multiple genetic risk prediction methods were developed and tested. Multiple phenotypes can be predicted jointly; alternatively, a phenotype of interest can be predicted either conditionally, given known secondary phenotype status, or marginally, across unknown secondary phenotype statuses. Furthermore, prediction can be carried out using either single classifiers or ensembles of classifiers. To demonstrate the capabilities and limitations of these methods, several hundred GWAS were simulated under various effect strengths, sample sizes, and phenotype distributions. Multiple prediction methods, search algorithms, and optimization loss functions were tested and compared. Next, we applied these methods to the Cooperative Study of Sickle Cell Disease (CSSCD) dataset, examining the genetic basis for cerebrovascular accident (CVA) and fetal hemoglobin level (HbF). To demonstrate the effectiveness of the model selection and classification, CVA status was predicted in validation datasets from several other studies. The model search and classification methods described in this thesis are capable of efficient pleiotropic locus identification and phenotype classification under a variety of conditions. These methods are robust and computationally efficient, providing a powerful new approach for detecting and modeling pleiotropic disease loci.
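As a drastically simplified stand-in for the thesis's Bayesian-network classifiers (naive Bayes is a Bayesian network with a fixed star topology, not the structures fitted here), the sketch below classifies a phenotype from SNP genotypes coded 0/1/2 and evaluates it by cross-validation, echoing the second phase. All data are simulated.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import CategoricalNB

# Simulated genotypes (minor-allele counts) and a phenotype driven by two SNPs.
rng = np.random.default_rng(13)
n, n_snps = 1000, 20
genotypes = rng.integers(0, 3, (n, n_snps))
logit = -1.0 + 0.8 * genotypes[:, 0] + 0.6 * genotypes[:, 1]
phenotype = rng.random(n) < 1 / (1 + np.exp(-logit))

nb = CategoricalNB()
print("CV accuracy:", cross_val_score(nb, genotypes, phenotype, cv=5).mean())
```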
APA, Harvard, Vancouver, ISO, and other styles
22

Taylor, J. G. "Predicting the microbial risk in flooded London dwellings using microbial, hygrothermal, and GIS modelling." Thesis, University College London (University of London), 2012. http://discovery.ucl.ac.uk/1378606/.

Full text
Abstract:
With a changing climate, London is expected to experience more frequent periods of intense rainfall and tidal surges, which will lead to an increase in the risk of flooding. Floodwater may deposit harmful microorganisms on building surfaces, while damp indoor environments in flooded dwellings can support the growth of microorganisms including mould, bacteria, and protozoa. This thesis investigates possible flood-borne and damp-related pathogens in flooded London dwellings, and the potential duration of microbial contamination risk following a flood event. Microbiological laboratory work and models are used to characterise microbial risk within flooded dwellings. Dwelling archetypes representative of the London housing stock are developed, and hygrothermal simulation techniques are used to model the flooding and drying behaviour of the archetypes under different scenarios in order to predict the duration of damp and microbial risk inside typical dwellings. The results of the combined biological and hygrothermal models are mapped alongside existing flood risk maps in order to predict areas in London susceptible to long-term microbial risk or prolonged displacement following a flood. Highlights of the research findings include the following: (i) the persistence of bacterial contaminants on flooded materials is related to the type of floodwater, the drying conditions (including temperature and drying rate), and the material drying characteristics; (ii) different dwellings in London have different drying behaviours due to their built form and dominant wall types, with modern purpose-built flats the most prone to long-term damp and microbial risk following a flood event; (iii) the flood height, external weather, and internal conditions (including heating and ventilation) can have a major impact on the length of time a dwelling will remain at risk of microbial contamination; and (iv) the concentration of properties vulnerable to long-term microbial exposure following major flood events is highest in areas of South and East London, particularly Southwark.
APA, Harvard, Vancouver, ISO, and other styles
23

Geroukis, Asterios, and Erik Brorson. "Predicting Insolvency : A comparison between discriminant analysis and logistic regression using principal components." Thesis, Uppsala universitet, Statistiska institutionen, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-243289.

Full text
Abstract:
In this study, we compare two statistical techniques, logistic regression and discriminant analysis, to see how well they classify companies based on clusters (made from the solvency ratio) using principal components as independent variables. The principal components are made from different financial ratios. We use cluster analysis to find groups with low, medium, and high solvency ratios among 1,200 different companies found on the NASDAQ stock market, and use this as an a priori definition of risk. The results show that logistic regression outperforms discriminant analysis in classifying all of the groups except for the middle one. We conclude that this is in line with previous studies.
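The pipeline compared here (principal components of financial ratios fed to discriminant analysis and to logistic regression) can be sketched as below; the data are synthetic, with three classes standing in for the solvency groups, and the number of components is an assumption.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic "financial ratios" with three solvency classes.
X, y = make_classification(n_samples=1200, n_features=12, n_informative=6,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=14)

for name, clf in [("discriminant analysis", LinearDiscriminantAnalysis()),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    pipe = make_pipeline(PCA(n_components=4), clf)   # PCs as the predictors
    print(name, round(cross_val_score(pipe, X, y, cv=5).mean(), 3))
```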
APA, Harvard, Vancouver, ISO, and other styles
24

Näsström, Jens. "Volatility Modelling of Asset Prices using GARCH Models." Thesis, Linköping University, Department of Electrical Engineering, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-1625.

Full text
Abstract:
The objective of this master's thesis is to investigate the possibility of predicting the risk of stocks in financial markets. The data used for model estimation was gathered from different branches and different European countries. The four data series used in the estimation are price series from Münchner Rück, Suez-Lyonnaise des Eaux, Volkswagen, and OMX, a Swedish stock index. The risk prediction is done with univariate GARCH models, which are estimated and validated for these four data series.

Conclusions are drawn regarding different GARCH models, their numbers of lags, and their distributions. The model that performs best out-of-sample is the APARCH model, but the standard GARCH is also a good choice. The use of non-normal distributions is not clearly supported. The results of this master's thesis could be used in option pricing, hedging strategies, and portfolio selection.
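A minimal GARCH(1,1) fit and one-step-ahead volatility forecast, using the `arch` package on simulated returns rather than the thesis's price series; the Student-t error distribution is one of the non-normal choices the thesis weighs.

```python
import numpy as np
from arch import arch_model

# Simulated daily percentage returns with fat tails.
rng = np.random.default_rng(15)
returns = 0.8 * rng.standard_t(df=6, size=1500)

am = arch_model(returns, vol="GARCH", p=1, q=1, dist="t")
res = am.fit(disp="off")
forecast = res.forecast(horizon=1)
print("next-day variance forecast:", float(forecast.variance.iloc[-1, 0]))
```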
APA, Harvard, Vancouver, ISO, and other styles
25

DallaPiazza, Kristin Lee. "A Global Approach to Disease Prevention: Predicting High Risk Areas for West Nile Infection in the US." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/33083.

Full text
Abstract:
WN virus has spread for over 60 years, creating endemic and epidemic areas throughout Africa, Asia, and Europe and affecting human, bird, and equine populations. Its 1999 appearance in New York shows the ability of the virus to cross barriers and travel great distances, emerging into new territories previously free of infection. Spreading much faster than expected, WN virus has infected thousands of birds, equines, and humans throughout the conterminous United States (US). Case and serological studies performed in the Eastern hemisphere prior to 1999 offer detailed descriptions of endemic and epidemic locations with regard to geography, land cover, land use, population, climate, and weather patterns. Based on the severity of WN activity within each study area, the patterns associated with these environmental factors allow for the identification of values associated with different levels of risk. We can then model the landscape of the disease within the US and identify areas of high risk for infection. State and county public health officials can use this model as a decision-making tool to allocate funding for disease prevention and control. Dynamic factors associated with increased transmission, such as above-average temperature and precipitation, can be closely monitored, and measures of prevention can be implemented when necessary. In turn, detailed information from higher-resolution analyses can be documented in an online GIS (Geographic Information System) that would contribute to a global collaboration on outbreaks and prevention of disease.
APA, Harvard, Vancouver, ISO, and other styles
26

Heimbigner, Stephen Matthew. "Implications in Using Monte Carlo Simulation in Predicting Cardiovascular Risk Factors among Overweight Children and Adolescents." Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/iph_theses/11.

Full text
Abstract:
The prevalence of overweight and obesity among children and adolescents has increased considerably over the last few decades. As a result, increasing numbers of American children are developing multiple risk factors for cardiovascular disease, type II diabetes, hyperinsulinemia, hypertension, dyslipidemia and hepatic steatosis. This thesis examines the use of Monte Carlo computer simulation for understanding risk factors associated with childhood overweight. A computer model is presented for predicting cardiovascular risk factors among overweight children and adolescents based on BMI levels. The computer model utilizes probabilities from the 1999 Bogalusa Heart Study authored by David S. Freedman, William H. Dietz, Sathanur R. Srinivasan and Gerald S. Berenson. The thesis examines strengths, weaknesses and opportunities associated with the developed model. Utilizing this approach, organizations can insert their own probabilities and customized algorithms for predicting future events.
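The Monte Carlo approach described here amounts to drawing many simulated children and applying BMI-conditional risk probabilities, as in the sketch below. The probabilities are placeholders, not the Bogalusa Heart Study estimates the thesis uses; an organization would substitute its own, as the abstract suggests.

```python
import numpy as np

rng = np.random.default_rng(16)
n = 100_000

# Assign BMI categories and apply a per-category risk-factor probability.
bmi_category = rng.choice(["normal", "overweight", "obese"], size=n,
                          p=[0.65, 0.20, 0.15])
p_risk_factor = {"normal": 0.05, "overweight": 0.20, "obese": 0.45}
probs = np.array([p_risk_factor[c] for c in bmi_category])
has_risk_factor = rng.random(n) < probs

# Simulated prevalence of the cardiovascular risk factor by BMI category.
for cat in ["normal", "overweight", "obese"]:
    mask = bmi_category == cat
    print(cat, round(has_risk_factor[mask].mean(), 3))
```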
APA, Harvard, Vancouver, ISO, and other styles
27

Pearson, Dominic. "Breaking the 'glass ceiling' of risk prediction in recidivism : an application of connectionist modelling to offender data." Thesis, University of York, 2011. http://etheses.whiterose.ac.uk/2574/.

Full text
Abstract:
The present thesis explored the capability of connectionist models to break through the 'glass ceiling' of accuracy currently in operation in recidivism prediction (e.g., Yang, Wong, & Coid, 2010). Regardless of the inclusion of dynamic items, risk measures rarely exceed 0.75 in terms of the area under the receiver operating characteristic curve (AUC) (Hanley & McNeil, 1982). This may reflect the emphasis of multiple regression equations on main effects of a few key variables tapping long-term anti-social potential. Connectionist models, not used in criminal justice, represent a promising alternative means of combining predictors given their ability to model interactions automatically. To promote learning from other fields, a systematic review of the literature on the application of connectionist models to operational data is presented. Lessons were then taken forward in the development of a connectionist model suitable for the present data, which comprised fields from the Offender Assessment System (OASys) (Home Office, 2002) relating to 4,048 offenders subject to probation supervision. Included in the items for modelling was the Offender Group Reconviction Scale (OGRS) (Copas & Marshall, 1998; Taylor, 1999). Combining static and dynamic items using conventional statistical methods showed a maximum cross-validated AUC of 0.82. Using the connectionist model, however, a substantial increase in accuracy was observed (AUC = 0.98), and this was largely maintained when variations in time to recidivism were controlled. Varying the model parameters suggested that performance was linked to the resources in the middle layer, which are responsible for modelling rare patterns and interactions between items. Model pruning confirmed that while the connectionist model exploited a wide range of variables in its classification decisions, the linear model was affected mainly by OGRS and a limited number of other variables. Results are discussed in terms of the theoretical and practical benefits of developing the use of connectionist models for better incorporating individuals' dynamic risk and protective factors in recidivism assessments, and reducing the costs associated with false classifications.
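The comparison at the heart of this abstract, a linear main-effects model versus a connectionist (multilayer) model whose hidden layer can represent interactions, is easy to reproduce in outline. The sketch below uses scikit-learn with synthetic data standing in for the OASys items, since the offender data are not public; the AUC values it prints will not match the thesis's 0.82 versus 0.98, and the held-out split is a simplification of full cross-validation.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for OASys-style assessment items.
X, y = make_classification(n_samples=4000, n_features=20, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Linear baseline (main effects only) versus a connectionist model whose
# hidden layer can capture interactions between predictors automatically.
linear = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

for name, model in [("logistic", linear), ("MLP", mlp)]:
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: held-out AUC = {auc:.3f}")
```

The hidden-layer size plays the role of the "resources in the middle layer" that the thesis found to drive performance.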
APA, Harvard, Vancouver, ISO, and other styles
28

Bolajoko, Muhammad-Bashir. "Predictive modelling of ovine haemonchosis risk based on the effects of climate on the free-living stages of H. contortus." Thesis, University of Bristol, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.680138.

Full text
Abstract:
The gastrointestinal nematode parasite Haemonchus contortus is responsible for substantial global disease and production loss in small ruminants. These losses may be exacerbated by climate change, and by increasingly widespread resistance to anthelmintics. Development of successful integrated-sustainable parasite control (IPC) requires adequate knowledge of how climate drives the population dynamics of H. contortus and the seasonal occurrence of ovine haemonchosis. Therefore a simple model based on climate that is able to predict future challenge and risk resulting from H. contortus could greatly contribute to sustainable control. This thesis set out to develop a simple, universally applicable and useful model of H. contortus transmission to sheep, with a focus on the effects that changes in climate have on the availability of infective larvae. The study aimed to (i) analyse and predict the effect(s) that changes in climate will have on H. contortus infection pressure in sheep across different geo-climatic zones; (ii) map the risk and geographical distribution of H. contortus infection pressure over time; and (iii) provide farmers with useful information on risk of infection to make cost-effective farm management decisions for sustainable control of H. contortus. South Africa and the United Kingdom are the study locations. Time series analysis (TSA) was first used, as a purely statistical approach and starting point, to assess the seasonal forcing influence that climate (rainfall and temperature) has on the pattern and incidence of haemonchosis. This aimed to find out whether a statistical approach can identify valid climatic predictors of the risk of haemonchosis across different geo-climatic zones. Thereafter, a second model, based on the basic reproduction quotient (Q0), which is a process-based mechanistic approach, was employed to understand the effects of changes in climate on the population dynamics (i.e. transmission potential) of the free-living stages, and to predict H. contortus infection pressure across different geo-climatic zones. The model tries to replicate and summarize the underlying mechanisms that drive the response of parasite populations to changes in prevailing climatic variables. Finally, the use of the Q0 model as a decision support tool on farms was assessed by comparing predictions to observed faecal egg counts in south-west England over the course of a grazing season. Results suggest that TSA is able to predict the relationship between prevailing climatic conditions and the incidence of haemonchosis in a given area. However, the climatic predictors and best-fit model were not transferable across different geo-climatic zones. Local data are needed in order to estimate coefficients for climatic predictors, such that extrapolation beyond the observed range becomes problematic and cumbersome. The Q0 model, on the other hand, was able to capture the effects of seasonal variation in the prevailing climate on the pattern and incidence of haemonchosis across different geo-climatic locations. The model was spatially extended within a geographic information system (GIS) to produce Q0-based haemonchosis risk maps. The risk maps display the capability of Q0 as a spatial predictor of haemonchosis risk across different geo-climatic zones over time. These risk maps have potential as spatial platforms for decision support systems, in support of integrated, sustainable control of H. contortus.
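To give the flavour of a climate-driven transmission index like Q0, the toy sketch below combines a temperature-dependent development term with a rainfall-dependent survival term into a relative monthly risk index. The functional forms, the 9-25 °C development window and the 50 mm moisture saturation are all invented for illustration; they are not the parameterisation used in the thesis.

```python
import numpy as np

def q0_index(temp_c, rain_mm):
    """Toy relative transmission index for the free-living stages:
    development success rises with temperature between thresholds and
    requires moisture. Purely illustrative functional forms."""
    dev = np.clip((temp_c - 9.0) / (25.0 - 9.0), 0.0, 1.0)   # development rate
    moisture = np.clip(rain_mm / 50.0, 0.0, 1.0)             # larval survival
    return dev * moisture

# Monthly climate for a hypothetical temperate site (deg C, mm of rain).
temps = np.array([4, 5, 8, 11, 14, 17, 19, 19, 16, 12, 8, 5])
rain = np.array([80, 60, 55, 50, 45, 40, 35, 45, 55, 70, 85, 90])
monthly_risk = q0_index(temps, rain)
print("peak-risk month:", int(monthly_risk.argmax()) + 1)
```

Run per grid cell over gridded climate data, an index of this kind yields the GIS risk maps described in the abstract.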
APA, Harvard, Vancouver, ISO, and other styles
29

Jenkins, Toni E. "Introgression of genes from rape to wild turnip." Lincoln University, 2005. http://hdl.handle.net/10182/1844.

Full text
Abstract:
Introgression of genes from crops into ruderal populations is a multi-step process requiring sympatry, synchronous flowering, chromosomal compatibility, successful pollination and development of the zygote, germination, establishment and reproduction of hybrid progeny. The goal of this thesis was to generate data on as many steps in this process as possible and integrate them into a predictive statistical model to estimate the likelihood of successful introgression under a range of scenarios. Rape (Brassica napus) and wild turnip (B. rapa var. oleifera) were used as a model system. A homozygous dominant mutation in the rape genome conferring herbicide resistance provided a convenient marker for the study of introgression. Potential differences between wild turnip populations from a wide range of geographic locations in New Zealand were examined. Hand pollination established the genetic compatibility of rape and wild turnip and a high potential for gene introgression from rape to wild turnip. Interspecific hybrids were easily generated using wild turnip as the maternal plant, with some minor differences between wild turnip populations. The frequency of successful hybridisation between the two species was higher on the lower raceme. However, the upper raceme produced more dormant interspecific hybrid seed. Field trials, designed to imitate rare rape crop escapes into the ruderal environment, examined the ability of rare rape plants to pollinate wild turnip plants over four summers. At a ratio of 1 rape plant for every 400 wild turnip plants, the incidence of interspecific hybridisation was consistently low (<0.1 to 2.1 % of total seed on wild turnip plants). There was a significant year effect, with the first season producing significantly more seed and a greater frequency of interspecific hybrid progeny than the other years. The frequency of interspecific hybrid progeny increased when the ratio of rape to wild turnip plants increased. The relative importance of anemophily and entomophily in the production of interspecific hybrids was examined. Wild turnip plants produced twice as many seeds with bee pollination relative to wind pollination. However, the frequency of interspecific hybrids under wind pollination was nearly twice that for bee pollination. Light reflectance patterns under UV light revealed a marked difference between wild turnip and rape flowers, compared to their near-identical appearance under visible light. The data indicate that bees are able to distinguish between rape and wild turnip flowers and exhibit floral constancy when foraging among populations with these two species. Hybrid survival in the seed bank, germination and seedling establishment in the field are important components of fitness. Seed banks established in the soil after the field trials described above germinated in subsequent spring seasons. The predominantly brassica weed populations were screened for herbicide resistance and the numbers of interspecific hybrids germinating compared to the original frequency in the field trial results. The frequency of interspecific hybrids was reduced in the populations compared to the original seed deposit. Seed with a known frequency of interspecific hybrid seed was sown in a separate trial, and the frequency of interspecific hybrids compared at 0, 4, 6, and 8 weeks after sowing. Poor germination resulted in limited competition between seedlings; however, the frequency of interspecific hybrids declined over time, indicating low plant fitness.
There were no significant population effects on any parameters tested. Interspecific hybrids grown in a glasshouse were backcrossed to the parental species and selfed within the plant and within populations. Pollen from the interspecific hybrids was found to have markedly reduced fertility. Interspecific hybrid plants had low female fertility, with the majority (88%) of the pollinated flowers aborting the siliques. Of the remaining siliques, most (98%) had only one to three seeds per silique. Inheritance of the herbicide resistance gene was regular in backcrosses but highly skewed following self-pollination, with an excess of herbicide-sensitive progeny. A stochastic predictive model integrated the information acquired during the practical work phase of this thesis, utilising the capabilities of @RISK, a risk analysis tool. The three outputs examined were the numbers of flowering plants resulting from backcrosses to rape and wild turnip and from self-pollination of the interspecific hybrid progeny. Five scenarios were modelled and all demonstrated the high likelihood of introgression failure in this system. In all scenarios, >75% of simulations resulted in no interspecific hybrid progeny surviving to flowering in the third generation. In all scenarios, and for all three outputs, the seed set on the interspecific hybrids of the second generation was the major factor that limited the number of interspecific hybrid progeny surviving to flowering in the third generation.
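The thesis built its stochastic model in @RISK; the sketch below is a minimal open-source analogue in NumPy of the same idea: chain the introgression steps as random draws and count the runs in which no hybrid reaches flowering. Every probability and count here is an illustrative placeholder, not a value estimated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_introgression(n_runs=10_000):
    """Toy Monte Carlo chain over the introgression steps; all rates are
    invented placeholders standing in for the thesis's fitted inputs."""
    surviving = np.zeros(n_runs)
    for i in range(n_runs):
        # 400 wild turnip plants x ~100 seeds each at a 0.1% hybridisation rate
        f1_seeds = rng.binomial(400 * 100, 0.001)
        germinated = rng.binomial(f1_seeds, 0.3)     # germination/establishment
        # Low F1 female fertility: most siliques abort, few seeds per silique.
        f2_seeds = rng.binomial(germinated * 10, 0.02)
        surviving[i] = rng.binomial(f2_seeds, 0.1)   # F2 reaching flowering
    return np.mean(surviving == 0)

print(f"share of runs with no flowering hybrids: {simulate_introgression():.1%}")
```

With rates of this order, most runs end with zero third-generation hybrids, mirroring the >75% introgression-failure result reported above.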
APA, Harvard, Vancouver, ISO, and other styles
30

Holm, Hannes. "A Framework and Calculation Engine for Modeling and Predicting the Cyber Security of Enterprise Architectures." Doctoral thesis, KTH, Industriella informations- och styrsystem, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-140525.

Full text
Abstract:
Information Technology (IT) is a cornerstone of our modern society and essential for governments' management of public services, economic growth and national security. Consequently, it is of importance that IT systems are kept in a dependable and secure state. Unfortunately, as modern IT systems typically are composed of numerous interconnected components, including personnel and processes that use or support them (often referred to as an enterprise architecture), this is not a simple endeavor. To make matters worse, there are malicious actors who seek to exploit vulnerabilities in the enterprise architecture to conduct unauthorized activity within it. Various models have been proposed by academia and industry to identify and mitigate vulnerabilities in enterprise architectures; however, so far none has provided a sufficiently comprehensive scope. The contribution of this thesis is a modeling framework and calculation engine that can be used as support by enterprise decision makers in regard to cyber security matters, e.g., chief information security officers. In summary, the contribution can be used to model and analyze the vulnerability of enterprise architectures, and provide mitigation suggestions based on the resulting estimates. The contribution has been tested in real-world cases and has been validated on both a component level and system level; the results of these studies show that it is adequate in terms of supporting enterprise decision making. This thesis is a composite thesis of eight papers. Paper 1 describes a method and dataset that can be used to validate the contribution described in this thesis and models similar to it. Paper 2 presents which statistical distributions are best fits for modeling the time required to compromise computer systems. Paper 3 describes estimates of the effort required to discover novel web application vulnerabilities. Paper 4 describes estimates of the possibility of circumventing web application firewalls. Paper 5 describes a study of the time required by an attacker to obtain critical vulnerabilities and exploits for compiled software. Paper 6 presents the effectiveness of seven commonly used automated network vulnerability scanners. Paper 7 describes the ability of the signature-based intrusion detection system Snort to detect attacks that are newer, or older, than its rule set. Finally, paper 8 describes a tool that can be used to estimate the vulnerability of enterprise architectures; this tool is founded upon the results presented in papers 1-7.
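Paper 2's question, which statistical family best fits time-to-compromise data, is typically answered by fitting candidate distributions and comparing them by an information criterion. The sketch below does exactly that with SciPy on synthetic data (the thesis's empirical measurements are not reproduced here); the simple AIC uses the full fitted parameter count, which is a common approximation.

```python
import numpy as np
from scipy import stats

# Synthetic times-to-compromise (hours), standing in for empirical data.
rng = np.random.default_rng(3)
ttc = rng.lognormal(mean=2.0, sigma=0.8, size=200)

# Fit candidate distributions and compare by AIC (lower is better).
candidates = {"lognormal": stats.lognorm, "gamma": stats.gamma,
              "weibull": stats.weibull_min, "exponential": stats.expon}
for name, dist in candidates.items():
    params = dist.fit(ttc)
    loglik = np.sum(dist.logpdf(ttc, *params))
    aic = 2 * len(params) - 2 * loglik
    print(f"{name:12s} AIC = {aic:.1f}")
```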
APA, Harvard, Vancouver, ISO, and other styles
31

Woolston, C. P. "A method of increasing capacity of an electricity distribution network through predictive modelling and intelligent protection." Thesis, Queensland University of Technology, 2000. https://eprints.qut.edu.au/36121/1/36121_Woolston_2000.pdf.

Full text
Abstract:
This thesis describes the development of a software system which was implemented by the South East Queensland Electricity Board, Australia (SEQEB). The software uses temperature probes and mathematical modelling to continuously determine the maximum current-carrying capacity of underground feeders. It identifies other plant capable of being modelled (such as transformers) and develops a higher-level software architecture with which models may be integrated and automated overload verification and plant protection may be achieved. The usefulness of this software is that it permits higher currents than usual to be conveyed along cables, allowing SEQEB to install less cabling while still providing the same amount of power, achieving cost benefits. The new software runs over the existing Supervisory Control and Data Acquisition (SCADA) system and, because of this, the thesis carries out an appraisal of several commercial SCADA products. The thesis describes the development of the cable-modelling algorithm and the higher-level software architecture, and describes the design, development and commissioning of this software within the South East Queensland Electricity Board's SCADA system.
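The gain from dynamic rating can be seen in a drastically simplified steady-state heat balance: the temperature rise of the conductor is I²·R_ac times the cable-to-soil thermal resistance, so a measured soil temperature below the design assumption raises the allowable current. The sketch below shows only this single-loss balance; the thesis's model also handles dielectric and sheath losses and thermal capacitance, and all parameter values here are illustrative.

```python
import math

def max_current(t_conductor_max, t_soil, r_ac, thermal_resistance):
    """Steady-state ampacity from a single-loss heat balance:
    (T_max - T_soil) = I^2 * R_ac * T_thermal.
    r_ac in ohm/m, thermal_resistance in K.m/W."""
    return math.sqrt((t_conductor_max - t_soil) / (r_ac * thermal_resistance))

# Illustrative parameters: 90 C insulation limit, measured soil at 18 C,
# AC resistance 6e-5 ohm/m, total thermal resistance 1.2 K.m/W.
i_max = max_current(90.0, 18.0, 6e-5, 1.2)
print(f"continuous rating ~ {i_max:.0f} A")

# A cooler measured soil (probe data) directly increases the rating:
print(f"rating at 10 C soil ~ {max_current(90.0, 10.0, 6e-5, 1.2):.0f} A")
```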
APA, Harvard, Vancouver, ISO, and other styles
32

Ning, Shuluo. "Bayesian Degradation Analysis Considering Competing Risks and Residual-Life Prediction for Two-Phase Degradation." Ohio University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1339559200.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Steinmetz, Fabian. "Integration of data quality, kinetics and mechanistic modelling into toxicological assessment of cosmetic ingredients." Thesis, Liverpool John Moores University, 2016. http://researchonline.ljmu.ac.uk/4522/.

Full text
Abstract:
In our modern society we are exposed to many natural and synthetic chemicals. The assessment of chemicals with regard to human safety is difficult but nevertheless of high importance. Besides clinical studies, which are restricted to potential pharmaceuticals only, most toxicity data relevant for regulatory decision-making are based on in vivo data. Due to the ban on animal testing of cosmetic ingredients in the European Union, alternative approaches, such as in vitro and in silico tests, have become more prevalent. In this thesis existing non-testing approaches (i.e. studies without additional experiments), e.g. QSAR models, have been extended, and new non-testing approaches, e.g. structural alert systems supported by in vitro data, have been created. The main focus of the thesis is the determination of data quality, the improvement of modelling performance, and the support of Adverse Outcome Pathways (AOPs) with definitions of structural alerts and physico-chemical properties. Furthermore, there was a clear focus on the transparency of models, i.e. approaches using algorithmic feature selection, machine learning etc. have been avoided, and structural alert systems have been written in an understandable and transparent manner. Besides the methodological aspects of this work, cosmetically relevant examples of models have been chosen, e.g. skin penetration and hepatic steatosis. Interpretations of the models, as well as possibilities for adjustment and extension, have been discussed thoroughly. As models usually do not depict reality flawlessly, consensus approaches combining various non-testing approaches and in vitro tests should be used to support decision-making in the regulatory context. For example, within read-across it is feasible to use supporting information from QSAR models, docking, in vitro tests etc. By applying a variety of models, results should lead to conclusions that are more usable and acceptable within toxicology. Within this thesis (and associated publications) novel methodologies have been described for assessing and employing statistical data quality and for screening potential liver toxicants. Furthermore, computational tools, such as models for skin permeability and dermal absorption, have been created.
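A structural alert system of the transparent kind described above amounts to a dictionary of substructure patterns matched against each molecule. The sketch below shows the mechanics with RDKit; the single SMARTS alert (an aromatic nitro group) is a generic textbook example, not an alert defined in the thesis's hepatic-steatosis system.

```python
from rdkit import Chem

# Toy structural-alert screen: flag molecules containing an aromatic nitro
# group. Illustrative pattern only, not a thesis-defined alert.
ALERTS = {"aromatic nitro": Chem.MolFromSmarts("c[N+](=O)[O-]")}

def screen(smiles):
    """Return the names of all alerts whose substructure is present."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return ["invalid SMILES"]
    return [name for name, patt in ALERTS.items() if mol.HasSubstructMatch(patt)]

print(screen("c1ccccc1[N+](=O)[O-]"))  # nitrobenzene -> ['aromatic nitro']
print(screen("CCO"))                   # ethanol      -> []
```

Because each alert is an explicit, human-readable pattern, the system stays interpretable, which is exactly the transparency argument the thesis makes against opaque feature-selected models.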
APA, Harvard, Vancouver, ISO, and other styles
34

Stephen, Jacqueline. "Statistical modelling of biomarkers incorporating non-proportional effects for survival data : with illustration by application to two residual risk models for predicting risk in early breast cancer." Thesis, University of Edinburgh, 2016. http://hdl.handle.net/1842/23390.

Full text
Abstract:
Personalised medicine is replacing the one-drug-fits-all approach, with many prognostic models incorporating biomarkers now available for risk-stratifying patients. Evidence has been emerging that the effects of biomarkers change over time and therefore violate the assumption of proportional hazards when performing Cox regression. Analysis using the Cox model when its assumptions are invalid can result in misleading conclusions. This thesis reviews existing approaches for the analysis of non-proportional effects with respect to survival data. A number of well-developed approaches were identified, but to date their uptake in practice has been limited. There is a need for more widespread use of flexible modelling, to move away from standard analysis using a Cox model when the assumption of proportional hazards is violated. Two novel approaches were applied to investigate the impact of follow-up duration on two residual risk models, IHC4 and Mammostrat, for predicting risk in early breast cancers, using two studies with different lengths of follow-up: the Edinburgh Breast Conservation Series (BCS) and the Tamoxifen versus Exemestane Adjuvant Multinational (TEAM) trial. Similar results were observed between the two approaches that were considered, the multivariable fractional polynomial time (MFPT) approach and Royston-Parmar flexible parametric models, with their respective advantages and disadvantages being discussed. The analyses identified a strong time-varying effect of the IHC4 score, with its prognostic effect on time to distant recurrence decreasing with increasing follow-up time. The Mammostrat score identified a group of patients with an increased risk of distant recurrence over full follow-up in the TEAM and Edinburgh BCS cohorts. The results suggest a combined IHC4 and Mammostrat risk score could provide information on the risk of recurrence and warrants further study.
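The first diagnostic step in any such analysis is testing the proportional-hazards assumption, usually via scaled Schoenfeld residuals. The sketch below shows that workflow with the Python `lifelines` package on synthetic data standing in for a residual-risk score (the BCS and TEAM data are not public); the simulated effect sizes are arbitrary, and the point is the procedure, not the numbers.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for a prognostic score and time-to-distant-recurrence.
rng = np.random.default_rng(0)
n = 2000
score = rng.normal(size=n)
time = rng.exponential(scale=1.0 / np.exp(0.5 * score), size=n)
event = rng.random(n) < 0.7
df = pd.DataFrame({"time": time, "event": event, "score": score})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()

# Scaled Schoenfeld residual tests: a significant result flags a
# time-varying (non-proportional) effect, as was found for the IHC4 score.
cph.check_assumptions(df, p_value_threshold=0.05)
```

When the test flags non-proportionality, the remedies the thesis compares (MFPT and Royston-Parmar flexible parametric models) let the log-hazard ratio itself vary as a smooth function of follow-up time.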
APA, Harvard, Vancouver, ISO, and other styles
35

Schaberreiter, T. (Thomas). "A Bayesian network based on-line risk prediction framework for interdependent critical infrastructures." Doctoral thesis, Oulun yliopisto, 2013. http://urn.fi/urn:isbn:9789526202129.

Full text
Abstract:
Critical Infrastructures (CIs) are an integral part of our society and economy. Services like electricity supply or telecommunication services are expected to be available at all times, and a service failure may have catastrophic consequences for society or the economy. Current CI protection strategies date from a time when CIs or CI sectors could be operated more or less self-sufficiently, and interconnections among CIs or CI sectors, which may lead to cascading service failures to other CIs or CI sectors, were not as omnipresent as today. In this PhD thesis, a cross-sector CI model for on-line risk monitoring of CI services, called the CI security model, is presented. The model makes it possible to monitor CI service risk and to notify dependent services of possible risks, in order to reduce and mitigate possible cascading failures. The model estimates CI service risk by observing the CI service state as measured by base measurements (e.g. sensor or software states) within the CI service components, and by observing the experienced service risk of the CI services it depends on (CI service dependencies). CI service risk is estimated in a probabilistic way using a Bayesian network based approach. Furthermore, the model allows CI service risk prediction in the short-term, mid-term and long-term future, given the current CI service risk, and it allows interdependencies (a CI service risk that loops back to the originating service via dependencies) to be modelled, a special case that is difficult to model using Bayesian networks. The representation of a CI as a CI security model requires analysis. In this PhD thesis, a CI analysis method based on the PROTOS-MATINE dependency analysis methodology is presented in order to analyse CIs and represent them as CI services, CI service dependencies and base measurements. Additional research presented in this PhD thesis relates to a study of assurance indicators able to perform an on-line evaluation of the correctness of risk estimates within a CI service, as well as of risk estimates received from dependencies. A tool that supports all steps of establishing a CI security model was implemented during this PhD research. The research on the CI security model and the assurance indicators was validated in a case study, and the initial results suggest its applicability to CI environments.
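The core estimation step, combining a base measurement with an upstream dependency's risk in a Bayesian network, can be sketched in a few lines with `pgmpy`. The network structure below mirrors the idea in the abstract, but every conditional probability is an illustrative placeholder, not a calibrated value from the thesis.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy CI-service risk model: a base measurement and an upstream dependency's
# risk jointly determine this service's risk.
model = BayesianNetwork([("sensor_alarm", "service_risk"),
                         ("upstream_risk", "service_risk")])

cpd_sensor = TabularCPD("sensor_alarm", 2, [[0.9], [0.1]])      # ok / alarm
cpd_upstream = TabularCPD("upstream_risk", 2, [[0.8], [0.2]])   # low / high
cpd_service = TabularCPD(
    "service_risk", 2,
    # P(low risk), P(high risk) for each (sensor, upstream) combination
    [[0.95, 0.6, 0.5, 0.1],
     [0.05, 0.4, 0.5, 0.9]],
    evidence=["sensor_alarm", "upstream_risk"], evidence_card=[2, 2])

model.add_cpds(cpd_sensor, cpd_upstream, cpd_service)

# Posterior service risk once an alarm is observed in a base measurement.
posterior = VariableElimination(model).query(
    ["service_risk"], evidence={"sensor_alarm": 1})
print(posterior)
```

Plain Bayesian networks are acyclic, which is why the interdependency case described above (risk looping back to the originating service) needs the special treatment the thesis develops.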
APA, Harvard, Vancouver, ISO, and other styles
36

Přichystalová, Veronika. "Vícekriteriální optimalizace podniku pomocí trendu." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2015. http://www.nusl.cz/ntk/nusl-225276.

Full text
Abstract:
This thesis analyzes the impact of political risk on investment decision-making for large investment projects. The failure of investors in the field of large investment projects is in the vast majority of cases caused by politico-social factors, whose quantification is extremely difficult. Political risk affects economic conditions and the stability of the environment; therefore, knowledge of its development is essential for proper investment decisions. Its prediction at a quantitative level is problematic. The method used, qualitative modelling, falls within the field of artificial intelligence and is used to model trends. The work describes the process of creating a qualitative model, its interpretation, and recommendations for its use in investment decisions.
APA, Harvard, Vancouver, ISO, and other styles
37

Evans, Ben Richard. "Data-driven prediction of saltmarsh morphodynamics." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/276823.

Full text
Abstract:
Saltmarshes provide a diverse range of ecosystem services and are protected under a number of international designations. Nevertheless they are generally declining in extent in the United Kingdom and North West Europe. The drivers of this decline are complex and poorly understood. When considering mitigation and management for future ecosystem service provision it will be important to understand why, where, and to what extent decline is likely to occur. Few studies have attempted to forecast saltmarsh morphodynamics at a system level over decadal time scales. There is no synthesis of existing knowledge available for specific site predictions nor is there a formalised framework for individual site assessment and management. This project evaluates the extent to which machine learning model approaches (boosted regression trees, neural networks and Bayesian networks) can facilitate synthesis of information and prediction of decadal-scale morphological tendencies of saltmarshes. Importantly, data-driven predictions are independent of the assumptions underlying physically-based models, and therefore offer an additional opportunity to cross-validate between two paradigms. Marsh margins and interiors are both considered but are treated separately since they are regarded as being sensitive to different process suites. The study therefore identifies factors likely to control morphological trajectories and develops geospatial methodologies to derive proxy measures relating to controls or processes. These metrics are developed at a high spatial density in the order of tens of metres, allowing for the resolution of fine-scale behavioural differences. Conventional statistical approaches, as have been previously adopted, are applied to the dataset to assess consistency with previous findings, with some agreement being found. The data are subsequently used to train and compare three types of machine learning model. Boosted regression trees outperform the other two methods in this context. The resulting models are able to explain more than 95% of the variance in marginal changes and 91% for internal dynamics. Models are selected based on validation performance and are then queried with realistic future scenarios that represent altered input conditions which may arise as a consequence of future environmental change. Responses to these scenarios are evaluated, suggesting system sensitivity to all scenarios tested and offering a high degree of spatial detail in responses. While mechanistic interpretation of some responses is challenging, process-based justifications are offered for many of the observed behaviours, providing confidence that the results are realistic. The work demonstrates a potentially powerful alternative (and complement) to current morphodynamic models that can be applied over large areas with relative ease, compared to numerical implementations. Powerful analyses with broad scope are now available to the field of coastal geomorphology through the combination of spatial data streams and machine learning. Such methods are shown to be of great potential value in support of applied management and monitoring interventions.
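Boosted regression trees, the best performer in the comparison above, are available off the shelf in scikit-learn. The sketch below shows the train-validate-interpret loop on synthetic data standing in for the geospatial proxy metrics (fetch, elevation, sediment supply and so on); the validation R² it prints is a property of the synthetic data, not the thesis's 95%/91% figures.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the per-cell geospatial predictor metrics.
X, y = make_regression(n_samples=5000, n_features=15, noise=10.0,
                       random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Boosted regression trees predicting decadal morphological change per cell.
brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05,
                                max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"validation R^2: {r2_score(y_te, brt.predict(X_te)):.3f}")

# Relative influence of each predictor, used to interpret the drivers of
# marginal change before querying the model with future scenarios.
print(brt.feature_importances_.round(3))
```

Scenario querying then amounts to perturbing the relevant input columns (for example, raising water levels) and re-running `brt.predict`.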
APA, Harvard, Vancouver, ISO, and other styles
38

Getley, Ian L. Department of Aviation Faculty of Science UNSW. "Cosmic and solar radiation monitoring of Australian commercial flight crew at high southern latitudes as measured and compared to predictive computer modelling." Awarded by: University of New South Wales, 2007. http://handle.unsw.edu.au/1959.4/40536.

Full text
Abstract:
This study set out to examine the levels of galactic cosmic radiation exposure of Australian aircrew during routine flight operations, with particular attention to the high southern latitude flights between Australia and South Africa. Latitudes as high as 65° South were flown to gain the data; these are typical of the normal flight routes flown between Sydney and Johannesburg on a daily basis. In achieving this objective it became evident that suitable commercially available radiation monitoring equipment was not readily available, and scientific radiation monitors were sourced from overseas research facilities to complement my own FH41B and Liulin monitors provided by UNSW. At the same time it became apparent that several predictive codes had been developed to attempt to model the radiation doses received by aircrew based on flight route, latitudes and altitudes. Further, it became apparent that these codes had not been subjected to verification at high southern latitudes and that they had not been validated for the effects of solar particle events. Initially measurements were required at the high latitudes, followed by mid-latitude data to further balance the PCAIRE code to ensure reasonableness of results for both equatorial and high latitudes. Whilst undertaking this study new scientific monitors became available, which provided an opportunity to observe comparative data and results. The Liulin, QDOS and a number of smaller personal dosimeters were subsequently obtained and evaluated. This appears to be the first time that such an extensive cross comparison of these monitors has been conducted over such a wide range of latitudes and altitudes. During the course of this study a fortuitous encounter with GLE 66 enabled several aspects of code validation to be examined, namely the inability of predictive codes to estimate the increased dose associated with a GLE or the effects of a Forbush decrease on the code results. Finally I review the known biological effects as discussed by numerous authors based on current epidemiological studies, with a view to highlighting where the advent of future technology in aviation may project aircrew dose levels.
APA, Harvard, Vancouver, ISO, and other styles
39

Westerlund, Per. "Condition measuring and lifetime modelling of disconnectors, circuit breakers and other electrical power transmission equipment." Doctoral thesis, KTH, Elektroteknisk teori och konstruktion, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-214984.

Full text
Abstract:
The supply of electricity is important in modern society, so outages of the electric grid should be few and short, especially for the transmission grid. A summary of the history of the Swedish electrical system is presented. The objective is to be able to plan maintenance better by following the condition of the equipment. The risk matrix can be used to choose which components to maintain. The risk matrix is improved by adding a dimension, the uncertainty of the probability. The risk can be reduced along any dimension: better measurements, preventive maintenance or more redundancy. The number of dimensions can be reduced to two by following iso-risk lines calculated for the beta distribution. This thesis lists twenty surveys about circuit breakers and disconnectors, with statistics on failures and lifetimes. It also presents about forty condition-measuring methods for circuit breakers and disconnectors, mostly applicable to the electric contacts and the mechanical parts. A method for scheduling thermography based on analysis of variance of the current is tried. Its aim is to reduce the uncertainty of thermography, and it is able to explain two thirds of the variation using the time of day, the day of the week and the week number as explanatory variables. However, the main problem remains, as the current is in general too low. A system with IR sensors has been installed at the nine contacts of six disconnectors with the purpose of avoiding outages for maintenance if the contacts are in a good condition. The measured temperatures are sent by radio and regressed against the square of the current, which was found to be the best exponent. The coefficient of determination R² is high, greater than 0.9. The higher the regression coefficient is, the more heat is produced at the contact, so this ranks the different contacts. Finally, a framework for lifetime modelling and condition measuring is presented. Lifetime modelling consists in associating a distribution of time to failure with each subpopulation. Condition measuring means measuring a parameter and estimating its value in the future; if it exceeds a threshold, maintenance should be carried out. The effect of maintenance of the contacts is shown for four disconnectors. An extension of the risk matrix with uncertainty, a survey of statistics and condition monitoring methods, a system with IR sensors at contacts, a thermography scheduling method and a framework for lifetime modelling and condition measuring are presented. They can improve the planning of outages for maintenance.
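The contact-ranking regression described above is a one-parameter least-squares fit of temperature rise against I². The sketch below reproduces it on synthetic sensor data (the real data arrive by radio from the IR sensors); the "true" coefficient and noise level are invented for illustration.

```python
import numpy as np

# Synthetic contact measurements: temperature rise above ambient (K) and
# load current (A), standing in for the radio-reported IR data.
rng = np.random.default_rng(7)
current = rng.uniform(100, 600, size=500)
k_true = 4e-5                    # K per A^2 -- illustrative healthy contact
temp_rise = k_true * current**2 + rng.normal(0, 0.3, size=500)

# Regress temperature rise on I^2 (exponent 2.0 was the best fit found).
# The slope ranks contacts: a larger coefficient means more contact heating.
X = current**2
k_hat = np.sum(X * temp_rise) / np.sum(X * X)   # least squares through origin
resid = temp_rise - k_hat * X
r2 = 1 - np.sum(resid**2) / np.sum((temp_rise - temp_rise.mean())**2)
print(f"k = {k_hat:.2e} K/A^2, R^2 = {r2:.3f}")
```

Fitting one such coefficient per contact and sorting them gives the maintenance ranking described in the abstract.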
APA, Harvard, Vancouver, ISO, and other styles
40

Lebedžinskaitė, Renata. "Bankroto diagnozavimo įmonėse tyrimas." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2007. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2007~D_20070816_153211-47644.

Full text
Abstract:
Object of work – bankruptcy diagnosis. Aim of work – to create a modified bankruptcy diagnostic model and to check its use in practice on the example of Lithuanian companies. Tasks of work: 1) to investigate bankruptcy factors and identify the causes of bankruptcy; 2) to carry out a theoretical analysis of bankruptcy probability assessment models; 3) to create and test a modified bankruptcy diagnostic model. The research methods: the analysis of scientific literature, logical comparative analysis and synthesis, content and document analysis, the method of generalization, and statistical analysis of companies' financial indicators. Scientific works of various authors on the necessity of bankruptcy diagnosis and the causes of bankruptcy were analysed; bankruptcy probability was investigated in six randomly selected companies using cash-flow analysis and E. I. Altman's models, and a modified bankruptcy diagnostic model was created and examined.
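Altman's original (1968) Z-score, one of the models applied in the thesis, is a fixed linear combination of five financial ratios with conventional cut-offs at 1.81 and 2.99. The sketch below computes it; the balance-sheet figures are illustrative, not data from the six companies analysed.

```python
def altman_z(wc, re, ebit, mve, sales, ta, tl):
    """Original Altman (1968) Z-score for public manufacturing firms.
    wc: working capital, re: retained earnings, ebit: earnings before
    interest and tax, mve: market value of equity, sales: annual sales,
    ta: total assets, tl: total liabilities."""
    x1, x2, x3 = wc / ta, re / ta, ebit / ta
    x4, x5 = mve / tl, sales / ta
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Illustrative balance-sheet figures (thousand EUR), not a thesis company.
z = altman_z(wc=120, re=300, ebit=90, mve=700, sales=1500, ta=1000, tl=450)
zone = "safe" if z > 2.99 else "distress" if z < 1.81 else "grey"
print(f"Z = {z:.2f} ({zone} zone)")
```

A modified model of the kind the thesis develops typically re-estimates the coefficients, or swaps ratios, to suit local accounting data.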
APA, Harvard, Vancouver, ISO, and other styles
41

Elliott, Patrick. "Evaluating Sea-Level Rise Hazards on Coastal Archaeological Sites, Trinity Bay, Texas." Thesis, University of North Texas, 2018. https://digital.library.unt.edu/ark:/67531/metadc1157575/.

Full text
Abstract:
This study uses the predictive modeling program Sea Level Affecting Marshes Model (SLAMM) to evaluate sea-level rise hazards, such as erosion and inundation, on coastal archaeological sites under a vertical sea-level rise of 0.98 meters from 2006 to 2100. In total, 177 archaeological site locations were collected and georeferenced over GIS output maps of wetlands, erosion presence, surface elevation, and accretion. Wetland data provide useful information about the characteristics of the wetland classes, which make a difference in the ability of coastal archaeological sites to withstand sea-level rise. Additionally, the study evaluated predicted erosion of archaeological sites by the presence or absence of active erosion on a cell-by-cell basis. Elevation map outputs relative to mean tide level allowed individual archaeological site datums to be calculated and compared against NOAA tidal databases to identify the potential for inundation. Accretion maps acquired from the SLAMM run determined the potential for the archaeological site locations to keep pace with rising sea levels and potentially provide protection from wave effects. Results show that the most significant hazard predicted to affect coastal archaeological sites is inundation: approximately 54% of the archaeological sites are predicted to be inundated at least half the time by 2100. The hazard of erosion, meanwhile, is expected to affect 33% of all archaeological sites by the end of the century. Although difficult to predict, the study assumes that accretion will not be able to keep pace with sea-level rise. These findings show that SLAMM is a useful tool for predicting potential effects of sea-level rise on coastal archaeological sites. As it is customizable and economical, it is a choice that is adaptable to many scenarios.
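The inundation test at the core of this analysis reduces to comparing each site's datum against a projected high-tide level. The sketch below shows that comparison in pandas; the site names, elevations and the mean-higher-high-water offset are hypothetical, while the 0.98 m rise matches the scenario used in the thesis.

```python
import pandas as pd

# Hypothetical site datums (m above mean tide level); values illustrative.
sites = pd.DataFrame({
    "site": ["41CH1", "41CH2", "41CH3"],
    "elev_mtl_m": [0.40, 1.10, 2.30],
})
SLR_2100 = 0.98      # m of sea-level rise by 2100, the thesis scenario
MHHW_OFFSET = 0.35   # m, hypothetical mean-higher-high-water above MTL

# Flag a site as regularly inundated if the projected high-tide level
# exceeds its datum.
sites["inundated_2100"] = sites["elev_mtl_m"] < (SLR_2100 + MHHW_OFFSET)
print(sites)
```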
APA, Harvard, Vancouver, ISO, and other styles
42

DI NAPOLI, MARIANO. "Spatial prediction of landslide susceptibility/intensity through advanced statistical approaches implementation: applications to the Cinque Terre (Eastern Liguria, Italy)." Doctoral thesis, Università degli studi di Genova, 2022. http://hdl.handle.net/11567/1076506.

Full text
Abstract:
Landslides are frequently responsible for huge economic losses and casualties in mountainous regions, especially nowadays as development expands into unstable hillslope areas under the pressures of increasing population size and urbanization (Di Martire et al. 2012). People are not the only vulnerable targets of landslides: mass movements can easily lay waste to everything in their path, threatening human properties, infrastructures and natural environments. Italy is severely affected by landslide phenomena and is one of the European countries most affected by them. In this framework, Italy is particularly concerned with forecasting landslide effects (Calcaterra et al. 2003b), in compliance with National Law n. 267/98, enforced after the devastating landslide event of Sarno (Campania, Southern Italy). According to the latest report on "hydrogeological instability" by the Superior Institute for Environmental Protection and Research (ISPRA, 2018), the population exposed to landslide risk is more than 5 million, of whom almost half a million fall into very high hazard zones. Slope stability can be compromised by both natural and human-caused changes in the environment. The main natural causes can be summarised as heavy rainfall, earthquakes, rapid snowmelt, slope cutting due to erosion, and variations in groundwater levels, whilst slope steepening through construction, quarrying, building of houses, and farming along the foot of mountainous zones corresponds to the human component. This Ph.D. thesis was carried out in the Liguria region, inside the Cinque Terre National Park. This area was chosen for its abundance of different types of landslides and its geological, geomorphological and urban characteristics. The Cinque Terre area can be considered one of the most representative examples of a human-modified landscape. Starting from the early centuries of the Middle Ages, local farmers have almost completely modified the original slope topography through the construction of dry-stone walls, creating an outstanding terraced coastal landscape (Terranova 1984, 1989; Terranova et al. 2006; Brandolini 2017). This territory is extremely dynamic since it is characterized by a complex geological and geomorphological setting, where many surficial geomorphic processes coexist, along with peculiar weather conditions (Cevasco et al. 2015). For this reason, part of this research focused on analyzing the disaster that hit the Cinque Terre on October 25th, 2011. Multiple landslides took place on this occasion, with hundreds of shallow landslides triggered almost simultaneously in the time-lapse of 5-6 hours, causing 13 victims and severe structural and economic damage (Cevasco et al. 2012; D'Amato Avanzi et al. 2013). Moreover, this artificial landscape has experienced important land-use changes over the last century (Cevasco et al. 2014; Brandolini 2017), mostly related to the abandonment of agricultural activity. It is known that terraced landscapes, when no longer properly maintained, become more prone to erosion processes and mass movements (Lesschen et al. 2008; Brandolini et al. 2018a; Moreno-de-las-Heras et al. 2019; Seeger et al. 2019). Within the context of slope instability, the international community has been focusing for the last decade on recognising the landslide susceptibility/hazard of a given area of interest.
Landslide susceptibility predicts "where" landslides are likely to occur, whereas landslide hazard also evaluates future spatial and temporal mass movement occurrence (Guzzetti et al., 1999), although the two definitions are often, incorrectly, used interchangeably. Such a recognition phase becomes crucial for land-use planning activities aimed at the protection of people and infrastructures: only with proper risk assessment can governments, regional institutions, and municipalities prepare the appropriate countermeasures at different scales. Thus, landslide susceptibility is the keystone of a long chain of procedures that are actively implemented to manage landslide risk at all levels, especially in vulnerable areas such as Liguria. The methods implemented in this dissertation have the overall objective of evaluating advanced algorithms for modeling landslide susceptibility. The thesis has been structured in six chapters. The first chapter introduces and motivates the work conducted in the three years of the project, including information about the research objectives. The second chapter gives the basic concepts related to landslides (definition, classification and causes, landslide inventory), along with the derived products: susceptibility, hazard and risk zoning, with particular attention to the evaluation of landslide susceptibility. The objective of the third chapter is to define the different methodologies, algorithms and procedures applied during the research activity. The fourth chapter deals with the geographical, geological and geomorphological features of the study area. The fifth chapter provides information about the results of the methodologies applied to the study area: Machine Learning algorithms, a runout method and a Bayesian approach. Critical discussions of the outcomes obtained are also included. The sixth chapter presents the discussion and conclusions of this research, critically analysing the role of this work in the general panorama of the scientific community and illustrating possible future perspectives.
APA, Harvard, Vancouver, ISO, and other styles
43

Chabeau, Lucas. "Développement et validation d’un outil multivarié de prédiction dynamique d’un échec de greffe rénale." Electronic Thesis or Diss., Nantes Université, 2024. http://www.theses.fr/2024NANU1033.

Full text
Abstract:
For many chronic diseases, dynamic prediction of a clinical event of interest can be useful in personalised medicine. In this context, prognoses can be updated throughout the patient's follow-up, as new information becomes available. This CIFRE doctoral thesis, in collaboration with Sêmeia, involves developing and validating a dynamic prediction tool for kidney graft failure. The proposed tool predicts kidney graft failure, in competition with death with a functioning graft, over a five-year time horizon. The prediction is based on information available at patient inclusion and three biological markers collected during follow-up (serum creatinine, proteinuria and type II donor-specific antibodies), allowing the prognosis to be updated. This tool has been validated for prediction times of between 1 and 6 years post-transplant. This doctoral thesis was the subject of three original projects. The first involved developing an inference procedure to estimate a joint model for longitudinal and survival data compatible with the prediction tool. Secondly, we examined the heterogeneity in the definition of the prediction horizon in the dynamic prediction literature. Finally, we present the construction and validation of a dynamic prediction model for renal transplant failure. The model showed good discrimination and calibration.
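A full joint model is beyond a short sketch, but a landmark analysis conveys the flavour of dynamic prediction: condition on patients still event-free at a landmark time, use the biomarker value observed at that time, and predict over a fixed horizon. The sketch below uses `lifelines`; all data, column names and effect sizes are invented, and the landmark approach is a simpler cousin of the joint model actually developed in the thesis (it also ignores the competing risk of death).

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the transplant cohort.
rng = np.random.default_rng(5)
n = 3000
creatinine = rng.lognormal(4.7, 0.3, size=n)            # at the landmark visit
time = rng.exponential(scale=8.0 / np.log(creatinine), size=n)
event = rng.random(n) < 0.6
df = pd.DataFrame({"time": time, "event": event,
                   "log_creat": np.log(creatinine)})

LANDMARK, HORIZON = 1.0, 5.0                            # years post-transplant
risk_set = df[df["time"] > LANDMARK].copy()             # event-free at 1 year
risk_set["time"] = (risk_set["time"] - LANDMARK).clip(upper=HORIZON)
risk_set["event"] = risk_set["event"] & (risk_set["time"] < HORIZON)

cph = CoxPHFitter().fit(risk_set, duration_col="time", event_col="event")
# 5-year failure probability for a new patient surviving to the landmark.
p_fail = 1 - cph.predict_survival_function(
    pd.DataFrame({"log_creat": [5.3]}), times=[HORIZON]).iloc[0, 0]
print(f"predicted 5-year failure risk: {p_fail:.1%}")
```

Refitting at successive landmarks is what lets the prognosis be updated as each new biomarker value arrives.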
APA, Harvard, Vancouver, ISO, and other styles
44

Weber, Denis [Verfasser]. "Measuring and predicting the effects of time-variable exposure of pesticides on populations of green algae : combination of flow-through studies and ecological modelling as an innovative tool for refined risk assessments / Denis Weber." Aachen : Hochschulbibliothek der Rheinisch-Westfälischen Technischen Hochschule Aachen, 2013. http://d-nb.info/103111565X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

DUGATO, MARCO. "L'INTERAZIONE TRA LE CARATTERISTICHE DEI QUARTIERI E L'AMBIENTE FISICO NELLA DETERMINAZIONE DELLA VULNERABILITÀ AL CRIMINE NEI MICROLUOGHI. PROVE EMPIRICHE DA UNA VALUTAZIONE SPAZIALE MULTILIVELLO DEL RISCHIO DI CRIMINALITÀ A MILANO, IT E IZTAPALAPA, MX." Doctoral thesis, Università Cattolica del Sacro Cuore, 2021. http://hdl.handle.net/10280/98602.

Full text
Abstract:
Several theories focus on the links between crime and specific characteristics of places and communities. However, only a few applied studies explicitly argue that contextual factors may combine in determining crime risk and that their criminogenic influences may operate at different geographical scales. This study investigates how certain features of the urban landscape (micro level) interact with each other, and with the demographic, economic, and social characteristics of the surrounding neighbourhoods (meso level), to determine spatial vulnerability to crime and, ultimately, the likelihood of a criminal event. The study conducts a spatial crime risk assessment for robberies and violent crimes in two large urban areas: Milan, Italy, and Iztapalapa, Mexico. The case studies cover two very different countries, which allows both the influence of broader contextual effects (macro level) to be assessed and certain theoretical assumptions to be tested outside the Anglo-Saxon context. The analysis is grounded in the Risk Terrain Modeling approach but, in contrast to previous applications, relies on a multilevel regression model including interaction terms. The study also proposes innovative methods for displaying and communicating its findings. Overall, the results demonstrate that contextual factors measured at different geographical scales interact significantly to determine crime risk. This finding suggests combining inputs from different theories in order to understand the dynamics behind crime occurrence. Furthermore, the proposed method generally allows future crime locations to be predicted more accurately and enables the generation of more precise risk narratives to inform policies and interventions.
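A minimal sketch of the modelling idea, assuming synthetic data and hypothetical variable names: a multilevel regression with a random intercept per neighbourhood and a micro-by-meso interaction term, here fitted with statsmodels.

```python
# Hedged sketch of a multilevel model with a micro x meso interaction term,
# in the spirit of the Risk Terrain Modeling extension described above.
# All variable and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    # Micro-level feature of each grid cell, e.g. density of risky facilities.
    "facility_density": rng.exponential(1.0, n),
    # Meso-level neighbourhood deprivation score, constant within each group.
    "deprivation": np.repeat(rng.normal(0, 1, 40), n // 40),
    "neighbourhood": np.repeat(np.arange(40), n // 40),
})
df["crime_rate"] = (
    0.5 * df["facility_density"]
    + 0.3 * df["deprivation"]
    + 0.4 * df["facility_density"] * df["deprivation"]  # cross-level interaction
    + rng.normal(0, 1, n)
)

# Random intercept per neighbourhood; '*' expands to main effects + interaction.
model = smf.mixedlm(
    "crime_rate ~ facility_density * deprivation",
    df, groups=df["neighbourhood"],
).fit()
print(model.summary())
```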
APA, Harvard, Vancouver, ISO, and other styles
46

Tetour, Daniel. "Optimalizační modely rizik v energetických systémech." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2020. http://www.nusl.cz/ntk/nusl-414191.

Full text
Abstract:
The diploma thesis deals with mathematical modeling of the resource-allocation problem in an energy system with respect to the technical parameters of the available sources. The model includes random input variables affecting the amount of demand, together with constraints related to the associated risks. The thesis addresses control of the operation of various types of boilers and also extends the system with a heat storage tank, examining its impact on the behavior of the system and on the achieved results. The optimization model is based on a multi-period, two-stage scenario model of stochastic programming and works with simulated data combining real data, statistically determined estimates, and logistic regression. The implementation utilizes the GAMS software. Comparing the achieved results with the current state shows that the heat storage tank has a positive effect on the operation of the system: by storing surplus heat it allows extended use of the cheaper unregulated sources and thus helps reduce the overall costs of the system.
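The following toy sketch illustrates a scenario-based dispatch model with a heat storage tank in the spirit of the model described above (the thesis itself is implemented in GAMS). All costs, capacities, and demand scenarios are invented, and the formulation is deliberately simplified.

```python
# Toy scenario-based heat dispatch with storage, using PuLP. Invented data.
import pulp

periods = range(4)                                   # planning periods
scenarios = {"low": 0.3, "base": 0.5, "high": 0.2}   # scenario probabilities
demand = {"low": [40, 50, 45, 35], "base": [55, 70, 60, 50],
          "high": [70, 90, 80, 65]}                  # heat demand [MWh]
cost = {"cheap": 10.0, "peak": 35.0}                 # unit cost per boiler type
cap = {"cheap": 60.0, "peak": 100.0}                 # boiler capacities [MW]
store_cap, store0 = 30.0, 10.0                       # storage capacity, start level

prob = pulp.LpProblem("heat_dispatch", pulp.LpMinimize)
gen = pulp.LpVariable.dicts("gen", (cost, scenarios, periods), lowBound=0)
soc = pulp.LpVariable.dicts("soc", (scenarios, periods), lowBound=0,
                            upBound=store_cap)

# Objective: expected production cost over the demand scenarios.
prob += pulp.lpSum(p * cost[b] * gen[b][s][t]
                   for b in cost for s, p in scenarios.items() for t in periods)

for s in scenarios:
    for t in periods:
        prev = store0 if t == 0 else soc[s][t - 1]
        # Storage balance: production minus demand charges/discharges the tank.
        prob += soc[s][t] == prev + pulp.lpSum(gen[b][s][t] for b in cost) - demand[s][t]
        for b in cost:
            prob += gen[b][s][t] <= cap[b]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("expected cost:", pulp.value(prob.objective))
```

A genuine two-stage formulation would additionally force first-stage decisions (e.g. boiler commitment) to be identical across scenarios; here every decision is treated as recourse for brevity.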
APA, Harvard, Vancouver, ISO, and other styles
47

Chavy, Agathe Corinne. "Influence de l'environnement sur le cycle de transmission de la leishmaniose cutanée en Guyane, à multi-échelle spatiale Ecological niche modelling for predicting the risk of cutaneous leishmaniasis in the Neotropical moist forest biome Identification of French Guiana sand flies using MALDI-TOF mass spectrometry with a new mass spectra library « Regional scale ecological drivers of sandfly communities in French Guiana." Thesis, Guyane, 2019. http://www.theses.fr/2019YANE0013.

Full text
Abstract:
For many zoonotic diseases, the transmission cycle remains difficult to determine, especially when it involves generalist pathogens that can rely on several host and vector species for transmission. In addition, anthropogenic disturbances and climate change exert strong pressures on ecosystems and can alter pathogen transmission cycles. Characterizing and quantifying the relative importance of the factors influencing host-pathogen-vector systems is therefore central to a global approach aiming to understand pathogen dynamics at different spatial scales. This approach was used to study the ecology of the transmission cycle of cutaneous leishmaniasis (CL) in French Guiana. This vector-borne disease, mainly sylvatic and involving multiple hosts and vectors, is subject to growing anthropic pressures that have modified the dynamics of the cycle and increased the risk of transmission to human populations. In this work, we first explored the influence of environmental, climatic, and anthropic factors on the distribution of human CL cases at the global scale of the Amazonian biome and at the regional scale of French Guiana, using ecological niche modelling to build risk maps. We then observed the responses of communities of sandflies and known vectors at the regional scale in forest sites facing different levels of disturbance, using a metabarcoding approach with high-throughput sequencing. Last, we contributed to improving the range of tools available for the identification of sandflies using MALDI-TOF MS. This thesis contributed to improving the general knowledge of the CL cycle in French Guiana.
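As a minimal illustration of the niche-modelling idea, presence and background points with environmental covariates can be fed to a classifier whose predicted probability serves as a suitability surface. The covariates and data below are synthetic, not those used in the thesis.

```python
# Minimal presence/background sketch of ecological niche modelling.
# Covariates and data are synthetic; real workflows would use georeferenced
# case locations and remote-sensing layers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Hypothetical covariates: mean temperature, rainfall, forest cover fraction.
def sample_env(n, presence):
    temp = rng.normal(27 if presence else 24, 2, n)
    rain = rng.normal(2800 if presence else 2200, 400, n)
    forest = rng.beta(5 if presence else 2, 2, n)
    return np.column_stack([temp, rain, forest])

X = np.vstack([sample_env(300, True), sample_env(1000, False)])
y = np.concatenate([np.ones(300), np.zeros(1000)])  # presence vs background

model = LogisticRegression(max_iter=1000).fit(X, y)
# Suitability surface: predicted probability over a grid of conditions.
grid = np.column_stack([np.linspace(20, 32, 5),
                        np.linspace(1500, 3500, 5),
                        np.linspace(0.1, 0.9, 5)])
print(model.predict_proba(grid)[:, 1].round(2))
```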
APA, Harvard, Vancouver, ISO, and other styles
48

Jagadesh, Soushieta. "Biogeography of Emerging Infectious Diseases In search for the hotspots of Disease X: A biogeographic approach to mapping the predictive risk of WHO’s Blueprint Priority Diseases Emerging human infectious diseases of aquatic origin: a comparative biogeographic approach using Bayesian spatial modelling Global emergence of Buruli Ulcer Spatial variations between Leishmania species: A biogeographic approach to mapping the distribution of Leishmania species in French Guiana Mapping priority neighborhoods: A novel approach to cluster identification in HIV/AIDS population." Thesis, Guyane, 2020. http://www.theses.fr/2020YANE0007.

Full text
Abstract:
The COVID-19 pandemic highlights that the spread of infectious diseases goes beyond geographical boundaries. Simultaneous changes in local biodiversity and land use, increasing international connectivity through human transport and trade, and the imminent threat of climate change have increased the risk of the emergence and reemergence of infectious diseases. The current public health response to emerging infectious diseases (EIDs), based on passive surveillance, has proven largely ineffective in preventing and controlling disease outbreaks. The way forward is to "get ahead of the curve" by identifying potential hotspots of disease emergence and detecting environmental triggers such as land transformation, biodiversity loss, and climate change. I used a biogeographic approach to study and analyze disease emergence across different taxonomic pathogen groups (bacterial, viral, protozoal, and fungal), globally and in French Guiana, a French overseas territory located in South America. I found that exposure to flood risk, recent conversion of forest to agricultural land, and the increase in minimum (night-time) temperature caused by climate change were drivers of disease emergence, locally and globally, across the different pathogen groups. The main findings of the thesis are the following: 1. A biogeographic approach to mapping the distribution of EIDs, using existing human case data, remote-sensing imagery, and unconventional statistical models, is effective for getting ahead of the curve in detecting regions at risk and managing EIDs. 2. EIDs are not unprecedented but predictable, by identifying and managing the triggers of disease emergence, which are directly linked to the anthropization of the environment.
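A hedged sketch of the Bayesian flavour of such risk mapping, assuming synthetic data: a Bayesian logistic regression linking emergence events to hypothetical drivers, fitted with PyMC. A full spatial model would add, for example, a CAR or Gaussian-process term.

```python
# Hedged sketch of a Bayesian logistic model linking emergence events to
# environmental triggers. Data and driver names are synthetic, and the
# spatial random field used in real spatial modelling is omitted.
import numpy as np
import pymc as pm

rng = np.random.default_rng(3)
n = 400
# Hypothetical drivers: flood risk index, recent forest loss, night temperature.
X = rng.normal(0, 1, (n, 3))
true_beta = np.array([0.8, 1.1, 0.6])
y = rng.random(n) < 1 / (1 + np.exp(-(X @ true_beta - 1.0)))

with pm.Model() as model:
    alpha = pm.Normal("alpha", 0.0, 2.0)
    beta = pm.Normal("beta", 0.0, 1.0, shape=3)
    p = pm.math.sigmoid(alpha + pm.math.dot(X, beta))
    pm.Bernoulli("obs", p=p, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

# Posterior means of the driver effects.
print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)
```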
APA, Harvard, Vancouver, ISO, and other styles
49

Deobarro, Mikaël. "Etude de l'immunité des circuits intégrés face aux agressions électromagnétiques : proposition d'une méthode de prédiction des couplages des perturbations en mode conduit." Thesis, Toulouse, INSA, 2011. http://www.theses.fr/2011ISAT0002/document.

Full text
Abstract:
With the technological advances of recent decades, the complexity and operating speeds of integrated circuits have greatly increased. While these developments have reduced circuit dimensions and supply voltages, the electromagnetic compatibility (EMC) of components has been severely degraded. Identified as a key technological bottleneck, EMC is now one of the main causes of circuit re-design, because the issues related to noise generation and coupling mechanisms are not sufficiently studied during design. This manuscript therefore introduces a methodology to study the propagation of electromagnetic disturbances through integrated circuits by measurement and simulation. To improve our knowledge of the propagation of electromagnetic interference (EMI) and of coupling mechanisms through integrated circuits, we designed a test vehicle in the SMOS8MV® 0.25 µm technology from Freescale Semiconductor. In this circuit, several basic functions such as an I/O bus and digital blocks were implemented. Asynchronous on-chip voltage sensors were also integrated on different supplies of the chip to analyze the propagation of disturbances injected on the component's pins (DPI injection) and on the wires supplying it (BCI injection). In addition, we propose various tools to facilitate the modeling and simulation of integrated-circuit immunity: PCB model extraction, modeling approaches for the injection systems, an innovative method to predict and correlate the voltage/power levels injected during conducted immunity measurements, and a modeling flow. Each proposed tool and modeling method is evaluated on different test cases. Finally, to assess our modeling approach, we apply it to a digital block of our test vehicle and compare the simulation results with various internal and external measurements performed on the circuit.
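As a back-of-the-envelope illustration of predicting conducted injection levels, the sketch below estimates the voltage reaching an IC pin from the forward power of a 50 Ω DPI-style source and a toy RC pin model. The impedance values are assumptions for illustration, not the thesis's extracted models.

```python
# Toy estimate of the voltage at an IC pin during a DPI-style injection,
# from forward power and a simple series-RC pin impedance. Assumed values.
import numpy as np

Z0 = 50.0                      # source/line impedance [ohm]
f = np.logspace(6, 9, 5)       # 1 MHz .. 1 GHz
P_fwd = 1.0                    # forward injected power [W]

# Toy pin model: package resistance in series with pad capacitance to ground.
R_pin, C_pin = 10.0, 20e-12
Z_pin = R_pin + 1.0 / (1j * 2 * np.pi * f * C_pin)

# Generator EMF amplitude for a matched forward power: P = Vg^2 / (8 * Z0).
V_g = np.sqrt(8.0 * Z0 * P_fwd)
# Voltage divider between the source impedance and the pin impedance.
V_pin = np.abs(V_g * Z_pin / (Z_pin + Z0))

for fi, vi in zip(f, V_pin):
    print(f"{fi / 1e6:8.1f} MHz: |V_pin| = {vi:5.2f} V")
```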
APA, Harvard, Vancouver, ISO, and other styles