
Dissertations / Theses on the topic 'Extreme value modeling'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Extreme value modeling.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Shykhmanter, Dmytro. "Modeling Extreme Values." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-199737.

Full text
Abstract:
Modeling extreme events is a challenging statistical task. Firstly, there is always a limited number of observations, and secondly there is therefore little experience with which to back-test the results. One way of estimating high quantiles is to fit one of the theoretical distributions to the data and extrapolate into the tail. The shortcoming of this approach is that the estimate of the tail is based on the observations in the center of the distribution. An alternative approach is based on the idea of splitting the data into two sub-populations and modeling the body of the distribution separately from the tail. This methodology is applied to non-life insurance losses, where extremes are particularly important for risk management. Nevertheless, even this approach is not a conclusive solution to heavy-tail modeling. In either case, the estimated 99.5% percentiles have such high standard errors that their reliability is very low. On the other hand, the approach is theoretically valid and deserves to be considered as one of the possible methods of extreme value analysis.
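As a concrete illustration of the splitting-and-extrapolation idea described in this abstract, the sketch below fits a GPD to exceedances of a threshold and extrapolates a 99.5% quantile from the fitted tail. The synthetic lognormal losses, the 90th-percentile threshold and the helper code are assumptions made purely for illustration, not details taken from the thesis.

```python
# A minimal sketch of tail extrapolation with a GPD fitted above a threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=8.0, sigma=1.2, size=5000)  # hypothetical claim sizes

u = np.quantile(losses, 0.90)          # threshold separating body and tail (assumption)
excesses = losses[losses > u] - u      # exceedances above the threshold
xi, _, sigma = stats.genpareto.fit(excesses, floc=0)    # GPD shape and scale

# POT quantile formula: q_p = u + (sigma/xi) * ((n/N_u * (1-p))**(-xi) - 1)
n, n_u, p = len(losses), len(excesses), 0.995
q_995 = u + sigma / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)
print(f"threshold u = {u:,.0f}, GPD-based 99.5% quantile = {q_995:,.0f}")
print(f"empirical 99.5% quantile = {np.quantile(losses, p):,.0f}")
```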
APA, Harvard, Vancouver, ISO, and other styles
2

Yang, Fan. "Hurricane Loss Modeling and Extreme Quantile Estimation." FIU Digital Commons, 2012. http://digitalcommons.fiu.edu/etd/557.

Full text
Abstract:
This thesis reviewed various heavy-tailed distributions and Extreme Value Theory (EVT) to estimate the catastrophic losses simulated from the Florida Public Hurricane Loss Projection Model (FPHLPM). We compared risk measures such as Probable Maximum Loss (PML) and Tail Value at Risk (TVaR) of the selected distributions with empirical estimates to capture the characteristics of the loss data as well as its tail distribution. The Generalized Pareto Distribution (GPD) is the main focus for modeling the tail losses in this application. We found that the hurricane loss data generated from the FPHLPM were consistent with historical losses and were not as heavy as expected. The tail of the stochastic annual maximum losses can be explained by an exponential distribution. This thesis also touched on the philosophical implications of small-probability, high-impact events such as Black Swans and discussed the limitations of quantifying catastrophic losses for future inference using statistical methods.
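A hedged sketch of the kind of comparison this abstract describes, contrasting an empirical tail risk measure with a GPD-based one. The simulated losses, the threshold, the 1-in-100 level and the helper code are illustrative assumptions rather than the FPHLPM setup used in the thesis.

```python
# Compare empirical and GPD-based VaR/TVaR for a synthetic loss sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_losses = rng.gamma(shape=2.0, scale=5e8, size=2000)  # hypothetical annual losses

p = 0.99                                    # 1-in-100-year level (PML-style, assumed)
var_emp = np.quantile(annual_losses, p)
tvar_emp = annual_losses[annual_losses > var_emp].mean()    # empirical TVaR

u = np.quantile(annual_losses, 0.90)        # POT threshold (assumption)
exc = annual_losses[annual_losses > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)

n, n_u = len(annual_losses), len(exc)
var_gpd = u + sigma / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)
tvar_gpd = var_gpd + (sigma + xi * (var_gpd - u)) / (1 - xi)   # valid for xi < 1
print(f"empirical: VaR={var_emp:.3e}  TVaR={tvar_emp:.3e}")
print(f"GPD-based: VaR={var_gpd:.3e}  TVaR={tvar_gpd:.3e}")
```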
APA, Harvard, Vancouver, ISO, and other styles
3

Eriksson, Kristofer. "Risk Measures and Dependence Modeling in Financial Risk Management." Thesis, Umeå universitet, Institutionen för fysik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-85185.

Full text
Abstract:
In financial risk management it is essential to be able to model dependence in markets and portfolios in an accurate and efficient way. A high positive dependence between assets in a portfolio can be devastating, especially in times of crises, since losses will most likely occur at the same time in all assets for such a portfolio. The dependence is therefore directly linked to the risk of the portfolio. The risk can be estimated by several different risk measures, for example Value-at-Risk and Expected Shortfall. This paper studies some different ways to measure risk and model dependence, both in a theoretical and empirical way. The main focus is on copulas, which are a way to model and construct complex dependencies. Copulas are a useful tool since they allow the user to separately specify the marginal distributions and then link them together with the copula. However, copulas can be quite complex to understand and it is not trivial to know which copula to use. An implemented copula model might give the user a "black-box" feeling and a severe model risk if the user trusts the model too much and is unaware of what is going on. Another approach would be to use linear correlation, which is also a way to measure dependence. This is an easier model and as such it is believed to be easier for all users to understand. However, linear correlation is only easy to understand in the case of elliptical distributions, and when we move away from this assumption (which is usually the case in financial data), some clear drawbacks and pitfalls become apparent. A third model, called historical simulation, uses the historical returns of the portfolio and estimates the risk on these data without making any parametric assumptions about the dependence. The dependence is assumed to be incorporated in the historical evolution of the portfolio. This model is very easy and very popular, but it is more limited than the previous two models by the assumption that history will repeat itself, and it needs many more historical observations to yield good results. Here we face the risk that the market dynamics have changed when looking too far back in history. In this paper some different copula models are implemented and compared to the historical simulation approach by estimating risk with Value-at-Risk and Expected Shortfall. The parameters of the copulas are also investigated under calm and stressed market periods. This information about the parameters is useful when performing stress tests. The empirical study indicates that it is difficult to distinguish the parameters between the stressed and calm market periods. The overall conclusion is that which model to use depends on our beliefs about the future distribution. If we believe that the distribution is elliptical then a correlation model is good, if it is believed to have a complex dependence then the user should turn to a copula model, and if we can assume that history will repeat itself then historical simulation is advantageous.
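The copula idea summarised above (specify the marginals separately, then link them with a copula) can be sketched as follows. The Gaussian copula, Student's t marginals, correlation, equal weights and 99% level are all illustrative assumptions and not the models compared in the thesis.

```python
# Monte Carlo VaR/ES for a two-asset portfolio built from a Gaussian copula
# with separately chosen heavy-tailed marginals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
rho, nu, n_sim = 0.6, 5, 100_000
corr = np.array([[1.0, rho], [rho, 1.0]])

# 1) sample from the Gaussian copula: correlated normals -> uniforms
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n_sim)
u = stats.norm.cdf(z)

# 2) push the uniforms through the chosen marginals (Student's t returns)
returns = stats.t.ppf(u, df=nu) * 0.01      # daily returns, 1% scale assumed

# 3) portfolio loss and risk measures
loss = -returns.mean(axis=1)                # equally weighted portfolio
var_99 = np.quantile(loss, 0.99)            # Value-at-Risk
es_99 = loss[loss > var_99].mean()          # Expected Shortfall
print(f"99% VaR = {var_99:.4f}, 99% ES = {es_99:.4f}")
```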
APA, Harvard, Vancouver, ISO, and other styles
4

Hugueny, Samuel Y. "Novelty detection with extreme value theory in vital-sign monitoring." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:804a226c-a298-4764-9bc8-b191d2b852cd.

Full text
Abstract:
Every year in the UK, tens of thousands of hospital patients suffer adverse events, such as unplanned transfers to Intensive Therapy Units or unexpected cardiac arrests. Studies have shown that in a large majority of cases, significant physiological abnormalities can be observed within the 24-hour period preceding such events. Such warning signs may go unnoticed, if they occur between observations by the nursing staff, or are simply not identified as such. Timely detection of these warning signs and appropriate escalation schemes have been shown to improve both patient outcomes and the use of hospital resources, most notably by reducing patients’ length of stay. Automated real-time early-warning systems appear to be cost-efficient answers to the need for continuous vital-sign monitoring. Traditionally, a limitation of such systems has been their sensitivity to noisy and artefactual measurements, resulting in false-alert rates that made them unusable in practice, or earned them the mistrust of clinical staff. Tarassenko et al. (2005) and Hann (2008) proposed a novelty detection approach to the problem of continuous vital-sign monitoring, which, in a clinical trial, was shown to yield clinically acceptable false alert rates. In this approach, an observation is compared to a data fusion model, and its “normality” assessed by comparing a chosen statistic to a pre-set threshold. The method, while informed by large amounts of training data, has a number of heuristic aspects. This thesis proposes a principled approach to multivariate novelty detection in stochastic time-series, where novelty scores have a probabilistic interpretation, and are explicitly linked to the starting assumptions made. Our approach stems from the observation that novelty detection using complex multivariate, multimodal generative models is generally an ill-defined problem when attempted in the data space. In situations where “novel” is equivalent to “improbable with respect to a probability distribution”, formulating the problem in a univariate probability space allows us to use classical results of univariate statistical theory. Specifically, we propose a multivariate extension to extreme value theory and, more generally, order statistics, suitable for performing novelty detection in time-series generated from a multivariate, possibly multimodal model. All the methods introduced in this thesis are applied to a vital-sign monitoring problem and compared to the existing method of choice. We show that it is possible to outperform the existing method while retaining a probabilistic interpretation. In addition to their application to novelty detection for vital-sign monitoring, contributions in this thesis to existing extreme value theory and order statistics are also valid in the broader context of data-modelling, and may be useful for analysing data from other complex systems.
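The sketch below is a simplified stand-in for the probabilistic novelty-detection idea described above, not the thesis's actual method: a generative model scores observations by negative log-likelihood, and a GEV fitted to block maxima of the training scores supplies a probabilistic threshold. The vital-sign data, block size and model choice are assumptions.

```python
# Novelty detection via extreme values of a generative model's novelty scores.
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
normal_vitals = np.column_stack([rng.normal(80, 8, 5000),     # heart rate (illustrative)
                                 rng.normal(97, 1.5, 5000)])  # SpO2 (illustrative)

gmm = GaussianMixture(n_components=2, random_state=0).fit(normal_vitals)
scores = -gmm.score_samples(normal_vitals)        # novelty score = -log p(x)

# Fit a GEV to block maxima of the training scores; a high quantile defines "novel".
block = 50
maxima = scores[: len(scores) // block * block].reshape(-1, block).max(axis=1)
c, loc, scale = stats.genextreme.fit(maxima)
threshold = stats.genextreme.ppf(0.99, c, loc=loc, scale=scale)

new_obs = np.array([[150.0, 85.0]])               # hypothetical abnormal reading
print("novel" if -gmm.score_samples(new_obs)[0] > threshold else "normal")
```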
APA, Harvard, Vancouver, ISO, and other styles
5

Schmiedt, Anja Bettina [Verfasser]. "Statistical modeling of non-metallic inclusions in steels and extreme value analysis / Anja Bettina Schmiedt." Aachen : Hochschulbibliothek der Rheinisch-Westfälischen Technischen Hochschule Aachen, 2013. http://d-nb.info/1047230615/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Paholok, Igor. "Power Markets and Risk Management Modeling." Doctoral thesis, Vysoká škola ekonomická v Praze, 2012. http://www.nusl.cz/ntk/nusl-191803.

Full text
Abstract:
The main target of this thesis is to summarize and explain the specifics of power markets and to test the application of models which might be used especially in the risk management area. The thesis starts with a definition of market participants, a typology of traded contracts and a description of market development, with a focus on the Czech Republic. It continues with the development of theoretical concepts for short-term/spot electricity markets and the potential link between spot and forward electricity markets. After deriving these microeconomic fundamental models, we continue with stochastic models (a jump-diffusion mean-reverting process and Extreme Value Theory) in order to depict the patterns of spot and forward power contract price volatility. The last chapter deals with the credit-risk specifics of power trading and develops a model (using the concept known as Credit Value Adjustment) to compare the economic efficiency of OTC and exchange power trading. The developed models are tested on selected power markets, again with a focus on the Czech power market data set.
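For the stochastic part mentioned above, a jump-diffusion mean-reverting spot-price path can be sketched by Euler discretisation as below; all parameter values (reversion speed, long-run level, jump intensity and size) are illustrative assumptions, not estimates from the thesis.

```python
# Simulate a mean-reverting (Ornstein-Uhlenbeck) log-price with occasional jumps.
import numpy as np

rng = np.random.default_rng(3)
n_days, dt = 365, 1.0
kappa, mu, sigma = 0.05, np.log(50.0), 0.03     # reversion speed, long-run log level, diffusion vol
lam, jump_mu, jump_sigma = 0.02, 0.4, 0.2       # jump intensity per day, jump size in log price

x = np.empty(n_days)
x[0] = mu
for t in range(1, n_days):
    jump = rng.normal(jump_mu, jump_sigma) if rng.random() < lam * dt else 0.0
    x[t] = (x[t - 1]
            + kappa * (mu - x[t - 1]) * dt              # mean reversion
            + sigma * np.sqrt(dt) * rng.normal()        # diffusion
            + jump)                                     # price spike
prices = np.exp(x)
print(f"mean spot = {prices.mean():.1f}, max spike = {prices.max():.1f}")
```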
APA, Harvard, Vancouver, ISO, and other styles
7

Ayari, Samia. "Nonparametric estimation of the dependence function for multivariate extreme value distributions." Thesis, Aix-Marseille, 2016. http://www.theses.fr/2016AIXM4078.

Full text
Abstract:
In this thesis, we investigate the nonparametric estimation of the dependence function for multivariate extreme value distributions. Firstly, we assume independent and identically distributed (i.i.d.) random variables. Several nonparametric estimators are compared for a trivariate dependence function of logistic type in two different cases. In a first analysis, we suppose that the marginal functions are generalized extreme value distributions. In a second investigation, we substitute the marginal functions with the empirical distribution function. Monte Carlo simulations show that the Gudendorf-Segers (Gudendorf and Segers, 2011) estimator outperforms the other estimators for different sample sizes. Secondly, we drop the i.i.d. assumption, as it is not verified in time series analysis. Considering the univariate framework, we examine the extremal behavior of a stationary Gaussian autoregressive process. In the multivariate setting, we prove the asymptotic consistency of the Pickands dependence function estimator. This theoretical finding is confirmed by empirical investigations in the asymptotic independence case as well as the asymptotic dependence case. Finally, the Gudendorf-Segers estimator is used to model the dependence structure of extreme ozone concentrations at locations that record several exceedances of both the guideline and limit values of the Tunisian air quality standard NT.106.04.
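A minimal sketch of the (uncorrected) Pickands estimator referred to above, evaluated on rank-based pseudo-observations. The Gudendorf-Segers estimator studied in the thesis adds endpoint corrections that are omitted here, and the simulated block-maxima data are an assumption used only to exercise the estimator.

```python
# Basic Pickands estimator of the bivariate dependence function A(t).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 2000
# crude stand-in for extreme-value data: componentwise block maxima of correlated t pairs
g = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=(n, 50))
w = rng.chisquare(df=3, size=(n, 50, 1)) / 3
xy = (g / np.sqrt(w)).max(axis=1)

# pseudo-uniform observations via ranks, then unit-exponential margins
u = (stats.rankdata(xy[:, 0]) - 0.5) / n
v = (stats.rankdata(xy[:, 1]) - 0.5) / n
s, t_exp = -np.log(u), -np.log(v)

def pickands(t):
    """Pickands estimator: 1 / mean of min(S/(1-t), T/t), evaluated in (0, 1)."""
    return len(s) / np.minimum(s / (1 - t), t_exp / t).sum()

grid = np.linspace(0.05, 0.95, 19)
a_hat = np.array([pickands(t) for t in grid])
print(np.round(a_hat, 3))   # A(t)=1: independence; A(t)=max(t,1-t): full dependence
```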
APA, Harvard, Vancouver, ISO, and other styles
8

Wang, Pu. "Modeling, analysis, and optimization for wireless networks in the presence of heavy tails." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/50232.

Full text
Abstract:
The heavy-tailed traffic from wireless users, caused by the emerging Internet and multimedia applications, induces an extremely dynamic and variable network environment, which can fundamentally change the way in which wireless networks are conceived, designed, and operated. This thesis is concerned with the modeling, analysis, and optimization of wireless networks in the presence of heavy tails. First, a novel traffic model is proposed, which captures the inherent relationship between the traffic dynamics and the joint effects of the mobility variability of network users and the spatial correlation in their observed physical phenomenon. Next, the asymptotic delay distribution of wireless users is analyzed under different traffic patterns and spectrum conditions, which reveals the critical conditions under which wireless users can experience heavy-tailed delay with significantly degraded QoS performance. Based on the delay analysis, the fundamental impact of the heavy-tailed environment on network stability is studied. Specifically, a new network stability criterion, namely moment stability, is introduced to better characterize the QoS performance in the heavy-tailed environment. Accordingly, a throughput-optimal scheduling algorithm is proposed to maximize network throughput while guaranteeing moment stability. Furthermore, the impact of heavy-tailed spectrum on network connectivity is investigated. Towards this, the necessary conditions for the existence of delay-bounded connectivity are derived. To enhance network connectivity, the mobility-assisted data forwarding scheme is exploited, whose important design parameters, such as the critical mobility radius, are derived. Moreover, the latency in wireless mobile networks is analyzed, which exhibits asymptotic linearity in the initial distance between mobile users.
APA, Harvard, Vancouver, ISO, and other styles
9

Luong, Thang Manh. "Severe Weather during the North American Monsoon and Its Response to Rapid Urbanization and a Changing Global Climate within the Context of High Resolution Regional Atmospheric Modeling." Diss., The University of Arizona, 2015. http://hdl.handle.net/10150/595660.

Full text
Abstract:
The North American monsoon (NAM) is the principal driver of summer severe weather in the Southwest U.S. With sufficient atmospheric instability and moisture, monsoon convection initiates during daytime in the mountains and later may organize, principally into mesoscale convective systems (MCSs). Most monsoon-related severe weather occurs in association with organized convection, including microbursts, dust storms, flash flooding and lightning. The overarching theme of this dissertation research is to investigate the simulation of monsoon severe weather due to organized convection through the use of regional atmospheric modeling. A commonly used cumulus parameterization scheme has been modified to better account for dynamic pressure effects, resulting in an improved representation of a simulated MCS during the North American Monsoon Experiment and of the climatology of warm season precipitation in a long-term regional climate model simulation. The effect of urbanization on organized convection occurring in Phoenix is evaluated in model sensitivity experiments using an urban canopy model (UCM) and urban land cover compared to pre-settlement natural desert land cover. The presence of vegetation and irrigation makes Phoenix a "heat sink" in comparison to its surrounding desert, and as a result the modeled precipitation in response to urbanization decreases within the Phoenix urban area and increases on its periphery. Finally, an analysis of how monsoon severe weather is changing in association with observed global climate change is considered within the context of a series of retrospectively simulated severe weather events during the period 1948-2010 in a numerical weather prediction paradigm. The individual severe weather events are identified by favorable thermodynamic conditions of instability and atmospheric moisture (precipitable water). Changes in precipitation extremes are evaluated with extreme value statistics. During the last several decades, there has been an intensification of organized convective precipitation, but these events occur with less frequency. A more favorable thermodynamic environment for monsoon thunderstorms is the driver of these changes, which is consistent with the broader notion that anthropogenic climate change is presently intensifying weather extremes worldwide.
APA, Harvard, Vancouver, ISO, and other styles
10

Wooten, Rebecca Dyanne. "Statistical environmental models : hurricanes, lightning, rainfall, floods, red tide and volcanoes." [Tampa, Fla] : University of South Florida, 2006. http://purl.fcla.edu/usf/dc/et/SFE0001824.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Suzuki-Parker, Asuka. "An assessment of uncertainties and limitations in simulating tropical cyclone climatology and future changes." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/41062.

Full text
Abstract:
The recent elevated North Atlantic hurricane activity has generated considerable interest in the interaction between tropical cyclones (TCs) and climate change. The possible connection between TCs and the changing climate has been indicated by observational studies based on historical TC records; they indicate emerging trends in TC frequency and intensity in some TC basins, but the detection of trends has been hotly debated due to TC track data issues. Dynamical climate modeling has also been applied to the problem, but brings its own set of limitations owing to limited model resolution and uncertainties. The final goal of this study is to project the future changes of North Atlantic TC behavior with global warming for the next 50 years using the Nested Regional Climate Model (NRCM). Throughout the course of reaching this goal, various uncertainties and limitations in simulating TCs by the NRCM are identified and explored. First we examine the TC tracking algorithm used to detect and track simulated TCs from model output. The criteria and thresholds used in the tracking algorithm control the simulated TC climatology, making it difficult to objectively assess the model's ability in simulating TC climatology. Existing tracking algorithms used by previous studies are surveyed and it is found that the criteria and thresholds are very diverse. The sensitivity of the simulated TC climatology to varying criteria and thresholds in the tracking algorithm is very high, especially for the intensity and duration thresholds. It is found that the commonly used criteria may not be strict enough to filter out intense extratropical systems and hybrid systems. We propose that a better distinction between TCs and other low-pressure systems can be achieved by adding the Cyclone Phase technique. Two sets of NRCM simulations are presented in this dissertation: one in hindcasting mode, and the other with forcing from the Community Climate System Model (CCSM) to project into the future with global warming. Both of these simulations are assessed using the tracking algorithm with the cyclone phase technique. The NRCM is run in hindcasting mode for the global tropics in order to assess its ability to simulate the current observed TC climatology. It is found that the NRCM is capable of capturing the general spatial and temporal distributions of TCs, but tends to overproduce TCs, particularly in the Northwest Pacific. The overprediction of TCs is associated with the overall convective tendency in the model, together with an outstanding theory of wave energy accumulation leading to TC genesis. On the other hand, TC frequency in the tropical North Atlantic is underpredicted due to the lack of moist African Easterly Waves. The importance of high resolution is shown with an additional simulation with two-way nesting. The NRCM is then forced by the CCSM to project the future changes in North Atlantic TCs. An El Niño-like SST bias in the CCSM induced a high vertical wind shear in the tropical North Atlantic, preventing TCs from forming in this region. A simple bias correction method is applied to remove this bias. The model projected an increase in both TC frequency and intensity owing to enhanced TC genesis in the main development region, where the model projects an increased favorability of the large-scale environment for TC genesis. However, the model is not capable of explicitly simulating intense (Category 3-5) storms due to the limited model resolution.
To extrapolate the prediction to intense storms, we propose a hybrid approach that combines the model results with statistical modeling using extreme value theory. Specifically, the currently observed TC intensity is statistically modeled with the generalized Pareto distribution, and the simulated intensity changes from the NRCM are applied to the statistical model to project the changes in intense storms. The results suggest that the occurrence of Category 5 storms may increase by approximately 50% by 2055.
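The hybrid idea in the final paragraph can be sketched as follows: a generalized Pareto distribution is fitted to storm intensities, and a model-projected intensity change is pushed through it to see how the Category-5 exceedance probability responds. The synthetic wind speeds, the threshold and the +4% intensity change below are assumptions, not results from the dissertation.

```python
# GPD fit to storm intensities plus a projected intensity shift.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# hypothetical lifetime-maximum winds (m/s), drawn from a GPD purely for illustration
vmax = 33.0 + stats.genpareto.rvs(0.1, scale=10.0, size=800, random_state=rng)

u, cat5 = 45.0, 70.0                               # POT threshold and Cat-5 cut-off (m/s)
exc = vmax[vmax > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
rate = (vmax > u).mean()                           # fraction of storms exceeding u

def p_exceed(level, shift=1.0):
    """P(intensity > level) when every intensity is scaled by `shift` (level/shift > u)."""
    return rate * stats.genpareto.sf(level / shift - u, xi, loc=0, scale=sigma)

p_now = p_exceed(cat5)
p_future = p_exceed(cat5, shift=1.04)              # assumed +4% intensity change
print(f"P(Cat 5) per storm: now {p_now:.4f}, future {p_future:.4f} "
      f"({100 * (p_future / p_now - 1):.0f}% increase)")
```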
APA, Harvard, Vancouver, ISO, and other styles
12

Winter, Hugo. "Extreme value modelling of heatwaves." Thesis, Lancaster University, 2016. http://eprints.lancs.ac.uk/79961/.

Full text
Abstract:
Since the turn of the century record temperatures have been observed in at least 20 different countries across Europe. Isolated hot days are not often an issue; most devastation occurs when hot temperatures persist over many days. For example, the 2003 heatwave over Europe caused 40,000 deaths over a four week period at a cost of €13.1 million to the agriculture sector. It is clear that accurate models for the risks associated with heatwaves are important to decision makers and planners who wish to reduce the number of people affected by these extreme events. Extreme value theory provides a statistical framework for modelling extreme events. Extreme value models for temperature data tend to focus solely on the intensity, overlooking how long periods of hot weather will last and what the spatial extent of the event will be. For heatwaves, it is vital to explicitly model extremal dependence in time and space. An aim of this thesis is to develop extreme value methods that can accurately capture the temporal evolution of heatwaves. Specifically, it is the first to use a broad class of asymptotically motivated dependence structures that can provide accurate inferences for different types of extremal dependence and over different orders of lagged dependence. This flexibility ensures that these models are less likely to dramatically under- or over-estimate the risks of heatwave events. Climate change is now widely regarded as a driving force behind increased global temperatures. Extending the extreme value heatwave models to include covariate structure permits answers to critical questions such as: How will a 1°C warming in the global temperature increase the chance of a 2003-style event? The 2009 heatwave over Australia highlighted the issues posed when multiple cities are affected simultaneously. Both Adelaide and Melbourne observed record temperatures during the same event, which led to 374 deaths and 2,000 people being treated for heat-related illness. It is not enough for heatwave models to account for temporal dependence; they also need to explicitly model spatial dependence. Large-scale climatic phenomena such as the El Niño-Southern Oscillation are known to affect temperatures across Australia. This thesis develops new spatial extreme value methods that account for covariates, which are shown to model the 2009 event well. A novel suite of spatial and temporal risk measures is designed to better understand whether these covariates have an effect on the spatial extent and duration of heatwaves. This provides important information for decision makers that is not available using current methodology.
APA, Harvard, Vancouver, ISO, and other styles
13

Adam, Mohd Bakri. "Extreme value modelling of sports data." Thesis, Lancaster University, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.444854.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Allen, David W. "Software for Manipulating and Embedding Data Interrogation Algorithms Into Integrated Systems." Thesis, Virginia Tech, 2004. http://hdl.handle.net/10919/35117.

Full text
Abstract:
In this study a software package for easily creating and embedding structural health monitoring (SHM) data interrogation processes in remote hardware is presented. The software described herein is comprised of two pieces. The first is a client to allow graphical construction of data interrogation processes. The second is node software for remote execution of processes on remote sensing and monitoring hardware. The client software is created around a catalog of data interrogation algorithms compiled over several years of research at Los Alamos National Laboratory known as DIAMOND II. This study also includes encapsulating the DIAMOND II algorithms into independent interchangeable functions and expanding the catalog with work in feature extraction and statistical discrimination. The client software also includes methods for interfacing with the node software over an Internet connection. Once connected, the client software can upload a developed process to the integrated sensing and processing node. The node software has the ability to run the processes and return results. This software creates a distributed SHM network without individual nodes relying on each other or a centralized server to monitor a structure. For the demonstration summarized in this study, the client software is used to create data collection, feature extraction, and statistical modeling processes. Data are collected from monitoring hardware connected to the client by a local area network. A structural health monitoring process is created on the client and uploaded to the node software residing on the monitoring hardware. The node software runs the process and monitors a test structure for induced damage, returning the current structural-state indicator in near real time to the client. Current integrated health monitoring systems rely on processes statically loaded onto the monitoring node before the node is deployed in the field. The primary new contribution of this study is a software paradigm that allows processes to be created remotely and uploaded to the node in a dynamic fashion over the life of the monitoring node without taking the node out of service.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
15

Han, Zhongxian. "Actuarial modelling of extremal events using transformed generalized extreme value distributions and generalized pareto distributions." Columbus, Ohio : Ohio State University, 2003. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1061227080.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2003.
Title from first page of PDF file. Document formatted into pages; contains x, 81 p.; also includes graphics (some col.). Includes abstract and vita. Advisor: Bostwick Wyman, Dept. of Mathematics. Includes bibliographical references (p. 80-81).
APA, Harvard, Vancouver, ISO, and other styles
16

Youngman, Ben. "Space-time modelling of extreme values." Thesis, University of Sheffield, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.555232.

Full text
Abstract:
The motivation for the work in this thesis is the study of models for extreme values that have clear practical benefit. Specific emphasis is placed on the modelling of extremes of environmental phenomena, which often exhibit spatial or temporal dependence, or both, or are forced by external factors. The peaks-over-threshold approach to modelling extremes combats temporal dependence, providing a way in which likelihood-based methods may be used reliably. The method is widely used and has sound asymptotic foundations. However, its performance in practical situations is less well understood. The essence of the method is to identify clusters of extremes and estimate the required extremal properties based only on the cluster peaks. A simulation study is used here to assess the performance of the method. This study shows that, while not robust to some of its arbitrary choices, such as the cluster identification procedure, if clusters are identified using Ferro and Segers' (2003) automatic procedure then the peaks-over-threshold method typically gives accurate estimates of extremal properties. It is common for extreme values to be affected by external factors. For example, environmental extremes may be expected to behave differently at different times of the year. Incorporating beliefs about external factors was recognised early on in the development of extremal models as an important consideration, and a simple way in which this can be achieved is by allowing parameters of extremal distributions to depend on covariates. The work here considers whether choosing logical covariate forms for variation in parameters leads to improved estimation of extremal properties. It is found that a degree of improved accuracy in estimates can be achieved upon choice of a suitable model, but that the uncertainty in estimates, which is important to report, is poorly quantified.
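A hedged sketch of the peaks-over-threshold workflow discussed above, using a simplified reading of Ferro and Segers' (2003) intervals estimator and automatic declustering. The AR(1) series, the threshold choice and the exact declustering bookkeeping are assumptions made for illustration, not the thesis's simulation design.

```python
# Decluster exceedances of an AR(1) series and fit a GPD to cluster peaks.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 10_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):                       # AR(1) series -> clustered extremes
    x[i] = 0.7 * x[i - 1] + rng.normal()

u = np.quantile(x, 0.97)
times = np.flatnonzero(x > u)               # exceedance times
gaps = np.diff(times)                       # interexceedance times
N = len(times)

# intervals estimator of the extremal index (simplified reading)
if gaps.max() <= 2:
    theta = min(1.0, 2 * gaps.sum() ** 2 / ((N - 1) * (gaps ** 2).sum()))
else:
    theta = min(1.0, 2 * (gaps - 1).sum() ** 2
                / ((N - 1) * ((gaps - 1) * (gaps - 2)).sum()))

c = max(1, int(theta * N))                  # approximate number of clusters
seps = np.sort(np.argsort(gaps)[-(c - 1):]) if c > 1 else np.array([], int)
clusters = np.split(times, seps + 1)        # split exceedances at the largest gaps
peaks = np.array([x[cl].max() for cl in clusters if len(cl)])

xi, _, sigma = stats.genpareto.fit(peaks - u, floc=0)
print(f"extremal index ~ {theta:.2f}, {len(peaks)} cluster peaks, GPD xi = {xi:.2f}")
```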
APA, Harvard, Vancouver, ISO, and other styles
17

Wong, Siu-tung, and 王兆東. "On some issues in the modelling of extreme observations." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B4218258X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Wong, Siu-tung. "On some issues in the modelling of extreme observations." Click to view the E-thesis via HKUTO, 2009. http://sunzi.lib.hku.hk/hkuto/record/B4218258X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Dalne, Katja. "The Performance of Market Risk Models for Value at Risk and Expected Shortfall Backtesting : In the Light of the Fundamental Review of the Trading Book." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-206168.

Full text
Abstract:
The global financial crisis that took off in 2007 gave rise to several adjustments of the risk regulation for banks. An extensive adjustment, to be implemented in 2019, is the Fundamental Review of the Trading Book (FRTB). It proposes to use Expected Shortfall (ES) as the risk measure instead of the currently used Value at Risk (VaR), as well as applying varying liquidity horizons based on the various risk levels of the assets involved. A major difficulty of implementing the FRTB lies within the backtesting of ES. Righi and Ceretta propose a robust ES backtest based on Monte Carlo simulation. It is flexible since it does not assume any probability distribution and can be performed without waiting for an entire backtesting period. Implementing some commonly used VaR backtests as well as the ES backtest by Righi and Ceretta yields an indication of which risk models are the most accurate from both a VaR and an ES backtesting perspective. It can be concluded that a model that is satisfactory from a VaR backtesting perspective does not necessarily remain so from an ES backtesting perspective and vice versa. Overall, the models that are satisfactory from a VaR backtesting perspective turn out to be probably too conservative from an ES backtesting perspective. Considering the confidence levels proposed by the FRTB, from a VaR backtesting perspective a risk measure model with a normal copula and a hybrid distribution with the generalized Pareto distribution in the tails and the empirical distribution in the center, along with GARCH filtration, is the most accurate one, whereas from an ES backtesting perspective a risk measure model with a univariate Student’s t distribution with ν ≈ 7 together with GARCH filtration is the most accurate one for implementation. Thus, when implementing the FRTB, the bank will need to compromise between obtaining a good VaR model, potentially resulting in conservative ES estimates, and obtaining a less satisfactory VaR model, possibly resulting in more accurate ES estimates. The thesis was performed at SAS Institute, an American IT company that, among other things, develops software for risk management. Targeted customers are banks and other financial institutions. Investigating the FRTB acts as a potential advantage for the company when approaching customers that are to implement the regulatory framework in the near future.
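One of the "commonly used VaR backtests" mentioned above can be sketched with Kupiec's unconditional-coverage (proportion-of-failures) test; the exception counts below are hypothetical, and the Righi-Ceretta ES backtest used in the thesis is not reproduced here.

```python
# Kupiec proportion-of-failures test for VaR exceptions.
import numpy as np
from scipy import stats

def kupiec_pof(n_obs: int, n_exceptions: int, p: float = 0.01) -> float:
    """Return the p-value of the proportion-of-failures likelihood-ratio test."""
    x, n = n_exceptions, n_obs
    pi_hat = x / n
    # log-likelihood of the exception count under H0 (rate p) vs. under the MLE (pi_hat)
    ll0 = (n - x) * np.log(1 - p) + x * np.log(p)
    ll1 = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat) if 0 < x < n else 0.0
    lr = -2.0 * (ll0 - ll1)
    return 1.0 - stats.chi2.cdf(lr, df=1)

# e.g. 250 trading days, 6 exceptions of a 99% VaR (hypothetical numbers)
print(f"p-value = {kupiec_pof(250, 6, p=0.01):.3f}")
```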
Risk management, financial time series, Value at Risk, Expected Shortfall, Monte Carlo simulation, GARCH modelling, copulas, hybrid distributions, generalized Pareto distribution, extreme value theory, backtesting, liquidity horizons, Basel regulatory framework
APA, Harvard, Vancouver, ISO, and other styles
20

Ben, Abdallah Nadia. "Modeling sea-level rise uncertainties for coastal defence adaptation using belief functions." Thesis, Compiègne, 2014. http://www.theses.fr/2014COMP1616.

Full text
Abstract:
Coastal adaptation is an imperative to deal with the elevation of the global sea level caused by the ongoing global warming. However, when defining adaptation actions, coastal engineers encounter substantial uncertainties in the assessment of future hazards and risks. These uncertainties may stem from limited knowledge (e.g., about the magnitude of the future sea-level rise) or from the natural variability of some quantities (e.g., extreme sea conditions). A proper consideration of these uncertainties is of principal concern for efficient design and adaptation. The objective of this work is to propose a methodology for uncertainty analysis based on the theory of belief functions – an uncertainty formalism that offers greater features to handle both aleatory and epistemic uncertainties than probabilities. In particular, it allows experts’ incomplete knowledge (quantiles, intervals, etc.) to be represented more faithfully and multi-source evidence to be combined taking into account its dependences and reliabilities. Statistical evidence can be modeled by likelihood-based belief functions, which are simply the translation of some inference principles into evidential terms. By exploiting the mathematical equivalence between belief functions and random intervals, uncertainty can be propagated through models by Monte Carlo simulations. We use this method to quantify uncertainty in future projections of the elevation of the global sea level by 2100 and evaluate its impact on some coastal risk indicators used in coastal design. Sea-level rise projections are derived from physical modelling, expert elicitation, and historical sea-level measurements. Then, within a methodologically oriented case study, we assess the impact of climate change on extreme sea conditions and evaluate the reinforcement of a typical coastal defence asset so that its functional performance is maintained.
APA, Harvard, Vancouver, ISO, and other styles
21

MacDonald, Anna Elizabeth. "Extreme value mixture modelling with medical and industrial applications." Thesis, University of Canterbury. Mathematics and Statistics, 2011. http://hdl.handle.net/10092/6679.

Full text
Abstract:
Extreme value models are typically used to describe the distribution of rare events. Generally, an asymptotically motivated extreme value model is used to approximate the tail of some population distribution. One of the key challenges, with even the simplest application of extreme value models, is to determine the “threshold” above which (if interested in the upper tail) the asymptotically motivated model provides a reliable approximation to the tail of the population distribution. The threshold choice is essentially a balance in the usual bias versus variance tradeoff. Practitioners should choose as high a threshold as possible, such that the asymptotic approximation is reliable, i.e. little bias, but not so high that there is insufficient data to reliably estimate the model parameters, i.e. increasing variance. Traditionally, graphical diagnostics evaluating various properties of the model fit have been used to determine the threshold. Once chosen via these diagnostics, the threshold is treated as a fixed quantity, hence the uncertainty associated with its estimation is not accounted for. A plethora of recent articles have proposed various extreme value mixture models for threshold estimation and quantifying the corresponding uncertainty. Further, the subjectivity of threshold estimation is removed as the mixture models typically treat the threshold as a parameter, so it can be objectively estimated using standard inference tools, avoiding the aforementioned graphical diagnostics. These mixture models are typically easy to automate for application to multiple data sets, or in forecasting situations, for which various ad hoc adaptations have had to be made in the past to overcome the threshold estimation problem. The drawback with most of the mixture models currently in the literature is the prior specification of a parametric model for the bulk of the distribution, which can be sensitive to model misspecification. In particular, misspecification of the bulk model’s lower tail behaviour can have a large impact on the bulk fit and therefore on the upper tail fit, which is a serious concern. Non-parametric and semi-parametric alternatives have very recently been proposed, but these tend to suffer from complicated computational aspects in the inference or challenges with interpretation of the final estimated tail behaviour. This thesis focusses on developing a flexible extremal mixture model which splices together the usual extreme value model for the upper tail behaviour, with the threshold as a parameter, and the “bulk” of the distribution below the threshold captured by a non-parametric kernel density estimator. This representation avoids the need to specify a priori a particular parametric model for the bulk distribution, and only really requires the trivial assumption of a smooth density, which is realistic in most applications. This model overcomes sensitivity to the specification of the bulk distribution (and in particular its lower tail). Inference for all the parameters, including the threshold and the kernel bandwidth, is carried out in a Bayesian paradigm, potentially allowing sources of expert information to be included, which can help with the inherent sparsity of extremal sample information. A simulation study is used to demonstrate the performance of the proposed mixture model.
A known problem with the kernel density estimators used in the original extremal mixture model proposed is that they suffer from edge effects if the (lower) tail does not decay away to zero at the boundary. Various adaptations have been proposed in the nonparametric density estimation literature, which have been used within this thesis to extend the extreme value mixture model to overcome this issue, i.e. producing a boundary corrected kernel density estimator for the bulk distribution component of my extremal mixture model. An alternative approach of replacing both the upper and lower tails by extremal tail models is also shown to resolve the boundary correction issue, and also has the secondary benefits of robustness of standard kernel bandwidth estimators against outliers in the tail, and a consistent estimator of the bandwidth for heavy-tailed populations. This research further extends the novel mixture model to describe non-stationary features. Extension of the other mixture models seen in the literature to model non-stationarity appears rather complex, as they require specification of not only how the usual threshold and point process parameters vary over time or space but also how those of the bulk distribution component of the models do. The benefit of this particular mixture model is that the non-stationarity in the threshold and point process parameters can be modeled in the usual way(s), with the only other parameter being the kernel bandwidth, where it is safe in most applications to assume that it does not vary or will typically vary very slowly. The non-stationary mixture also automatically accounts for the uncertainty associated with estimation of the parameters of the time-varying threshold, which no other non-stationary extremal model in the literature has achieved thus far. Results from simulations and an application using Bayesian inference are given to assess the performance of the model. Further, a goal of this research is to contribute to the refinement of our understanding of “normal ranges” for high frequency physiological measurements from pre-term babies. Clinicians take various physiological measurements from premature babies in neonatal intensive care units (NICUs) for assessing the condition of the neonate. These measurements include oxygen saturation, pulse rates and respiration rates. It is known that there are deficiencies in our knowledge of “normal ranges”, hence refinement of the ranges essentially requires reliable estimation of relatively high quantiles (e.g. 95% or 99%). Models proposed within this thesis are applied to pulse rates and/or oxygen saturation levels of neonates in Christchurch Women’s Hospital, New Zealand. A further application of the stationary extremal mixture model is for assessing the risk of certain temperature levels within cores of Magnox nuclear reactors, combining predictions from a detailed statistical model for temperature prediction and extremal modelling of the residuals for assessing the remaining uncertainty.
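The spliced density at the heart of the mixture model described above can be sketched as a kernel density estimate below the threshold and a GPD above it, joined through a tail fraction. Unlike the thesis, the threshold, bandwidth and tail fraction are simply fixed here rather than inferred in a Bayesian way, and the simulated data and helper names are assumptions.

```python
# Density of a KDE-bulk / GPD-tail extreme value mixture.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
data = rng.standard_t(df=4, size=3000)            # heavy-tailed sample (illustrative)

u = np.quantile(data, 0.90)                       # fixed threshold (assumption)
phi = np.mean(data > u)                           # empirical tail fraction
kde = stats.gaussian_kde(data)                    # bulk model (kernel density)
h_cdf_u = kde.integrate_box_1d(-np.inf, u)        # KDE mass below the threshold
xi, _, sigma = stats.genpareto.fit(data[data > u] - u, floc=0)

def mixture_pdf(x):
    x = np.asarray(x, dtype=float)
    bulk = (1 - phi) * kde(x) / h_cdf_u           # renormalised KDE below u
    tail = phi * stats.genpareto.pdf(x - u, xi, loc=0, scale=sigma)
    return np.where(x <= u, bulk, tail)

grid = np.array([0.0, u, u + 1.0, u + 3.0])
print(np.round(mixture_pdf(grid), 4))
```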
APA, Harvard, Vancouver, ISO, and other styles
22

Zhao, Xin. "Extreme value modelling with application in finance and neonatal research." Thesis, University of Canterbury. Mathematics and Statistics, 2010. http://hdl.handle.net/10092/4024.

Full text
Abstract:
Modelling the tails of distributions is important in many fields, such as environmental science, hydrology, insurance, engineering and finance, where the risk of unusually large or small events is of interest. This thesis applies extreme value models in neonatal and finance studies and develops novel extreme value modelling for financial applications, to overcome issues associated with the dependence induced by volatility clustering and threshold choice. The instability of preterm infants stimulates interest in estimating the underlying variability of the physiological measurements typically taken on neonatal intensive care patients. The stochastic volatility model (SVM), fitted using Bayesian inference and a particle filter to capture the on-line latent volatility of oxygen concentration, is used in estimating the variability of medical measurements of preterm infants to highlight instabilities resulting from their under-developed biological systems. Alternative volatility estimators are considered to evaluate the performance of the SVM estimates, the results of which suggest that the stochastic volatility model provides a good estimator of the variability of the oxygen concentration data and therefore may be used to estimate the instantaneous latent volatility for the physiological measurements of preterm infants. The classical extreme value distribution, the generalized Pareto distribution (GPD), is then applied with the peaks-over-threshold (POT) method, which ameliorates the impact of dependence in the extremes, to infer extreme quantiles of the SVM-based variability estimates. Financial returns typically show clusters of observations in the tails, often termed “volatility clustering”, which creates challenges when applying extreme value models, since classical extreme value theory assumes independence of the underlying process. Explicit modelling of the GARCH-type dependence behaviour of extremes is developed by implementing a GARCH conditional variance structure via the extreme value model parameters. With the combination of GEV and GARCH models, both simulation and empirical results show that the combined model is better suited to explaining the extreme quantiles. Another important benefit of the proposed model is that, as a one-stage model, it makes inference and the accounting of all uncertainties much easier than the traditional two-stage approach for capturing this dependence. To tackle the challenge of threshold choice in extreme value modelling and the generally asymmetric distribution of financial data, a two-tail GPD mixture model is proposed with Bayesian inference to capture both upper and lower tail behaviours simultaneously. The proposed two-tail GPD mixture modelling approach can estimate both thresholds, along with other model parameters, and can therefore account for the uncertainty associated with the threshold choice in later inferences. The two-tail GPD mixture model provides a very flexible model for capturing all forms of tail behaviour, potentially allowing for asymmetry in the distribution of the two tails, and is demonstrated to be more applicable in financial applications than the one-tail GPD mixture models previously proposed in the literature.
A new Value-at-Risk (VaR) estimation method is then constructed by adopting the proposed mixture model within a two-stage approach: volatility is first estimated using a latent volatility model (or realized volatility), and the two-tail GPD mixture model is then applied to the independent innovations, to overcome the key issues of dependence and to account for the uncertainty associated with threshold choice. The proposed method is applied in forecasting VaR for empirical return data during the current financial crisis period.
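The two-stage VaR construction summarised above can be sketched as follows, with an EWMA filter standing in for the latent-volatility step and a single GPD fitted to the lower tail of the standardised innovations (the thesis uses the two-tail mixture). The simulated returns and parameter choices are assumptions.

```python
# Two-stage conditional VaR: volatility filter, then GPD quantile of innovations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
returns = rng.standard_t(df=5, size=2500) * 0.01   # hypothetical daily returns

lam = 0.94                                         # EWMA decay (assumption)
var_t = np.empty_like(returns)
var_t[0] = returns.var()
for t in range(1, len(returns)):
    var_t[t] = lam * var_t[t - 1] + (1 - lam) * returns[t - 1] ** 2
sigma_t = np.sqrt(var_t)

z = returns / sigma_t                              # standardised innovations
losses = -z
u = np.quantile(losses, 0.90)                      # threshold on standardised losses
exc = losses[losses > u] - u
xi, _, scale = stats.genpareto.fit(exc, floc=0)

p, n, n_u = 0.99, len(z), len(exc)
z_q = u + scale / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)   # 99% quantile of losses
sigma_next = np.sqrt(lam * var_t[-1] + (1 - lam) * returns[-1] ** 2)
print(f"one-day 99% VaR forecast = {sigma_next * z_q:.4f} (as a positive loss)")
```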
APA, Harvard, Vancouver, ISO, and other styles
23

Navarrete, Miguel A. Ancona. "Dependence modelling and spatial prediction for extreme values." Thesis, Lancaster University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.369658.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Gyarmati-Szabo, Janos. "Statistical extreme value modelling to study roadside air pollution episodes." Thesis, University of Leeds, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.551267.

Full text
Abstract:
Motivated by the potential danger of high air pollution concentrations (episodes) to human health and the environment, the overall aim of this thesis is to gain a greater understanding of and insight into the formation of such episodic conditions by proposing new extreme value statistical models. The modelling and prediction of air pollution episodes' occurrence, strength and duration are formidable problems in the urban atmospheric medium due to the combination of many complex, simultaneously working physical and chemical processes involved in their formation. It has long been observed that conventional statistical methods may not be suitable for solving these problems, thus initiating the application of more flexible approaches. In the last couple of decades Extreme Value Theory (EVT) has been widely used with great success to overcome some of the aforementioned issues. However, even the most recent EVT models cannot deal with all the aspects of these problems. The objective of this research is to specify the requirements of new extreme value models by taking into account the shortcomings of the old ones, to develop such new models and to validate their adequacy on real datasets. To place this research in relation to the wide-ranging existing literature and to identify the model requirements, a comprehensive review of EVT and its applications in air pollution modelling has been conducted. Based on the gaps identified in the literature, four extreme value models are proposed in the peaks-over-threshold context, which are either improvements on existing models or completely new ones involving new theoretical results in the background. Based on these models, and their possible amalgamations, the occurrence times, the strengths and the durations of episodes can be modelled and predicted. The relationships between these characteristics and meteorological as well as traffic conditions, which are considered the most significant contributors to these events, are identified.
APA, Harvard, Vancouver, ISO, and other styles
25

Eljabri, Sumaya Saleh M. "New statistical models for extreme values." Thesis, University of Manchester, 2013. https://www.research.manchester.ac.uk/portal/en/theses/new-statistical-models-for-extreme-values(12e1ec08-dc66-4f20-a7dc-c89be62421a0).html.

Full text
Abstract:
Extreme value theory (EVT) has wide applicability in several areas like hydrology, engineering, science and finance. Across the world, we can see the disruptive effects of flooding due to heavy rains or storms. Many countries in the world are suffering from natural disasters like heavy rains, storms, floods, and also higher temperatures leading to desertification. One of the best known extraordinary natural disasters is the 1931 Huang He flood, which led to around 4 million deaths in China; this was a series of floods between July and November 1931 on the Huang He river. Several publications are focused on how to find the best model for these events and on how to predict their behaviour. Normal, log-normal, Gumbel, Weibull, Pearson type, 4-parameter Kappa, Wakeby and GEV distributions are presented as statistical models for extreme events. However, GEV and GP distributions seem to be the most widely used models for extreme events. In spite of that, these models have been misused as models for extreme values in many areas. The aim of this dissertation is to create new modifications of univariate extreme value models. The modifications developed in this dissertation are divided into two parts: in the first part, we make generalisations of the GEV and GP, referred to as the Kumaraswamy GEV and Kumaraswamy GP distributions. The major benefit of these models is their ability to fit skewed data better than other models. The other idea in this study comes from Chen, which is presented in Proceedings of the International Conference on Computational Intelligence and Software Engineering, pp. 1-4. However, the cumulative and probability density functions for this distribution do not appear to be valid functions. The correction of this model is presented in chapter 6. The major problem in extreme event models is the ability of the model to fit the tails of the data. In chapter 7, the idea of the Chen model with the correction is combined with the GEV distribution to introduce a new model for extreme values referred to as the new extreme value (NEV) distribution. It seems to be more flexible than the GEV distribution.
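The Kumaraswamy-generalised construction behind the KumGEV model above takes a baseline CDF G and forms F(x) = 1 - (1 - G(x)^a)^b with shape parameters a, b > 0, reducing to the GEV when a = b = 1. The sketch below evaluates this CDF and the corresponding density; the parameter values and helper names are arbitrary assumptions, not the fits reported in the thesis.

```python
# Kumaraswamy-GEV CDF and PDF built from a GEV baseline.
import numpy as np
from scipy import stats

def kumgev_cdf(x, a, b, shape, loc=0.0, scale=1.0):
    """CDF of the Kumaraswamy-GEV: Kumaraswamy transform of a GEV baseline."""
    g = stats.genextreme.cdf(x, shape, loc=loc, scale=scale)
    return 1.0 - (1.0 - g ** a) ** b

def kumgev_pdf(x, a, b, shape, loc=0.0, scale=1.0):
    """Density obtained by differentiating the Kumaraswamy transform."""
    g = stats.genextreme.cdf(x, shape, loc=loc, scale=scale)
    dg = stats.genextreme.pdf(x, shape, loc=loc, scale=scale)
    return a * b * dg * g ** (a - 1) * (1.0 - g ** a) ** (b - 1)

x = np.linspace(-2, 6, 5)
print(np.round(kumgev_cdf(x, a=2.0, b=0.5, shape=-0.1), 4))  # note: scipy's c = -xi
print(np.round(kumgev_pdf(x, a=2.0, b=0.5, shape=-0.1), 4))
```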
APA, Harvard, Vancouver, ISO, and other styles
26

Wyncoll, David Peter. "State space modelling of extreme values with particle filters." Thesis, Lancaster University, 2009. http://eprints.lancs.ac.uk/31479/.

Full text
Abstract:
State space models are a flexible class of Bayesian model that can be used to smoothly capture non-stationarity. Observations are assumed independent given a latent state process so that their distribution can change gradually over time. Sequential Monte Carlo methods known as particle filters provide an approach to inference for such models whereby observations are added to the fit sequentially. Though originally developed for on-line inference, particle filters, along with related particle smoothers, often provide the best approach for off-line inference. This thesis develops new results for particle filtering and in particular develops a new particle smoother that has a computational complexity that is linear in the number of Monte Carlo samples. This compares favourably with the quadratic complexity of most of its competitors resulting in greater accuracy within a given time frame. The statistical analysis of extremes is important in many fields where the largest or smallest values have the biggest effect. Accurate assessments of the likelihood of extreme events are crucial to judging how severe they could be. While the extreme values of a stationary time series are well understood, datasets of extremes often contain varying degrees of non-stationarity. How best to extend standard extreme value models to account for non-stationary series is a topic of ongoing research. The thesis develops inference methods for extreme values of univariate and multivariate non-stationary processes using state space models fitted using particle methods. Though this approach has been considered previously in the univariate case, we identify problems with the existing method and provide solutions and extensions to it. The application of the methodology is illustrated through the analysis of a series of world class athletics running times, extreme temperatures at a site in the Antarctic, and sea-level extremes on the east coast of England.
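As background to the particle methods discussed in the abstract, the following is a minimal bootstrap particle filter (sequential importance resampling) for a toy Gaussian random-walk state with Gaussian observations. It is a generic sketch of the filtering step only; the thesis's linear-complexity smoother and its extreme value observation models are not reproduced here, and all numerical settings are illustrative assumptions.

```python
# Minimal bootstrap particle filter sketch for a toy state space model
# (Gaussian random-walk state, Gaussian observations).
import numpy as np

def bootstrap_filter(y, n_particles=1000, state_sd=0.1, obs_sd=1.0, rng=None):
    rng = np.random.default_rng(rng)
    x = rng.normal(0.0, 1.0, n_particles)                 # initial particles
    means = []
    for obs in y:
        x = x + rng.normal(0.0, state_sd, n_particles)    # propagate particles
        logw = -0.5 * ((obs - x) / obs_sd) ** 2           # weight by likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                       # filtered posterior mean
        idx = rng.choice(n_particles, n_particles, p=w)   # multinomial resampling
        x = x[idx]
    return np.array(means)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = np.cumsum(rng.normal(0.0, 0.1, 200))
    y = truth + rng.normal(0.0, 1.0, 200)
    print(bootstrap_filter(y, rng=1)[-5:])
```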
APA, Harvard, Vancouver, ISO, and other styles
27

Ramos, Alexandra. "Multivariate joint tail modelling and score tests of independence." Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/843207/.

Full text
Abstract:
Probabilistic and statistical aspects of extremes of univariate processes have been extensively studied, and recent developments in extremes have focused on multivariate theory and its application. Multivariate extreme value theory encompasses two separate aspects: marginal features, which may be handled by standard univariate methods, and dependence features. Both will be examined in this study. First we focus on testing independence in multivariate extremes. All existing score tests of independence in multivariate extreme values have non-regular properties that arise due to violations of the usual regularity conditions of maximum likelihood. Some of these violations may be dealt with using standard techniques, for example when independence corresponds to a boundary point of the parameter space of the underlying model. However, another type of regularity violation, the infinite second moment of the score function, is more difficult to deal with and has important consequences for applications, resulting in score statistics with non-standard normalisation and poor rates of convergence. We propose a likelihood based approach that provides asymptotically normal score tests of independence with regular normalisation and rapid convergence. The resulting tests are straightforward to implement and are beneficial in practical situations with realistic amounts of data. A fundamental issue in applied multivariate extreme value (MEV) analysis is modelling dependence within joint tail regions. The primary aim of the remainder of this thesis is to develop a pseudo-polar framework for modelling extremal dependence that extends the existing classical results for multivariate extremes to encompass asymptotically independent tails. Accordingly, a constructional procedure for obtaining parametric asymptotically independent joint tail models is developed. The practical application of this framework is analysed through applications to bivariate simulated and environmental data, and joint estimation of dependence and marginal parameters via likelihood methodology is detailed. Inference under our models is examined and tests of extremal asymptotic independence and asymmetry are derived which are useful for model selection. In contrast to the classical MEV approach, which concentrates on the distribution of the normalised componentwise maxima, our framework is based on modelling joint tails and focuses directly on the tail structure of the joint survivor function. Consequently, this framework provides significant extensions of both the theoretical and applicable tools of joint tail modelling. Analogous point process theory is developed and the classical componentwise maxima result for multivariate extremes is extended to the asymptotically independent case. Finally, methods for simulating from two of our bivariate parametric models are provided.
APA, Harvard, Vancouver, ISO, and other styles
28

Rivera, Mancía María Elena. "Modelling operational risk using a Bayesian approach to extreme value theory." Thesis, McGill University, 2014. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=123216.

Full text
Abstract:
Extreme-value theory is concerned with the tail behaviour of probability distributions. In recent years, it has found many applications in areas as diverse as hydrology, actuarial science, and finance, where complex phenomena must often be modelled from a small number of observations. Extreme-value theory can be used to assess the risk of rare events either through the block maxima or the peaks-over-threshold method. The choice of threshold is both influential and delicate, as a balance between the bias and variance of the estimates is required. At present, this threshold is often chosen arbitrarily, either graphically or by setting it as some high quantile of the data. Bayesian inference is an alternative way to deal with this problem by treating the threshold as a parameter in the model. In addition, a Bayesian approach allows for the incorporation of internal and external observations in combination with expert opinion, thereby providing a natural probabilistic framework in which to evaluate risk models. This thesis presents a Bayesian inference framework for extremes. We focus on a model proposed by Behrens et al. (2004), where an analysis of extremes is performed using a mixture model that combines a parametric form for the centre and a Generalized Pareto Distribution (GPD) for the tail of the distribution. Our approach accounts for all the information available in making inference about the unknown parameters of both distributions, the threshold included. A Bayesian analysis is then performed by using expert opinions to determine the parameters of the prior distributions; posterior inference is carried out through Markov Chain Monte Carlo methods. We apply this methodology to operational risk data to analyze its performance. The contributions of this thesis can be outlined as follows. First, Bayesian models have barely been explored in operational risk analysis; in Chapter 3, we show how these models can be adapted to operational risk analysis using fraud data collected by different banks between 2007 and 2010, and by combining prior information with the data we can estimate the minimum capital requirement and risk measures such as the Value-at-Risk (VaR) and the Expected Shortfall (ES) for each bank. Second, the use of expert opinion plays a fundamental role in operational risk modelling, yet most of the time this issue is not addressed properly; in Chapter 4, we consider the context of the problem and show how to construct a prior distribution based on measures that experts are familiar with, including VaR and ES, the purpose being to facilitate prior elicitation and reproduce expert judgement faithfully. Third, in Section 4.3, we describe techniques for the combination of expert opinions; while this issue has been addressed in other fields, it is relatively recent in our context, and we examine how different expert opinions may influence the posterior distribution and how to build a prior distribution in this case, with results presented on simulated and real data. Fourth, in Chapter 5, we propose several new mixture models with Gamma and Generalized Pareto elements; our models improve upon the previous work of Behrens et al. (2004) since the loss distribution is either continuous at a fixed quantile or has a continuous first derivative at the blend point, and we also consider the cases where the scaling is arbitrary and where the density is discontinuous. Finally, we introduce two nonparametric models: the first is based on the fact that the GPD model can be represented as a Gamma mixture of exponential distributions, while the second uses a Dirichlet process prior on the parameters of the GPD model.
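To make the bulk-plus-tail construction concrete, here is a minimal sketch (my own illustration, not the thesis code) of a gamma-bulk / GPD-tail mixture log-likelihood in which the threshold u is treated as an unknown parameter, in the spirit of the Behrens et al. (2004) model described above. The exact parameterisation and parameter names are assumptions.

```python
# Sketch of a gamma-bulk / GPD-tail mixture log-likelihood with the threshold u
# treated as an unknown parameter (assumed parameterisation, not the thesis code).
import numpy as np
from scipy.stats import gamma, genpareto

def mixture_loglik(params, x):
    shape, scale, u, sigma, xi = params
    if min(shape, scale, sigma) <= 0 or u <= x.min():
        return -np.inf
    below = x[x <= u]
    above = x[x > u]
    tail_mass = gamma.sf(u, a=shape, scale=scale)   # P(X > u) under the bulk model
    ll = gamma.logpdf(below, a=shape, scale=scale).sum()
    ll += above.size * np.log(tail_mass)
    ll += genpareto.logpdf(above, c=xi, loc=u, scale=sigma).sum()
    return ll

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.gamma(2.0, 1.5, 2000)
    print(mixture_loglik([2.0, 1.5, np.quantile(x, 0.9), 1.0, 0.1], x))
```

In a Bayesian analysis this log-likelihood would be combined with priors, elicited from experts where available, and explored by MCMC.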
APA, Harvard, Vancouver, ISO, and other styles
29

Hitz, Adrien. "Modelling of extremes." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:ad32f298-b140-4aae-b50e-931259714085.

Full text
Abstract:
This work focuses on statistical methods to understand how frequently rare events occur and what the magnitude of extreme values, such as large losses, is. It lies in a field called extreme value analysis, whose scope is to provide support for scientific decision making when extreme observations are of particular importance, such as in environmental applications, insurance and finance. In the univariate case, I propose new techniques to model tails of discrete distributions and illustrate them in an application on word frequency and multiple birth data. Suitably rescaled, the limiting tails of some discrete distributions are shown to converge to a discrete generalized Pareto distribution and a generalized Zipf distribution respectively. In the multivariate high-dimensional case, I suggest modeling tail dependence between random variables by a graph such that its nodes correspond to the variables and shocks propagate through the edges. Relying on the ideas of graphical models, I prove that if the variables satisfy a new notion called asymptotic conditional independence, then the density of the joint distribution can be simplified and expressed in terms of lower dimensional functions. This generalizes the Hammersley-Clifford theorem and enables us to infer tail distributions from observations in reduced dimension. As an illustration, extreme river flows are modeled by a tree graphical model whose structure appears to recover almost exactly the actual river network. A fundamental concept when studying limiting tail distributions is regular variation. I propose a new notion in the multivariate case called one-component regular variation, for which generalizations of Karamata's theorem and the representation theorem, two important results in the univariate case, are established. Finally, I turn my attention to website visit data and fit a censored copula Gaussian graphical model allowing the visualization of users' behavior by a graph.
APA, Harvard, Vancouver, ISO, and other styles
30

Hu, Yang. "Extreme Value Mixture Modelling with Simulation Study and Applications in Finance and Insurance." Thesis, University of Canterbury. Mathematics and Statistics, 2013. http://hdl.handle.net/10092/8538.

Full text
Abstract:
Extreme value theory has been used to develop models for describing the distribution of rare events. Models based on extreme value theory can be used to asymptotically approximate the behavior of the tail(s) of the distribution function. An important challenge in the application of such extreme value models is the choice of a threshold, beyond which point the asymptotically justified extreme value models can provide good extrapolation. One approach for determining the threshold is to fit all of the available data with an extreme value mixture model. This thesis will review most of the existing extreme value mixture models in the literature and implement them in a package for the statistical programming language R to make them more readily usable by practitioners, as they are not commonly available in any software. There are many different forms of extreme value mixture models in the literature (e.g. parametric, semi-parametric and non-parametric), which provide an automated approach for estimating the threshold and taking into account the uncertainties of threshold selection. However, it is not clear how the proportion above the threshold, or tail fraction, should be treated, as there is no consistency in the existing model derivations. This thesis will develop some new models by adapting existing ones in the literature and placing them all within a more generalized framework that takes into account how the tail fraction is defined in the model. Various new models are proposed by extending some of the existing parametric mixture models to have a continuous density at the threshold, which has the advantage of using fewer model parameters and being more physically plausible. The generalised framework within which all the mixture models are placed can be used to demonstrate the importance of the specification of the tail fraction. An R package called evmix has been created to enable these mixture models to be more easily applied and further developed. For every mixture model, the density, distribution, quantile, random number generation, likelihood and fitting functions are presented (Bayesian inference via MCMC is also implemented for the non-parametric extreme value mixture models). A simulation study investigates the performance of the various extreme value mixture models under different population distributions with a representative variety of lower and upper tail behaviors. The results show that the non-parametric mixture model based on a kernel density estimator is able to provide good tail estimation in general, whilst the parametric and semi-parametric mixture models can give a reasonable fit if the distribution below the threshold is correctly specified. Somewhat surprisingly, it is found that including a constraint of continuity at the threshold does not substantially improve the model fit in the upper tail. The hybrid Pareto model performs poorly as it does not include the tail fraction term. The relevant mixture models are applied in insurance and financial applications, which highlight the practical usefulness of these models.
APA, Harvard, Vancouver, ISO, and other styles
31

Franco, Villoria Maria. "Temporal and spatial modelling of extreme river flow values in Scotland." Thesis, University of Glasgow, 2013. http://theses.gla.ac.uk/4017/.

Full text
Abstract:
Extreme river flows can lead to inundation of floodplains, with consequent impacts for society, the environment and the economy. Flood risk estimates rely on river flow records, hence a good understanding of the patterns in river flow, and, in particular, in extreme river flow, is important to improve estimation of risk. In Scotland, a number of studies suggest a West to East rainfall gradient and increased variability in rainfall and river flow. This thesis presents and develops a number of statistical methods for analysis of different aspects of extreme river flows, namely the variability, temporal trend, seasonality and spatial dependence. The methods are applied to a large data set, provided by SEPA, of daily river flow records from 119 gauging stations across Scotland. The records range in length from 10 up to 80 years and are characterized by non-stationarity and long-range dependence. Examination of non-stationarity is done using wavelets. The results revealed significant changes in the variability of the seasonal pattern over the last 40 years, with periods of high and low variability associated with flood-rich and flood-poor periods respectively. Results from a wavelet coherency analysis suggest significant influence of large scale climatic indices (NAO, AMO) on river flow. A quantile regression model is then developed based on an additive regression framework using P-splines, where the parameters are fitted via weighted least squares. The proposed model includes a trend and seasonal component, estimated using the back-fitting algorithm. Incorporation of covariates and extension to higher dimension data sets is straightforward. The model is applied to a set of eight Scottish rivers to estimate the trend and seasonality in the 95th quantile of river flow. The results suggest differences in the long term trend between the East and the West and a more variable seasonal pattern in the East. Two different approaches are then considered for modelling spatial extremes. The first approach consists of a conditional probability model and concentrates on small subsets of rivers. Then a spatial quantile regression model is developed, extending the temporal quantile model above to estimate a spatial surface using the tensor product of the marginal B-spline bases. Residual spatial correlation using a Gaussian correlation function is incorporated into standard error estimation. Results from the 95th quantile fitted for individual months suggest changes in the spatial pattern of extreme river flow over time. The extension of the spatial quantile model to build a fully spatio-temporal model is briefly outlined and the main statistical issues identified.
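As a rough illustration of quantile regression for a high quantile of river flow, the sketch below minimises the pinball (check) loss over a deliberately simplified basis, a linear trend plus one annual harmonic, instead of the P-spline basis and back-fitting algorithm developed in the thesis. The data, basis and optimiser choices are assumptions for illustration only.

```python
# Simplified sketch of quantile regression for the 95th quantile of a daily
# series, using a linear trend plus an annual harmonic as the design matrix.
import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, X, y, tau=0.95):
    r = y - X @ beta
    return np.sum(np.maximum(tau * r, (tau - 1.0) * r))

def fit_quantile(t, y, tau=0.95):
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t / 365.25),
                         np.cos(2 * np.pi * t / 365.25)])
    beta0 = np.zeros(X.shape[1])
    res = minimize(pinball_loss, beta0, args=(X, y, tau), method="Nelder-Mead")
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(3650.0)                                     # ten years of daily data
    y = 50 + 0.002 * t + 20 * np.sin(2 * np.pi * t / 365.25) + rng.gamma(2, 10, t.size)
    print(fit_quantile(t, y))
```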
APA, Harvard, Vancouver, ISO, and other styles
32

Wilson, Paul Sinclair. "A physical approach to statistical modelling with implications for extreme values." Thesis, Imperial College London, 2005. http://hdl.handle.net/10044/1/11958.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Backman, Emil, and David Petersson. "Evaluation of methods for quantifying returns within the premium pension." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-288499.

Full text
Abstract:
Pensionsmyndigheten's (the Swedish Pensions Agency's) current calculation of the internal rate of return for 7.7 million premium pension savers is both time and resource consuming. This rate of return mirrors the overall performance of the funded part of the pension system and is analyzed internally, but also reported to the public monthly and yearly based on differently sized data samples. This thesis aims to investigate the possibility of utilizing other approaches in order to improve the performance of these calculations. Further, the study aims to verify the results stemming from said calculations and investigate their robustness. In order to investigate competitive matrix methods, a sample of approaches is compared with the more classical numerical methods. The approaches are compared in different scenarios aimed at mirroring real practice. The robustness of the results is then analyzed by a stochastic modeling approach, where a small error term is introduced to mimic possible errors that could arise in data management. It is concluded that a combination of Halley's method and the Jacobi-Davidson algorithm is the most robust and high-performing method. The proposed method combines the speed of numerical methods with the robustness of matrix methods. The result shows a performance improvement of 550% in time, while maintaining the accuracy of the current server computations. The analysis of error propagation suggests that the output error is less than 0.12 percentage points in 99 percent of cases, even for an introduced error term of large proportions. In this extreme case, the modeled expected number of individuals with an error exceeding 1 percentage point is estimated to be 212 out of the whole population.
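The internal rate of return r solves an equation of the form NPV(r) = sum_i c_i (1 + r)^(-t_i) = 0, and Halley's method uses the first two derivatives of NPV to accelerate the root search. The sketch below only illustrates that root-finding step under an assumed cash-flow convention (contributions positive, final account value negative); it is not the agency's production code and does not show the Jacobi-Davidson matrix component.

```python
# Hedged sketch of Halley's method applied to an internal-rate-of-return
# equation NPV(r) = sum_i c_i (1 + r)^(-t_i) = 0 for signed cash flows c_i
# at times t_i (years), with the final account value entered as negative.
import numpy as np

def npv_and_derivatives(r, c, t):
    d = (1.0 + r) ** (-t)
    f = np.sum(c * d)                                  # NPV
    f1 = np.sum(-t * c * d / (1.0 + r))                # first derivative
    f2 = np.sum(t * (t + 1) * c * d / (1.0 + r) ** 2)  # second derivative
    return f, f1, f2

def irr_halley(c, t, r0=0.05, tol=1e-12, max_iter=50):
    r = r0
    for _ in range(max_iter):
        f, f1, f2 = npv_and_derivatives(r, c, t)
        step = 2 * f * f1 / (2 * f1**2 - f * f2)       # Halley update
        r -= step
        if abs(step) < tol:
            break
    return r

if __name__ == "__main__":
    # deposits of 100 at years 0..4, account worth 600 at year 5
    c = np.array([100.0] * 5 + [-600.0])
    t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    print(irr_halley(c, t))
```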
APA, Harvard, Vancouver, ISO, and other styles
34

Engberg, Alexander. "An empirical comparison of extreme value modelling procedures for the estimation of high quantiles." Thesis, Uppsala universitet, Statistiska institutionen, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297063.

Full text
Abstract:
The peaks over threshold (POT) method provides an attractive framework for estimating the risk of extreme events such as severe storms or large insurance claims. However, the conventional POT procedure, where the threshold excesses are modelled by a generalized Pareto distribution, suffers from small samples and subjective threshold selection. In recent years, two alternative approaches have been proposed in the form of mixture models that estimate the threshold and a folding procedure that generates larger tail samples. In this paper the empirical performances of the conventional POT procedure, the folding procedure and a mixture model are compared by modelling data sets on fire insurance claims and hurricane damage costs. The results show that the folding procedure gives smaller standard errors of the parameter estimates and in some cases more stable quantile estimates than the conventional POT procedure. The mixture model estimates are dependent on the starting values in the numerical maximum likelihood estimation, and are therefore difficult to compare with those from the other procedures. The conclusion is that none of the procedures is overall better than the others but that there are situations where one method may be preferred.
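For orientation, a bare-bones peaks-over-threshold estimate of a high quantile looks roughly as follows: pick a high threshold, fit a GPD to the exceedances and invert the tail formula. This is a generic sketch with an arbitrarily chosen threshold (the empirical 95th percentile) and simulated claims, not the folding or mixture procedures compared in the paper.

```python
# Generic peaks-over-threshold sketch: fit a GPD to threshold exceedances with
# scipy and read off a high quantile of the original variable.
import numpy as np
from scipy.stats import genpareto

def pot_quantile(x, p=0.999, threshold_q=0.95):
    u = np.quantile(x, threshold_q)
    exc = x[x > u] - u
    xi, _, sigma = genpareto.fit(exc, floc=0.0)   # shape and scale, loc fixed at 0
    zeta = exc.size / x.size                      # exceedance probability of u
    # invert P(X > x) = zeta * (1 + xi (x - u) / sigma)^(-1/xi)
    return u + genpareto.ppf(1.0 - (1.0 - p) / zeta, c=xi, scale=sigma)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    claims = rng.pareto(2.5, 20000) * 10.0        # simulated heavy-tailed claims
    print(pot_quantile(claims, p=0.999))
```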
APA, Harvard, Vancouver, ISO, and other styles
35

Dawkins, Laura Claire. "Statistical modelling of European windstorm footprints to explore hazard characteristics and insured loss." Thesis, University of Exeter, 2016. http://hdl.handle.net/10871/21791.

Full text
Abstract:
This thesis uses statistical modelling to better understand the relationship between insured losses and hazard footprint characteristics for European windstorms (extra-tropical cyclones). The footprint of a windstorm is defined as the maximum wind gust speed to occur at a set of spatial locations over the duration of the storm. A better understanding of this relationship is required because the most damaging historical windstorms have had footprints with differing characteristics. Some have a large area of relatively low wind gust speeds, while others have a smaller area of higher wind gust speeds. In addition, this insight will help to explain the surprising, sharp decline in European wind-related losses in the mid-1990s. This novel exploration is based on 5730 high-resolution, model-generated historical footprints (1979-2012) representing the whole European domain. Functions of extreme footprint wind gust speeds, known as storm severity measures, are developed to represent footprint characteristics. Exploratory data analysis is used to compare which storm severity measures are most successful at classifying 23 extreme windstorms known to have caused large insured losses. Summarising the footprint using these scalar severity measures, however, fails to capture different combinations of spatial scale and local intensity characteristics. To overcome this, a novel statistical model for windstorm footprints is developed, initially for pairs of locations using a bivariate Gaussian copula model, and subsequently extended to represent the whole European domain using a geostatistical spatial model. Throughout, the distribution of wind gust speeds at each location is modelled using a left-truncated Generalised Extreme Value (GEV) distribution. Synthetic footprints, simulated from the geostatistical model, are then used in a sensitivity study to explore whether the local intensity or spatial dependence structure of a footprint has the most influence on insured loss. This contributes a novel example of sensitivity analysis applied to a stochastic natural hazards model. The area of the footprint exceeding 25 m s−1 over land is the most successful storm severity measure at classifying extreme loss windstorms, ranking all 23 within the top 18% of events. Marginally transformed wind gust speeds are identified as being asymptotically independent and second-order stationary, allowing the spatial dependence to be represented by a geostatistical covariance function. The geostatistical windstorm footprint model is able to quickly (∼3 seconds) simulate synthetic footprints which realistically represent joint losses throughout Europe. The sensitivity study identifies that the left-truncated GEV parameters have a greater influence on insured loss than the geostatistical spatial dependence parameters. The observed decline in wind-related losses in the 1990s can therefore be attributed to a change in the local intensity rather than the spatial structure of footprint wind gust speeds.
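The best-performing severity measure described above is simple to state: the land area of the footprint where the gust speed exceeds 25 m/s. A toy sketch, with an assumed regular grid, land mask and cell area, could be:

```python
# Sketch of the exceedance-area storm severity measure: the land area of a
# footprint where the maximum gust speed exceeds 25 m/s. Grid, land mask and
# cell area are illustrative assumptions.
import numpy as np

def severity_area(gust, land_mask, cell_area_km2):
    """Area (km^2) of land grid cells whose max gust exceeds 25 m/s."""
    exceed = (gust > 25.0) & land_mask
    return np.sum(exceed * cell_area_km2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gust = rng.gamma(9.0, 2.5, (100, 100))     # toy footprint, mean about 22.5 m/s
    land = rng.random((100, 100)) < 0.6        # toy land mask
    print(severity_area(gust, land, cell_area_km2=25.0))
```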
APA, Harvard, Vancouver, ISO, and other styles
36

Chailan, Romain. "Application of Scientific Computing and Statistical Analysis to address Coastal Hazards." Thesis, Montpellier, 2015. http://www.theses.fr/2015MONTS168/document.

Full text
Abstract:
Studies and management of coastal hazards are of high concern in our society, since highly valuable economic and ecological stakes are involved. Coastal hazards generally arise from extreme environmental conditions, and the study of these physical phenomena relies on an understanding of such conditions, which are rarely (or even never) observed. In coastal areas, waves are the main source of energy; this energy drives coastal hazards that develop at different time-scales, such as submersion or erosion. The presented work, at the interface between statistical analysis, geophysics and computer science, aims to provide tools and methods to decision makers in charge of managing such risks. In practice, the proposed solutions address these questions spatially rather than only at single points, a more natural approach given that environmental phenomena such as sea-wave fields are generally spatial. The study of extreme realisations of such processes relies on the availability of a representative data set, in both time and space, allowing information to be extrapolated beyond the actual observations. For sea-wave fields in particular, we use numerical simulation on high performance computing (HPC) clusters to produce such a data set. The outcome of this work offers many possible applications. Most notably, we propose from this data set two statistical methodologies, addressing respectively long-term littoral hazards (e.g., erosion) and event-scale hazards (e.g., submersion). The first is based on so-called max-stable stochastic models, which are particularly suited to the study of extreme values in a spatial context: in addition to the marginal information, max-stable models take into account the spatial dependence structure of the observed extreme processes. Our results show the benefit of this method, compared with approaches that neglect the spatial dependence of these phenomena, for the computation of risk indices. The second approach is a semi-parametric method for simulating extreme space-time wave processes. These processes, interpreted as storms, are controlled bivariate amplifications of extreme episodes that have already been observed; in other words, we create storms more severe than those already observed. Simulated at a controlled intensity, they can feed littoral physical models in order to describe a very extreme event in both space and time, helping decision makers to anticipate hazards not yet observed. Finally, building on these extreme scenarios, we introduce a pre-computing paradigm whose goal is to provide decision makers with accurate, near-real-time information in case of a sudden coastal crisis, without running any physical simulation at that moment. This work fits into a growing industrial demand for modelling support, most notably the chaining of numerical and statistical models. Consequently, the industrial dimension of this PhD is mostly dedicated to the design and development of a prototype modelling platform, which aims to use HPC resources systematically to run simulations and to ease the chaining of models. By addressing questions related to the management of coastal hazards, this thesis demonstrates the benefits of research placed at the interface between several domains, and it answers these questions by providing end-users with cutting-edge methods stemming from each of them.
APA, Harvard, Vancouver, ISO, and other styles
37

Pino, Coll Cristián Eduardo. "Integrated surface-subsurface hydrologic modeling to quantify groundwater recharge due to an extreme flooding event in the Atacama Desert." Tesis, Universidad de Chile, 2018. http://repositorio.uchile.cl/handle/2250/168077.

Full text
Abstract:
Master of Science in Engineering, specialization in Water Resources and Environment (Magíster en Ciencias de la Ingeniería, Mención Recursos y Medio Ambiente Hídrico). Civil Engineer.
In arid regions, groundwater is the main source of water for a variety of uses. Since the main recharge of alluvial aquifers originates during sporadic flooding events that may be separated by prolonged dry periods with no recharge at all, it is important to understand the interaction between surface water and groundwater during and after such events in order to assess recharge rates and to manage and plan the use of the limited water resources optimally. Infiltration is controlled in time and space by fluctuations in the flow depth, which, together with the retention properties and moisture condition of the soil, determine the hydraulic gradient at the water-sediment interface and the volume available for recharge. Moreover, in arid zones the unsaturated zone is typically thick, so a long time may elapse between infiltration and aquifer recharge. At present, in-situ measurement of these factors and the spatio-temporal estimation of recharge remain a challenge. This motivates the application of a fully coupled, physically based surface-subsurface flow model to study the recharge mechanisms during and after an extreme flooding event recorded in an alluvial valley in northern Chile. The model, calibrated to reproduce the general trend of the observed groundwater levels and surface flow, makes it possible to investigate the magnitude and the temporal and spatial distribution of recharge in the system, which was estimated at 41% of the flood volume. By considering different surface and subsurface parameters and different initial saturation conditions, the variables and mechanisms that control recharge from extreme flooding events in the valley are identified, and the discussion is extended to improvements in the application of this type of model to similar hydrological systems, which are common to other arid regions of the world.
APA, Harvard, Vancouver, ISO, and other styles
38

Demarta, Stefano. "The copula approach to modelling multivariate extreme values : theory and examples with financial applications in view /." Zürich : ETH, 2007. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=17307.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Debbabi, Nehla. "Approche algébrique et théorie des valeurs extrêmes pour la détection de ruptures : Application aux signaux biomédicaux." Thesis, Reims, 2015. http://www.theses.fr/2015REIMS025/document.

Full text
Abstract:
This work develops unsupervised techniques for on-line detection and location of change-points in noisy recorded signals. These techniques are based on combining an algebraic approach with Extreme Value Theory (EVT). The algebraic approach offers an easy identification of the change-points, characterizing them in terms of delayed Dirac distributions and their derivatives, which are easily handled via operational calculus. This algebraic characterization, which gives rise to an explicit expression for the change-point locations, is complemented by a probabilistic interpretation in terms of extremes: a change-point is seen as a rare and extreme event. Based on EVT, these events are modeled by a Generalized Pareto Distribution. Several hybrid multi-component models are proposed in this work, modeling at the same time the mean behavior (noise) and the extreme behavior (change-points) of the signal after algebraic processing. Unsupervised algorithms are proposed to estimate these hybrid models, avoiding the problems encountered with classical estimation methods, which are ad hoc and graphical. The change-point detection algorithms developed in this thesis are validated on generated data and then applied to real data stemming from different phenomena, where the information to be extracted appears as change-points.
APA, Harvard, Vancouver, ISO, and other styles
40

Castellà, Sánchez Mercè. "Statistical modelling and analysis of summer very hot events in mainland Spain." Doctoral thesis, Universitat Rovira i Virgili, 2014. http://hdl.handle.net/10803/145723.

Full text
Abstract:
Extreme temperature events are of particular importance due to their severe impact on the environment, the economy and society. Focused on the uppermost percentiles of summer daily maximum (Tx) and minimum (Tn) temperatures, this thesis carries out a modelling and analysis of summer very hot days (VHD) and nights (VHN) over mainland Spain by applying the Point Process Approach based on Extreme Value Theory. It has been investigated whether large-scale variables of Sea Level Pressure, Sea Surface Temperature and soil moisture are associated with the occurrence and intensity of these exceptional events. Furthermore, the observed changes and trends in the extreme distributions of Tx and Tn have been analysed and different return levels have been estimated. The link between the occurrence and intensity of VHD and VHN and the large-scale anomalies has been demonstrated. Changes in extreme temperatures are generalized, but not homogeneous in space and time.
APA, Harvard, Vancouver, ISO, and other styles
41

Olausson, Katrin. "On Evaluation and Modelling of Human Exposure to Vibration and Shock on Planing High-Speed Craft." Licentiate thesis, KTH, Marina system, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-159168.

Full text
Abstract:
High speed in waves, necessary in, for instance, rescue or military operations, often results in severe loading on both the craft and the crew. To maximize the performance of the high-speed craft (HSC) system that the craft and crew constitute, balance between these loads is essential. There should be no overload or underuse of crew, craft or equipment. For small high-speed craft systems, man is often the weakest link. Human exposure to vibration and shock results in injuries and other adverse health effects, which increase the risk of unsafe operations and of performance degradation of the crew and craft system. To achieve a system in balance, the human acceleration exposure must be considered early in ship design. It must also be considered in duty planning and in the design and selection of vibration mitigation systems. The thesis presents a simulation-based method for prediction and evaluation of the acceleration exposure of the crew on small HSC. A numerical seat model, validated with experimental full-scale data, is used to determine the crew's acceleration exposure. The input to the model is the boat acceleration expressed in the time domain (simulated or measured), the total mass of the seated human, and seat-specific parameters such as mass, spring stiffness and damping coefficients and the seat's longitudinal position in the craft. The model generates seat response time series that are evaluated using available methods for evaluation of whole-body vibration (ISO 2631-1 & ISO 2631-5) and statistical methods for calculation of extreme values. The presented simulation scheme enables evaluation of human exposure to vibration and shock at an early stage in the design process. It can also be used as a tool in duty planning, requirements specification or for the design of appropriate vibration mitigation systems. Further studies are proposed within three areas: investigation of the actual operational profiles of HSC, further development of seat models and investigation of the prevailing injuries and health problems among the crew of HSC.
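Although the thesis's validated seat model is not reproduced here, the basic idea of a lumped seat model can be illustrated by a single-degree-of-freedom mass-spring-damper between hull and occupant, driven by a boat acceleration time series. All parameter values and the pulse-like input below are illustrative assumptions.

```python
# Minimal single-degree-of-freedom seat model sketch: a mass-spring-damper
# between the hull and the seated human, driven by a boat acceleration series.
import numpy as np

def seat_response(boat_acc, dt, mass=100.0, stiffness=4.0e4, damping=1.5e3):
    """Return the absolute acceleration of the seated human over time."""
    r, v = 0.0, 0.0                      # displacement/velocity relative to the hull
    seat_acc = np.empty_like(boat_acc)
    for i, ab in enumerate(boat_acc):
        a_rel = (-damping * v - stiffness * r) / mass - ab   # relative acceleration
        v += a_rel * dt                  # semi-implicit Euler step
        r += v * dt
        seat_acc[i] = a_rel + ab         # absolute seat acceleration
    return seat_acc

if __name__ == "__main__":
    dt = 0.001
    t = np.arange(0.0, 10.0, dt)
    boat_acc = 30.0 * np.exp(-((t - 5.0) / 0.05) ** 2)   # a single slam-like pulse
    print(seat_response(boat_acc, dt).max())
```

The resulting seat acceleration series could then be summarised with the whole-body vibration measures and extreme value statistics mentioned in the abstract.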

APA, Harvard, Vancouver, ISO, and other styles
42

Beyene, Mussie Abraham. "Modelling the Resilience of Offshore Renewable Energy System Using Non-constant Failure Rates." Thesis, Uppsala universitet, Institutionen för elektroteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-445650.

Full text
Abstract:
Offshore renewable energy systems, such as wave energy converters or offshore wind turbines, must be designed to withstand extremes of the weather environment. For this, it is crucial to have a good understanding both of the wave and wind climate at the intended offshore site and of the system's reaction and possible failures under different weather scenarios. Based on these considerations, the first objective of this thesis was to model and identify the extreme wind speed and significant wave height at an offshore site, based on measured wave and wind data. The extreme wind speeds and wave heights were characterized as return values after 10, 25, 50, and 100 years, using the Generalized Extreme Value method. Based on a literature review, fragility curves for wave and wind energy systems were identified as functions of significant wave height and wind speed. For a wave energy system, a varying failure rate as a function of the wave height was obtained from the fragility curves and used to model the resilience of a wave energy farm as a function of the wave climate. The cases of non-constant and constant failure rates were compared, and it was found that the non-constant failure rate had a high impact on the wave energy farm's resilience. When a non-constant failure rate as a function of wave height was applied to the wave energy farm, the number of wave energy converters available in the farm and the energy absorbed by the farm were nearly zero. The non-constant failure rate was also compared with a constant failure rate obtained by averaging the instantaneous non-constant failure rate over wave height, and it was found that the farm shows better resilience under the averaged constant failure rate. Based on the findings of this thesis, it is therefore recommended to identify and characterize the extreme offshore weather climate, to maintain a high repair rate, and to use repair vessels with a high operational threshold so that they can withstand the harsh offshore weather environment.
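The 10-, 25-, 50- and 100-year return values mentioned above follow from a GEV fit to annual maxima: the T-year return level is the level exceeded with probability 1/T in any one year. A hedged sketch with simulated annual maxima (not the thesis data) is:

```python
# Sketch of N-year return values of significant wave height (or wind speed)
# from a GEV fit to annual maxima using scipy; data here are simulated.
import numpy as np
from scipy.stats import genextreme

def return_levels(annual_maxima, periods=(10, 25, 50, 100)):
    c, loc, scale = genextreme.fit(annual_maxima)
    # the T-year return level is exceeded with probability 1/T in any one year
    return {T: genextreme.isf(1.0 / T, c, loc=loc, scale=scale) for T in periods}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hs_annual_max = 6.0 + 1.2 * rng.gumbel(size=40)   # 40 years of toy annual maxima
    print(return_levels(hs_annual_max))
```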
APA, Harvard, Vancouver, ISO, and other styles
43

Said, Khalil. "Mesures de risque multivariées et applications en science actuarielle." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSE1245.

Full text
Abstract:
The entry into force on January 1st, 2016 of Solvency 2, the European regulatory reform of the insurance industry, is a historic event that will radically change risk management practices. It is based on taking the insurer's own risk profile and internal view of risk into account, through the ability to use internal models for calculating the solvency capital requirement and the ORSA (Own Risk and Solvency Assessment) approach for internal risk management. This makes mathematical modeling an essential tool for a successful regulatory exercise. Risk theory must be able to support this development by providing answers to practical problems, especially those related to dependence modeling and the choice of risk measures. In this context, this thesis presents a contribution to improving the management of insurance risks. In four chapters we present multivariate risk measures and their application to the allocation of solvency capital. The first part of this thesis is devoted to the introduction and study of a new family of multivariate elicitable risk measures that we call multivariate expectiles. The first chapter presents these measures and explains the different construction approaches. The multivariate expectiles satisfy a set of coherence properties that we also discuss in this chapter, before proposing a stochastic approximation tool for these risk measures. The performance of this method is insufficient near the asymptotic levels of the expectile thresholds, which makes a theoretical analysis of the asymptotic behavior necessary. The asymptotic behavior of multivariate expectiles is therefore the subject of the second chapter of this part. It is studied in a multivariate regular variation framework, and some results are given in the case of equivalent marginal tails. In the second chapter of the first part we also study the asymptotic behavior of multivariate expectiles under the previous assumptions in the presence of perfect dependence, or in the case of asymptotic independence, and, using extreme value statistics, we propose estimators of the asymptotic expectile in these cases. The second part of the thesis is focused on the issue of solvency capital allocation in insurance. It is divided into two chapters, each consisting of a published paper. The first presents an axiomatic characterization of the coherence of a capital allocation method in a general framework, and then studies the coherence properties of an allocation approach based on the minimization of some multivariate risk indicators. The second paper is a probabilistic analysis of the behavior of this capital allocation method depending on the nature of the marginal distributions of the risks and the dependence structure. The asymptotic behavior of the optimal allocation is also studied, and the impact of dependence is illustrated using selected models and copulas. Faced with the significant dependence between the various risks taken by insurance companies, a multivariate approach seems more appropriate for addressing the various issues of risk management. This thesis is based on a multidimensional vision of risk and proposes multivariate risk measures that can be applied to several actuarial issues of a multivariate nature.
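For intuition, the univariate expectile behind the multivariate notion studied here can be computed by asymmetric least squares: the tau-expectile is the fixed point of an asymmetrically weighted mean. The sketch below is an illustration on simulated losses, not the thesis's multivariate construction or its stochastic approximation algorithm.

```python
# Sketch of a sample expectile computed by asymmetric least squares
# (iteratively reweighted means); illustrative only.
import numpy as np

def expectile(x, tau=0.99, tol=1e-10, max_iter=200):
    e = np.mean(x)
    for _ in range(max_iter):
        w = np.where(x > e, tau, 1.0 - tau)   # asymmetric weights
        e_new = np.sum(w * x) / np.sum(w)     # weighted-mean fixed-point update
        if abs(e_new - e) < tol:
            return e_new
        e = e_new
    return e

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    losses = rng.lognormal(mean=0.0, sigma=1.0, size=100000)
    print(expectile(losses, tau=0.99))
```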
APA, Harvard, Vancouver, ISO, and other styles
44

Nguyen, Van Minh. "Wireless Link Quality Modelling and Mobility Management for Cellular Networks." Phd thesis, Telecom ParisTech, 2011. http://tel.archives-ouvertes.fr/tel-00702798.

Full text
Abstract:
The quality of communication in a wireless network is determined by signal quality, more precisely by the signal-to-interference-plus-noise ratio. This drives each receiver to connect to the transmitter that offers it the best signal quality. We use stochastic geometry and extreme value theory to obtain the distribution of the best signal quality, as well as those of the interference and of the maximum received power. We highlight how the singularity of the path-loss function modifies their behaviour. We then turn to the temporal behaviour of radio signals by studying the threshold crossings of a stationary process X(t). We show that the time interval that X(t) spends above a threshold γ → −∞ follows an exponential distribution, and we also obtain results characterising the crossings by X(t) of several adjacent thresholds. These results are then applied to mobility management in cellular networks. Our work focuses on the handover-measurement function, which identifies the best neighbouring cell during a handover. This function plays a central role in the quality of experience perceived by the user, but it requires cooperation between various control mechanisms and remains a difficult problem. We address it by proposing analytical approaches for emerging macro- and pico-cellular networks, as well as a self-optimisation approach for the neighbour-cell lists used in current cellular networks.
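As a hedged, simplified illustration of the kind of quantity studied here, and not the author's actual model, the sketch below simulates received powers from a Poisson field of transmitters with a singular path-loss function and Rayleigh fading, and looks at the empirical distribution of the best signal-to-interference ratio; every parameter value is invented.

import numpy as np

rng = np.random.default_rng(1)

def best_sir_sample(intensity=1e-5, radius=2000.0, alpha=3.5):
    """One realisation of the best signal-to-interference ratio (SIR).

    Transmitters form a Poisson point process of the given intensity
    (points per m^2) on a disc of the given radius around the receiver.
    Received power = Rayleigh fading / distance**alpha (singular path loss).
    """
    n = rng.poisson(intensity * np.pi * radius**2)
    if n < 2:
        return np.nan
    # Uniform points on the disc: take the square root for the radial coordinate.
    r = radius * np.sqrt(rng.uniform(size=n))
    fading = rng.exponential(size=n)       # Rayleigh amplitude -> exponential power
    power = fading / r**alpha
    best = power.max()
    interference = power.sum() - best      # everything except the serving signal
    return best / interference

samples = np.array([best_sir_sample() for _ in range(2000)])
samples = samples[~np.isnan(samples)]
print("median best SIR:", np.median(samples),
      "95% quantile:", np.quantile(samples, 0.95))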
APA, Harvard, Vancouver, ISO, and other styles
45

Kuo, Mei-yu, and 郭美榆. "Modeling Brand Shares with A Generalized Extreme Value Model." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/16263317446644276698.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Mashishi, Daniel. "Modeling average monthly rainfall for South Africa using extreme value theory." Thesis, 2020. http://hdl.handle.net/10386/3399.

Full text
Abstract:
Thesis (M. Sc. (Statistics)) -- University of Limpopo, 2020
The main purpose of modelling rare events such as heavy rainfall, heat waves, wind speed and interest rates is to mitigate the risk that might arise from these events. Heavy rainfall and floods still trouble many countries; almost every incident of heavy rainfall or flooding may result in loss of life, damage to infrastructure and roads, and financial losses. In this dissertation, the interest was in modelling average monthly rainfall for South Africa using extreme value theory (EVT). EVT consists mainly of two approaches: block maxima and peaks-over-threshold (POT), which lead to the generalised extreme value distribution (GEVD) and the generalised Pareto distribution (GPD), respectively. The unknown parameters of these distributions were estimated by maximum likelihood. According to goodness-of-fit tests, distributions in the Weibull and Gumbel domains of attraction and the generalised Pareto distribution were appropriate for modelling average monthly rainfall for South Africa. When modelling with the POT approach, the point process model suggested that some areas within South Africa might experience high rainfall in the coming years, whereas the GPD model suggested otherwise. The block maxima approach using the GEVD and the GEVD for r largest order statistics revealed findings similar to those of the GPD. The study recommends that future research on average monthly rainfall for South Africa consider Bayesian approaches and multivariate extremes, and, for the POT approach, time-varying covariates and thresholds.
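As a hedged sketch of the two EVT approaches described in the abstract (block maxima with a GEV fit and peaks-over-threshold with a GPD fit), the following Python fragment uses maximum likelihood via scipy.stats on a simulated rainfall series; the data, block structure and threshold choice are invented for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Invented stand-in for a monthly rainfall series (mm), 50 years of data.
rainfall = rng.gamma(shape=2.0, scale=40.0, size=50 * 12)

# --- Block maxima: annual maxima, fitted with the GEV distribution (MLE).
annual_max = rainfall.reshape(50, 12).max(axis=1)
# Note: scipy's genextreme shape c equals minus the usual GEV shape xi.
shape_gev, loc_gev, scale_gev = stats.genextreme.fit(annual_max)
rl_100 = stats.genextreme.ppf(1 - 1 / 100, shape_gev, loc_gev, scale_gev)
print("GEV 100-year return level (mm):", round(rl_100, 1))

# --- Peaks over threshold: exceedances above a high quantile, fitted with the GPD.
u = np.quantile(rainfall, 0.95)
excess = rainfall[rainfall > u] - u
shape_gpd, _, scale_gpd = stats.genpareto.fit(excess, floc=0.0)
print("GPD shape (xi):", round(shape_gpd, 3), "scale:", round(scale_gpd, 1))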
National Research Foundation (NRF) and South African Weather Service (SAWS)
APA, Harvard, Vancouver, ISO, and other styles
47

Anderson and 李坤峰. "Modeling Extreme Risk in Stock Markets:The Influence of Data Dependence and Choice of Optimal Threshold Level on Extreme Value Models." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/94340679393918205019.

Full text
Abstract:
Master's thesis
國立高雄應用科技大學
金融資訊研究所
95
Value at Risk (VaR) has become a widespread risk-management tool. It measures the worst loss on an asset over a given holding period at a particular confidence level; statistically, it is a quantile of the tail of the distribution of financial returns. The empirical literature shows that most financial data exhibit fat tails and volatility clustering, so estimating VaR by conventional methods may underestimate the quantile. We estimate VaR in stock markets using extreme value theory combined with time series models, and compare the performance of conditional VaR with unconditional VaR. In addition, we compare three ways of choosing the optimal threshold level: past experience, the method of Hall, and the method proposed by Danielsson. We then backtest the VaR estimators and evaluate their efficiency with the LR statistic. The backtests cover eleven stock indexes: the Dow Jones Industrial Average, Nasdaq, S&P 500, Nikkei 225, Hang Seng Index, A-Share, SSE A-Share, FTSE 100, CAC 40, DAX and KOSPI. Our findings reveal that conditional VaR fitted with an ARMA(p,q)-GARCH(1,1) model performs better than unconditional VaR, and that extreme value theory performs better than traditional methods. Furthermore, the conditional VaR models whose threshold levels are chosen by the methods of Hall (1990) and Danielsson (1997) indeed improve on the model whose threshold level is chosen by past experience.
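The abstract evaluates VaR estimators with an LR statistic in backtesting; as a hedged illustration rather than the authors' exact procedure, the sketch below computes Kupiec's unconditional-coverage likelihood-ratio statistic from a series of VaR violations, with invented inputs.

import numpy as np
from scipy import stats

def kupiec_lr(violations, p):
    """Kupiec unconditional-coverage LR statistic.

    violations : boolean array, True where the realised loss exceeded VaR.
    p          : nominal violation probability (e.g. 0.01 for 99% VaR).
    Under correct coverage the statistic is asymptotically chi-square(1).
    """
    v = np.asarray(violations, dtype=bool)
    n, x = v.size, int(v.sum())
    pi_hat = x / n
    # Binomial log-likelihood evaluated at a violation probability q.
    def loglik(q):
        q = min(max(q, 1e-12), 1 - 1e-12)   # guard against log(0)
        return x * np.log(q) + (n - x) * np.log(1 - q)
    lr = -2.0 * (loglik(p) - loglik(pi_hat))
    p_value = 1.0 - stats.chi2.cdf(lr, df=1)
    return lr, p_value

# Example: 500 trading days, 9 violations against 99% VaR.
viol = np.zeros(500, dtype=bool)
viol[:9] = True
print(kupiec_lr(viol, 0.01))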
APA, Harvard, Vancouver, ISO, and other styles
48

Tsai, Pei-Chen, and 蔡倍禎. "A GARCH-Extreme Value Theory Approach for Modeling Operational Risk Evidence from Taiwan Commercial Bank." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/40226989103920675150.

Full text
Abstract:
Master's thesis
淡江大學
財務金融學系碩士班
100
Because of the rapid development of the financial environment, the importance of risk management in banking has increased. This article takes several commercial banks in Taiwan over the period 1995 to 2009 as its example and analyses the tail characteristics of operational risk loss events. It measures the fat-tailed losses with the peaks-over-threshold (POT) model of EVT, and further uses a GARCH-EVT model to capture the time-varying features of the data, giving a better understanding of the characteristics of operational risk.
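As a hedged sketch of the POT approach mentioned above (not the authors' calibration), the fragment below fits a GPD to loss exceedances and evaluates the standard tail quantile VaR_q = u + (beta/xi) * [((n/N_u)(1 - q))^(-xi) - 1]; the loss sample and threshold are invented.

import numpy as np
from scipy import stats

def gpd_var(losses, u, q):
    """VaR at confidence level q from a GPD fitted to exceedances over u.

    Uses the standard POT tail estimator, with the xi = 0 (exponential)
    limit handled separately.
    """
    losses = np.asarray(losses, dtype=float)
    excess = losses[losses > u] - u
    n, n_u = losses.size, excess.size
    xi, _, beta = stats.genpareto.fit(excess, floc=0.0)
    tail_prob = (n / n_u) * (1.0 - q)
    if abs(xi) < 1e-8:                      # exponential tail limit
        return u - beta * np.log(tail_prob)
    return u + (beta / xi) * (tail_prob ** (-xi) - 1.0)

# Invented heavy-tailed operational-loss sample.
rng = np.random.default_rng(7)
sample = rng.pareto(2.5, size=10_000) * 1e4
print("99.9% VaR estimate:",
      round(gpd_var(sample, np.quantile(sample, 0.95), 0.999), 0))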
APA, Harvard, Vancouver, ISO, and other styles
49

Chou, Yu-Hsiang, and 周愉翔. "Modeling Extreme Risk in Foreign Exchange Market─The Influence of Data Dependence and Choice of Optimal Threshold Level on Extreme Value Models." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/53696018443370064606.

Full text
Abstract:
Master's thesis
國立屏東商業技術學院
國際企業所
95
Foreign exchange has become increasingly important with the trend of internationalization, and understanding the extreme behaviour of exchange rates helps manage exchange rate risk. This thesis therefore investigates the extreme behaviour of the exchange rates of the G10 members by applying Extreme Value Theory (EVT) to the tail of the distribution of daily exchange rate returns. We compare EVT models by evaluating the forecasting performance of their VaR estimates, and we also investigate the influence of data dependence and of the choice of optimal threshold level on extreme value models. The empirical results show that, compared with the normal distribution, daily exchange rate returns are more fat-tailed and asymmetric, which indicates that the normality assumption leads to underestimation of VaR. In backtesting, the conditional EVT models outperform the others, implying that the dependence and conditional heteroscedasticity of the time series should be accounted for when applying EVT. On the other hand, EVT models are not much affected when the threshold level changes. We find that parametric models generally outperform the non-parametric model, especially the GPD, which has the best and most comprehensive performance under both the unconditional and the conditional approach. Moreover, the results show that the GARCH model is adequate for forecasting VaR at lower confidence levels (e.g. 95%), whereas at higher confidence levels (e.g. 99.5%, 99.9%) EVT models provide more reliable VaR forecasts. We conclude that the application of EVT models in risk management is essential.
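The conditional EVT approach described in this abstract is usually implemented as a two-step GARCH-EVT procedure: filter the returns with a GARCH model, then fit a GPD to the standardised residuals and scale the resulting quantile by the forecast volatility. The sketch below is a hedged, self-contained version of that idea with a hand-rolled GARCH(1,1) quasi-MLE and invented data; it is not the thesis' exact specification.

import numpy as np
from scipy import optimize, stats

def fit_garch11(r):
    """Gaussian quasi-MLE for a zero-mean GARCH(1,1):
    sigma2_t = w + a * r_{t-1}**2 + b * sigma2_{t-1}."""
    def neg_loglik(params):
        w, a, b = params
        if w <= 0 or a < 0 or b < 0 or a + b >= 1:
            return np.inf
        sigma2 = np.empty_like(r)
        sigma2[0] = r.var()
        for t in range(1, r.size):
            sigma2[t] = w + a * r[t - 1] ** 2 + b * sigma2[t - 1]
        return 0.5 * np.sum(np.log(sigma2) + r ** 2 / sigma2)
    res = optimize.minimize(neg_loglik, x0=[0.1 * r.var(), 0.05, 0.90],
                            method="Nelder-Mead")
    return res.x

def conditional_evt_var(r, q=0.99, tail_frac=0.10):
    """Two-step GARCH-EVT VaR: GARCH filter, then GPD on standardised residuals."""
    w, a, b = fit_garch11(r)
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, r.size):
        sigma2[t] = w + a * r[t - 1] ** 2 + b * sigma2[t - 1]
    z = r / np.sqrt(sigma2)                    # standardised residuals
    losses = -z                                # work with the loss tail
    u = np.quantile(losses, 1 - tail_frac)
    xi, _, beta = stats.genpareto.fit(losses[losses > u] - u, floc=0.0)
    # GPD tail quantile of the standardised loss (assumes a nonzero shape xi).
    z_q = u + (beta / xi) * ((tail_frac / (1 - q)) ** xi - 1)
    sigma_next = np.sqrt(w + a * r[-1] ** 2 + b * sigma2[-1])   # one-step-ahead volatility
    return sigma_next * z_q                    # conditional VaR for the next return

rng = np.random.default_rng(3)
returns = rng.standard_t(df=5, size=3000) * 0.01
print("99% one-day conditional VaR:", conditional_evt_var(returns))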
APA, Harvard, Vancouver, ISO, and other styles
50

Masingi, Vusi Ntiyiso. "Modeling long-term monthly rainfall variability in selected provinces of South Africa using extreme value distributions." Thesis, 2021. http://hdl.handle.net/10386/3457.

Full text
Abstract:
Thesis (M.Sc. (Statistics)) -- University of Limpopo, 2020
Several studies have indicated a growing trend in the frequency and severity of extreme events. Extreme rainfall could cause disasters that lead to loss of property and life. The aim of the study was to model monthly rainfall variability in selected provinces of South Africa using extreme value distributions. This study investigated the best-fit probability distributions in five provinces of South Africa. Five probability distributions, namely gamma, Gumbel, log-normal, Pareto and Weibull, were fitted, and the best was selected for each province. Parameters of these distributions were estimated by the method of maximum likelihood. Based on the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), the Weibull distribution was found to be the best-fit probability distribution for Eastern Cape, KwaZulu-Natal, Limpopo and Mpumalanga, while in Gauteng the best-fit probability distribution was the gamma distribution. Monthly rainfall trends detected using the Mann–Kendall test revealed significant monotonic decreasing long-term trends for Eastern Cape, Gauteng and KwaZulu-Natal, and insignificant monotonic decreasing long-term trends for Limpopo and Mpumalanga. Non-stationary generalised extreme value distribution (GEVD) and non-stationary generalised Pareto distribution (GPD) models were applied to the monthly rainfall data. The deviance statistic and likelihood ratio test (LRT) were used to select the most appropriate model. Model fitting supported the stationary GEVD model for Eastern Cape, Gauteng and KwaZulu-Natal. On the other hand, model fitting supported non-stationary GEVD models for maximum monthly rainfall with a nonlinear quadratic trend in the location parameter and a linear trend in the scale parameter for Limpopo, while in Mpumalanga the non-stationary GEVD model with a nonlinear quadratic trend in the scale parameter and no variation in the location parameter fitted the maximum monthly rainfall data well. Results from the non-stationary GPD models showed that inclusion of the time covariate was not significant for Eastern Cape, hence the best-fit model was the stationary GPD model. Furthermore, the non-stationary GPD model with a linear trend in the scale parameter provided the best fit for KwaZulu-Natal and Mpumalanga, while in Gauteng and Limpopo the non-stationary GPD model with a nonlinear quadratic trend in the scale parameter fitted the monthly rainfall data well. Lastly, a GPD with a time-varying threshold was applied to model monthly rainfall excesses, where a penalised regression cubic smoothing spline was used as the time-varying threshold and the GPD model was fitted to cluster maxima. The estimate of the shape parameter showed that the Weibull family of distributions is appropriate for modelling the upper tail of the distribution for Limpopo and Mpumalanga, while for Eastern Cape, Gauteng and KwaZulu-Natal the exponential family of distributions was found appropriate. The dissertation contributes positively to the body of knowledge on applications of extreme value theory to rainfall data and makes recommendations to government agencies on long-term rainfall variability and its negative impact on the economy.
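As a hedged sketch of a non-stationary GEVD with a trend in the location parameter (one of the model forms discussed above, not the author's exact specification), the fragment below fits such a model by direct maximum likelihood on invented annual maxima.

import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(11)

# Invented annual maxima with a weak upward trend in the location.
years = np.arange(60)
t = (years - years.mean()) / years.std()          # standardised time covariate
maxima = stats.genextreme.rvs(c=-0.1, loc=80 + 2.0 * t, scale=15,
                              size=years.size, random_state=rng)

def neg_loglik(params):
    """Negative log-likelihood of a GEV with location mu0 + mu1 * t."""
    mu0, mu1, log_scale, shape = params
    # scipy's genextreme uses c = -xi relative to the usual GEV shape convention.
    return -np.sum(stats.genextreme.logpdf(maxima, c=-shape,
                                           loc=mu0 + mu1 * t,
                                           scale=np.exp(log_scale)))

start = np.array([maxima.mean(), 0.0, np.log(maxima.std()), 0.1])
fit = optimize.minimize(neg_loglik, start, method="Nelder-Mead")
mu0, mu1, log_scale, shape = fit.x
print("trend in location (per sd of time):", round(mu1, 2),
      "shape xi:", round(shape, 3))
# A likelihood-ratio test against the stationary model (mu1 = 0) would compare
# the minimised negative log-likelihoods, as with the deviance/LRT in the abstract.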
APA, Harvard, Vancouver, ISO, and other styles
