
Dissertations / Theses on the topic 'Statistic risk'

1

Pouliot, William. "Two applications of U-Statistic type processes to detecting failures in risk models and structural breaks in linear regression models." Thesis, City University London, 2010. http://openaccess.city.ac.uk/1166/.

Full text
Abstract:
This dissertation is concerned with detecting failures in risk models and structural breaks in linear regression models. By applying Theorem 2.1 of Szyszkowicz on U-statistic type processes, a number of weak convergence results for three weighted partial-sum processes are established. These partial-sum processes are shown to share certain invariance properties: estimation risk does not affect their weak convergence results, and they are robust to asymmetries in the error process in linear regression models. The methods developed in Chapter 3 are also applied to a four-factor Capital Asset Pricing Model, where it is shown that managers' stock-selection abilities vary over time.
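The partial-sum idea behind such break-detection procedures can be illustrated with a plain (unweighted) CUSUM statistic for a shift in mean. This is a toy sketch of the general technique, not the thesis's actual weighted U-statistic construction:

```python
import numpy as np

def cusum_break_point(x):
    """Locate the most likely break in mean via the partial-sum process
    T_k = |S_k - (k/n) S_n| / (sigma * sqrt(n)): a simple analogue of the
    weighted partial-sum processes studied in structural-break testing."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.cumsum(x)
    k = np.arange(1, n + 1)
    sigma = x.std(ddof=1)
    t = np.abs(s - k / n * s[-1]) / (sigma * np.sqrt(n))
    return int(np.argmax(t)) + 1, float(t.max())  # 1-based index of suspected break

# series with a mean shift after observation 60
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])
k_hat, stat = cusum_break_point(x)
```

The statistic peaks near the true break; in practice the maximum is compared against the quantiles of the limiting process established by weak convergence results such as those above.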
2

Misák, Petr. "Možnosti řízení a minimalizace rizik technologie výroby stavebních materiálů a výrobků pomocí fuzzy logiky a dalších nástrojů risk managementu." Doctoral thesis, Vysoké učení technické v Brně. Fakulta stavební, 2014. http://www.nusl.cz/ntk/nusl-233814.

Full text
Abstract:
The thesis proposes options for managing and minimizing risk in the production technology of building materials and related products, using fuzzy logic and other risk management tools. It indicates why some methodologies are not commonly used. The main purpose of the thesis is to propose possible upgrades of standard methods in process capability assessment and risk minimization related to building materials and products. Markov analysis and fuzzy Markov chains are applied.
3

Follestad, Turid. "Stochastic Modelling and Simulation Based Inference of Fish Population Dynamics and Spatial Variation in Disease Risk." Doctoral thesis, Norwegian University of Science and Technology, Faculty of Information Technology, Mathematics and Electrical Engineering, 2003. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-41.

Full text
Abstract:

We present a non-Gaussian and non-linear state-space model for the population dynamics of cod along the Norwegian Skagerak coast, embedded in the framework of a Bayesian hierarchical model. The model takes into account both process error, representing natural variability in the dynamics of a population, and observational error, reflecting the sampling process relating the observed data to true abundances. The data set on which our study is based consists of samples of two juvenile age-groups of cod taken by beach seine hauls at a set of sample stations within several fjords along the coast. The age-structured population dynamics model, constituting the prior of the Bayesian model, is specified in terms of the recruitment process and the processes of survival for these two juvenile age-groups and the mature population, for which we have no data. The population dynamics is specified on abundances at the fjord level, and an explicit down-scaling from the fjord level to the level of the monitored stations is included in the likelihood, modelling the sampling process relating the observed counts to the underlying fjord abundances.

We take a sampling-based approach to parameter estimation using Markov chain Monte Carlo methods. The properties of the model, in terms of mixing and convergence of the MCMC algorithm, are explored empirically on the basis of a simulated data set, and we show how the mixing properties can be improved by re-parameterisation. Estimation of the model parameters, rather than the abundances, is the primary aim of the study, and we also propose an alternative approach to estimating the model parameters based on the marginal posterior distribution, integrating over the abundances.

Based on the estimated model we illustrate how we can simulate the release of juvenile cod, imitating an experiment conducted in the early 20th century to resolve a controversy between a fisherman and a scientist who could not agree on the effect of releasing cod larvae on the mature abundance of cod. This controversy initiated the monitoring programme generating the data used in our study.

4

Eliasson, Hampus. "Values at Risk." Thesis, Uppsala universitet, Statistiska institutionen, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-347408.

Full text
5

Shen, Hanyang, Bizu Gelaye, Hailiang Huang, Marta B. Rondon, Sixto Sanchez, and Laramie E. Duncan. "Polygenic prediction and GWAS of depression, PTSD, and suicidal ideation/self-harm in a Peruvian cohort." Springer Nature, 2020. http://hdl.handle.net/10757/652459.

Full text
Abstract:
Genome-wide approaches including polygenic risk scores (PRSs) are now widely used in medical research; however, few studies have been conducted in low- and middle-income countries (LMICs), especially in South America. This study was designed to test the transferability of psychiatric PRSs to individuals with different ancestral and cultural backgrounds and to provide genome-wide association study (GWAS) results for psychiatric outcomes in this sample. The PrOMIS cohort (N = 3308) was recruited from prenatal care clinics at the Instituto Nacional Materno Perinatal (INMP) in Lima, Peru. Three major psychiatric outcomes (depression, PTSD, and suicidal ideation and/or self-harm) were scored by interviewers using validated Spanish questionnaires. The Illumina Multi-Ethnic Global chip was used for genotyping. Standard procedures for PRSs and GWAS were used, along with extra steps to rule out confounding due to ancestry. Depression PRSs significantly predicted depression, PTSD, and suicidal ideation/self-harm and explained up to 0.6% of phenotypic variation (minimum p = 3.9 × 10⁻⁶). The associations were robust to sensitivity analyses using more homogeneous subgroups of participants and alternative choices of principal components. Successful polygenic prediction of three psychiatric phenotypes in this Peruvian cohort suggests that genetic influences on depression, PTSD, and suicidal ideation/self-harm are at least partially shared across global populations. These PRS and GWAS results from this large Peruvian cohort advance genetic research (and the potential for improved treatments) for diverse global populations.
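At its core, a polygenic risk score is a weighted sum of risk-allele counts, with weights taken from a discovery GWAS. A minimal sketch with made-up effect sizes (not the study's actual variants or weights):

```python
import numpy as np

# genotype matrix: rows = individuals, entries = risk-allele counts (0, 1, 2)
genotypes = np.array([[0, 1, 2],
                      [2, 2, 1],
                      [1, 0, 0]])
# per-variant effect sizes (e.g. log odds ratios) from a discovery GWAS -- illustrative only
betas = np.array([0.10, -0.05, 0.20])

# PRS_i = sum_j beta_j * g_ij, then standardised before regressing phenotype on it
prs = genotypes @ betas
prs_z = (prs - prs.mean()) / prs.std(ddof=0)
```

In practice the score is computed over many thousands of clumped and thresholded variants, and ancestry principal components are included as covariates, as the abstract describes.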
National Institutes of Health
Peer reviewed
6

Agering, Harald. "True risk of illiquid investments." Thesis, KTH, Matematik (Inst.), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-233577.

Full text
Abstract:
Alternative assets are becoming a considerable portion of global financial markets. Some of these alternative assets are highly illiquid, and as such they may require more intricate methods for calculating risk and performance statistics accurately. Research on hedge funds has established a pattern of risk being understated and various measures of performance being overstated due to illiquidity of the assets. This paper sets out to prove the existence of such bias and presents methods for removing it. Four mathematical methods aiming to adjust statistics for sparse return series were considered, and an implementation was carried out for data on private equity, real estate and infrastructure assets. The results indicate that there are in general substantial adjustments made to the risk and performance statistics of the illiquid assets when using these methods. In particular, the volatility and market exposure were adjusted upwards while manager skill and risk-adjusted performance were adjusted downwards.
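One classic way to correct understated risk in sparse, appraisal-smoothed return series is AR(1) unsmoothing in the style of Geltner. This is an assumption on my part; the thesis's four methods are not named in the abstract, and this sketch merely illustrates the kind of adjustment involved:

```python
import numpy as np

def unsmooth_ar1(r_obs):
    """Geltner-style unsmoothing: assume the observed series follows
    r_obs[t] = (1 - phi) * r_true[t] + phi * r_obs[t-1], estimate phi as the
    lag-1 autocorrelation, and invert the smoothing to recover r_true."""
    r_obs = np.asarray(r_obs, dtype=float)
    phi = np.corrcoef(r_obs[:-1], r_obs[1:])[0, 1]
    return (r_obs[1:] - phi * r_obs[:-1]) / (1 - phi)

# simulate appraisal smoothing of an i.i.d. 'true' return series
rng = np.random.default_rng(1)
true_r = rng.normal(0.005, 0.02, 500)
obs = np.empty_like(true_r)
obs[0] = true_r[0]
for t in range(1, len(true_r)):
    obs[t] = 0.6 * obs[t - 1] + 0.4 * true_r[t]

r_hat = unsmooth_ar1(obs)
```

Consistent with the abstract's findings, the observed series understates volatility, and the adjustment moves it back up.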
7

Svindland, Gregor. "Convex Risk Measures Beyond Bounded Risks." Diss., lmu, 2009. http://nbn-resolving.de/urn:nbn:de:bvb:19-97156.

Full text
8

Tang, Zhaofeng. "Quantitative risk management under systematic and systemic risks." Diss., University of Iowa, 2019. https://ir.uiowa.edu/etd/7035.

Full text
Abstract:
The contemporary risk management practice emphasizes the interplay of multilevel risks, of which the systematic and systemic risks are considered the main culprits of catastrophic losses. With this in mind, this thesis investigates three important topics in quantitative risk management, in which the systematic and systemic risks play a devastating role. First of all, we center on the design of reinsurance policies that accommodate the joint interests of the insurer and reinsurer by drawing upon the celebrated notion of Pareto optimality in the context of a distortion-risk-measure-based model. Such a topic is of considerable practical interest in the current post financial crisis era when people have witnessed the significant systemic risk posed by the insurance industry and the vulnerability of insurance companies to systemic events. Specifically, we characterize the set of Pareto-optimal reinsurance policies analytically and introduce the Pareto frontier to visualize the insurer-reinsurer trade-off structure geometrically. Another enormous merit of developing the Pareto frontier is the considerable ease with which Pareto-optimal reinsurance policies can be constructed even in the presence of the insurer's and reinsurer's individual risk constraints. A strikingly simple graphical search of these constrained policies is performed in the special cases of value-at-risk and tail value-at-risk. Secondly, we propose probabilistic and structural characterizations for insurance indemnities that are universally marketable in the sense that they appeal to both policyholders and insurers irrespective of their risk preferences and risk profiles. We begin with the univariate case where there is a single risk facing the policyholder, then extend our results to the case where multiple possibly dependent risks co-exist according to a mixture structure capturing policyholder's exposure to systematic and systemic risks. 
Next, we study the asymptotic behavior of the loss from defaults of a large credit portfolio. We consider a static structural model in which latent variables governing individual defaults follow a mixture structure incorporating idiosyncratic, systematic, and systemic risks. The portfolio effect, namely the decrease in overall risk due to the portfolio size increase, is taken into account. We derive sharp asymptotics for the tail probability of the portfolio loss as the portfolio size becomes large and our main finding is that the occurrence of large losses can be attributed to either the common shock variable or systematic risk factor, whichever has a heavier tail. Finally, we extend the asymptotic study of loss from defaults of a large credit portfolio under an amalgamated model. Aiming at investigating the dependence among the risk components of each obligor, we propose a static structural model in which each obligor's default indicator, loss given default, and exposure at default are respectively governed by three dependent latent variables with exposure to idiosyncratic, systematic, and systemic risks. The asymptotic distribution as well as the asymptotic value-at-risk and expected shortfall of the portfolio loss are obtained. The results are further refined when a specific mixture structure is employed for latent variables.
9

Sandberg, Martina. "Credit Risk Evaluation using Machine Learning." Thesis, Linköpings universitet, Statistik och maskininlärning, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-138968.

Full text
Abstract:
In this thesis, we examine the machine learning models logistic regression, multilayer perceptron, and random forests for the purpose of discriminating between good and bad credit applicants. In addition to these models, we address the problem of imbalanced data with the Synthetic Minority Over-sampling Technique (SMOTE). The available data have 273,286 entries and contain information about the applicant's invoice and the credit decision process, as well as information about the applicant. The data were collected during the period 2015-2017. With AUC values of about 73%, some patterns are found that can discriminate between customers who are likely to pay their invoice and customers who are not. However, the more advanced models performed only slightly better than the logistic regression.
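SMOTE generates synthetic minority-class examples by interpolating between a minority sample and one of its nearest minority neighbours. A minimal pure-NumPy rendition of that interpolation step (the thesis would more likely use a library implementation such as imbalanced-learn):

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE: for each synthetic point, pick a random minority sample,
    one of its k nearest minority neighbours, and interpolate between them."""
    rng = rng or np.random.default_rng(0)
    X_min = np.asarray(X_min, dtype=float)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # never pick a point as its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]           # k nearest neighbours per sample
    synth = np.empty((n_new, X_min.shape[1]))
    for i in range(n_new):
        j = rng.integers(len(X_min))            # base minority sample
        nb = nn[j, rng.integers(min(k, len(X_min) - 1))]
        u = rng.random()                        # interpolation weight in [0, 1)
        synth[i] = X_min[j] + u * (X_min[nb] - X_min[j])
    return synth

minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
new_points = smote_oversample(minority, n_new=10, k=2)
```

Because each synthetic point is a convex combination of two minority samples, the oversampled class stays within the region spanned by the originals.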
10

Ljung, Carl. "Copula selection and parameter estimation in market risk models." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-204420.

Full text
Abstract:
In this thesis, the literature is reviewed for theory regarding elliptical copulas (Gaussian, Student's t, and Grouped t) and methods for calibrating parametric copulas to sets of observations. Theory regarding model diagnostics is also summarized. Historical data on equity indices and government bond rates from several geographical regions, along with U.S. corporate bond indices, are used as proxies for the most significant stochastic variables in the investment portfolio of If P&C. These historical observations are transformed into pseudo-uniform observations, pseudo-observations, using parametric and non-parametric univariate models. The parametric models are fitted using both maximum likelihood and least squares of the quantile function. Elliptical copulas are then calibrated to the pseudo-observations using the well-known methods Inference Function for Margins (IFM) and Semi-Parametric (SP), as well as compositions of these methods and a non-parametric estimator of Kendall's tau. The goodness-of-fit of the calibrated multivariate models is assessed with respect to general dependence, tail dependence, and mean squared error, as well as by using universal measures such as the Akaike and Bayesian Information Criteria, AIC and BIC. The mean squared error is computed using both the empirical joint distribution and the empirical Kendall distribution function. General dependence is measured using the scale-invariant measures Kendall's tau, Spearman's rho, and Blomqvist's beta, while tail dependence is assessed using Krupskii's tail-weighted measures of dependence (see [16]). Monte Carlo simulation is used to estimate these measures for copulas where analytical calculation is not feasible. Gaussian copulas scored lower than Student's t and Grouped t copulas in every test conducted, although not all tests produced conclusive results. Further, the obtained values of the tail-weighted measures of dependence imply a systematically lower tail dependence of Gaussian copulas compared to historical observations.
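The calibration pipeline described above — rank-transform to pseudo-observations, then invert a non-parametric Kendall's tau estimate to get the elliptical correlation — can be sketched as follows (a toy two-asset version, not the thesis's full multivariate setup):

```python
import numpy as np
from scipy.stats import kendalltau

def pseudo_observations(x):
    """Rank-transform a margin to (0, 1): u_i = rank_i / (n + 1)."""
    x = np.asarray(x, dtype=float)
    ranks = x.argsort().argsort() + 1
    return ranks / (len(x) + 1)

# simulate two dependent margins via a Gaussian copula with rho = 0.6
rng = np.random.default_rng(2)
rho = 0.6
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=2000)
x, y = np.exp(z[:, 0]), z[:, 1] ** 3        # arbitrary monotone margins

u, v = pseudo_observations(x), pseudo_observations(y)
tau, _ = kendalltau(u, v)
rho_hat = np.sin(np.pi * tau / 2)           # moment inversion for elliptical copulas
```

Because Kendall's tau is invariant under monotone transformations of the margins, the inversion recovers the copula correlation regardless of the marginal distributions.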
11

Lindell, Andreas. "Theoretical and Practical Applications of Probability : Excursions in Brownian Motion, Risk Capital Stress Testing, and Hedging of Power Derivatives." Doctoral thesis, Stockholm : Department of Mathematics, Stockholm university, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-8570.

Full text
12

Du, Toit Carl. "Modelling market risk with SAS Risk Dimensions : a step by step implementation." Thesis, Link to the online version, 2005. http://hdl.handle.net/10019/1015.

Full text
13

Ndoumbe, Ebongue Steve Armand. "The risk model for insurance portfolio has been adopted to portfolio of derivatives. Describe the models and compare with a focus on the differences." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-11293.

Full text
14

Kuritzén, Felix. "Alternative Methods of Estimating Investor´s Risk Appetite." Thesis, KTH, Matematisk statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-252564.

Full text
Abstract:
In this thesis three risk appetite indexes are derived and measured from the beginning of 2006 to the end of the first quarter of 2019. One of the risk appetite indexes relies on annualized returns and volatilities of risky and safe assets, while the others rely on subjective and risk-neutral probability distributions. The distributions are obtained from historical data on equity indexes and from a wide spectrum of option prices with one month until expiry. All data are provided by Refinitiv through Öhman Fonder. The indexes studied in the thesis were originally proposed by authors from financial institutions such as the Bank of England, the Bank for International Settlements, and Credit Suisse First Boston. I conclude that the Credit Suisse First Boston index and the Bank for International Settlements index generated the most intuitive results regarding the expected response after major financial events. A principal component analysis demonstrated that the Credit Suisse First Boston index held most of the information in terms of explained variance. Finally, the indexes were used in a trend-following strategy for asset allocation, switching between a safe and a risky portfolio. A trend in the risk appetite was studied 2 to 12 months back in time, and the results indicate that all of the risk appetite indexes studied in the thesis can be a helpful tool for asset allocation.
15

Lövgren, Andreas, and Joakim Strandberg. "Jämförande av risk för omoperation mellan två operationsmetoder vid ljumskbråck : En tillämpning av överlevnadsanalys." Thesis, Umeå universitet, Statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-160809.

Full text
Abstract:
When men undergo primary surgery for inguinal hernia, standard practice is the open mesh method. If a reoperation is later needed, open mesh cannot be used again; laparoscopic (keyhole) methods are then typically used instead. Some patients, however, receive laparoscopic surgery as their primary operation. Roughly one tenth of all hernia operations in Sweden are reoperations. Using data from the Swedish Hernia Register, this study examines whether the risk of reoperation differs between the two surgical methods. Survival analysis is used, with the hazard ratio for surgical method as the quantity of interest. A Cox proportional hazards model was fitted and the proportional hazards assumption was checked. Since the assumption did not appear to hold for the Cox PH model, an extended Cox model with two heaviside functions was fitted instead. The study finds that the laparoscopic method has 3.3 times the hazard of open mesh during the first 460 days after the primary operation, and 1.4 times the hazard of open mesh after 460 days.
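The heaviside construction effectively estimates one hazard ratio before and one after day 460. A crude piecewise-exponential stand-in for that idea, using synthetic counts rather than the registry data, looks like this:

```python
import numpy as np

def piecewise_hazard_ratio(events, person_time):
    """Crude piecewise-constant hazard ratios: per follow-up period, the event
    rate of the laparoscopic group divided by that of the open-mesh group.
    A piecewise-exponential stand-in for an extended Cox model with two
    heaviside functions splitting follow-up at day 460."""
    rate = {g: np.asarray(events[g], dtype=float) / np.asarray(person_time[g], dtype=float)
            for g in events}
    return rate["laparoscopic"] / rate["open_mesh"]

# synthetic counts: [first 460 days, after 460 days] -- illustrative numbers only
events = {"open_mesh":    [20, 10],
          "laparoscopic": [33, 7]}
person_time = {"open_mesh":    [100_000, 50_000],
               "laparoscopic": [50_000, 25_000]}

hr_early, hr_late = piecewise_hazard_ratio(events, person_time)
```

With these made-up counts the two period-specific ratios happen to match the study's reported 3.3 and 1.4; a real analysis would of course estimate them from individual follow-up times with a partial-likelihood fit.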
16

Kroon, Rodney Stephen. "A framework for estimating risk." Thesis, Link to the online version, 2008. http://hdl.handle.net/10019.1/1104.

Full text
17

Styrud, Lovisa. "Risk Premium Prediction of Car Damage Insurance using Artificial Neural Networks and Generalized Linear Models." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208309.

Full text
Abstract:
Over the last few years, interest in statistical learning methods, in particular artificial neural networks, has reawakened due to increasing computing capacity, available data, and a drive towards automating various tasks. Artificial neural networks have numerous applications, which is why they appear in many contexts. Using artificial neural networks in insurance rate making is an area in which a few pioneering studies have been conducted, with promising results. This thesis suggests using a multilayer perceptron (MLP) neural network for pricing car damage insurance. The MLP is compared with two traditionally used methods within the framework of generalized linear models. The MLP was selected by cross-validation of a set of candidate models. For the comparison models, a log-link GLM with Tweedie's compound Poisson distribution, modeling the risk premium as the dependent variable, was set up, as well as a two-part GLM with a log-link Poisson GLM for claim frequency and a log-link Gamma GLM for claim severity. Predictions on an independent test set showed that the Tweedie GLM had the lowest prediction error, followed by the MLP model and, last, the Poisson-Gamma GLM. Analysis of risk ratios for the different explanatory variables showed that the Tweedie GLM was also the least discriminatory model, followed by the Poisson-Gamma GLM and the MLP. The MLP had the highest bootstrap estimate of the variance of the prediction error on the test set. Overall, however, the MLP performed roughly in line with the GLM models, and given the basic model configurations cross-validated and the restricted computing power, the MLP results should be seen as successful for the use of artificial neural networks in car damage insurance rate making. Nevertheless, practical aspects argue in favor of using GLMs. This thesis was written at If P&C Insurance, a property and casualty insurance company active in Scandinavia, Finland, and the Baltic countries, with headquarters in Bergshamra, Stockholm.
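A log-link Tweedie GLM of the kind used as a comparison model above can be fitted with scikit-learn's `TweedieRegressor`; a power parameter strictly between 1 and 2 gives the compound Poisson-gamma distribution. The rating factors and parameters below are illustrative, not the thesis's actual setup:

```python
import numpy as np
from sklearn.linear_model import TweedieRegressor

rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 2))                     # two rating factors (illustrative)
freq = rng.poisson(np.exp(-2 + 0.3 * X[:, 0]))  # claim counts per policy
sev = rng.gamma(2.0, 500.0, size=n)             # average claim size
y = freq * sev                                  # loss per policy; many exact zeros

# power in (1, 2) selects Tweedie's compound Poisson-gamma family;
# the log link makes the fitted tariff multiplicative in the rating factors
glm = TweedieRegressor(power=1.5, link="log", alpha=0.0, max_iter=1000)
glm.fit(X, y)
pred = glm.predict(X)
```

The Tweedie family is convenient precisely because the response has positive mass at zero (policies with no claims) together with a continuous positive part, so the risk premium can be modeled in one step instead of via separate frequency and severity GLMs.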
18

Nilsson, Joachim, and Gabriel Adéla. "Reducering utav enkät : Risk mot icke-risk." Thesis, Linköpings universitet, Statistik och maskininlärning, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-179203.

Full text
Abstract:
This report compares three models within each of three approaches, classical test theory, item response theory, and forward selection, to investigate whether the number of questions in a questionnaire on problem gambling can be reduced to about four while still predicting the no-risk outcomes with good accuracy. For each approach, a model with two questions, a model with four questions, and a model with six questions are presented, along with their precision in correctly predicting the no-risk group. All models are fitted on a training subset of the data and validated on a test set, in order to avoid overfitting. For these methods to perform as well as possible, some data preprocessing was carried out, including handling of missing values, outliers, and delimitations. Several of the models predict correctly with over 90% accuracy, and ultimately a forward selection model using only four of the fifteen questions predicts 93.5% correctly.
19

Drakenward, Ellinor, and Emelie Zhao. "Modeling risk and price of all risk insurances with General Linear Models." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-275696.

Full text
Abstract:
This bachelor thesis is within the field of mathematical statistics. In collaboration with the insurance company Hedvig, it explores a new way of handling Hedvig's insurance data by building a pricing model for all-risk insurance using generalized linear models. Two generalized linear models were built: the first predicts the frequency of a claim and the second predicts its severity. The original data were divided into 9 explanatory variables. Both models initially included five explanatory variables and were then reduced, leaving four significant explanatory variables in the frequency model and only one in the severity model. Each model yields relative risks for the levels of its explanatory variables, and these relative risks combine into a total risk for each level. By multiplying a constructed base level by a customer's combination of risk parameters, the premium for that customer can be obtained.
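The base-level-times-relativities pricing described above can be illustrated with empirical relativities for one rating factor. The levels and numbers below are toy values, not Hedvig's data:

```python
import numpy as np

# expected claim behaviour per policy, by level of one rating factor (toy numbers)
levels = ["apartment", "house", "cottage"]
frequency = np.array([0.05, 0.08, 0.04])        # claims per policy-year
severity = np.array([8000.0, 12000.0, 6000.0])  # average cost per claim

pure_premium = frequency * severity  # expected loss per policy-year, per level
base = pure_premium[0]               # choose 'apartment' as the base level
relativity = pure_premium / base     # relative risk per level

# premium for a chosen customer = base premium x product of their relativities;
# with several rating factors the relativities of all their levels multiply
premium_house = base * relativity[1]
```

In a fitted log-link GLM the relativities come out as exponentiated coefficients rather than raw empirical ratios, but the multiplicative tariff structure is the same.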
APA, Harvard, Vancouver, ISO, and other styles
20

Viktorsson, Johan. "The GARCH-copula model for gaugeing time conditional dependence in the risk management of electricity derivatives." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-209966.

Full text
Abstract:
In the risk management of electricity derivatives, time to delivery can be divided into a time grid, with the assumption that within each cell of the grid volatility is more or less constant. This setup, however, does not take into account dependence between the different cells in the time grid. This thesis tries to develop a way to gauge the dependence between electricity derivatives at different places in the time grid and with different delivery periods. More specifically, the aim is to estimate the size of the ratio of the quantile of the sum of price changes to the sum of the marginal quantiles of the price changes. The approach used is a combination of Generalised Autoregressive Conditional Heteroscedasticity (GARCH) processes and copulas. The GARCH process is used to filter out heteroscedasticity in the price data. Copulas are fitted to the filtered data using pseudo maximum likelihood, and the fitted copulas are evaluated using a goodness-of-fit test. GARCH processes alone are found to be insufficient to capture the dynamics of the price data; combining GARCH with Autoregressive Moving Average processes provides a better fit. The resulting dependence is then found to be best captured by elliptical copulas. The estimated ratio is found to be quite small in the cases studied. The ARMA-GARCH filtering in general gives a better fit for copulas when applied to financial data. A time dependency in the dependence can also be observed.
The GARCH-copula model for gauging time conditional dependence in the risk management of electricity derivatives. In the risk management of electricity derivatives, time to delivery can be divided into a grid, with the assumption that the volatility can be considered constant within each cell of the grid. This setup, however, does not take into account dependence between the different cells in the grid. This thesis attempts to develop a method for estimating this dependence for electricity derivatives located at different places in the grid and with different delivery periods. More specifically, the aim is to estimate the ratio between the quantile of the summed price changes and the sum of the marginal quantiles of the price changes. The approach is a combination of so-called Generalised Autoregressive Conditional Heteroscedasticity (GARCH) processes and so-called copulas. The GARCH process is used to filter out heteroscedasticity in the price data. Copulas are fitted to the filtered data via pseudo maximum likelihood, and a goodness-of-fit test is applied. GARCH processes alone turn out to be insufficient to capture the dynamics of the price data. It turns out that a combination of GARCH and autoregressive moving average (ARMA) processes gives a better fit to the data. The resulting dependence turns out to be best captured by elliptical copulas. The estimated ratio turns out to be quite small in the cases studied. The use of ARMA-GARCH also turns out to give a better copula fit when applied to financial data. A time dependency in the dependence can also be observed.
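The GARCH(1,1) filtering step used before the copula fit can be sketched as a plain variance recursion; the parameter values in the usage below are illustrative, and a real application would estimate omega, alpha and beta by (quasi-)maximum likelihood, possibly after an ARMA mean equation as the thesis does.

```python
def garch_filter(returns, omega, alpha, beta):
    """GARCH(1,1) recursion: return conditional variances and standardized residuals."""
    var0 = sum(r * r for r in returns) / len(returns)  # initialize at the sample variance
    sig2, resid = [var0], []
    for t, r in enumerate(returns):
        resid.append(r / sig2[t] ** 0.5)                     # filtered (standardized) residual
        sig2.append(omega + alpha * r * r + beta * sig2[t])  # variance for the next step
    return sig2[:-1], resid

variances, residuals = garch_filter([0.01, -0.02, 0.015], omega=1e-6, alpha=0.1, beta=0.85)
```

Copulas would then be fitted to the standardized residuals of each contract rather than to the raw price changes, which is the point of the filtering.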
APA, Harvard, Vancouver, ISO, and other styles
21

Ericson, Jesper, and Härje Widing. "Betydelsen av competing risk : En analys av demens utifrån Betulaprojektets datainsamling när konkurrerande utfall tas i beaktande." Thesis, Umeå universitet, Statistik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-149723.

Full text
Abstract:
Much research has been done in the field of dementia, and many theses and articles have been written using the Betula project's data collection. However, the importance of taking competing risks into account, such as dying before developing dementia, has not been addressed to any great extent. These competing outcomes can make a substantial difference when calculating the probability of developing dementia. The purpose of this thesis is therefore to calculate probabilities and compare different categories of people and their risk of developing dementia when competing outcomes are included in the calculations. The data consist of results from various memory tests, number of years of education, a dementia indicator, status (deceased or alive), and age at the two outcomes of interest. The analysis concluded that there is a significant difference between the sexes in the risk of developing dementia, with women running a greater risk. When the competing outcome, death, was calculated, men ran a significantly greater risk of dying before developing dementia. Regarding education level, cohort differences were discovered before and after compulsory schooling was introduced in 1962. The results showed a significant difference for those with higher education when it came to dying before developing dementia. The analysis also showed significant differences depending on the score on one of the memory tests, where a higher score implied a lower risk of developing dementia.
APA, Harvard, Vancouver, ISO, and other styles
22

Mattsson, Mathias. "Value at Risk estimation : A comparison between different models." Thesis, Uppsala universitet, Statistiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-447229.

Full text
Abstract:
In this thesis the performance of the quantile-based CAViaR models is evaluated and compared with GARCH models for predicting the Value at Risk. This is done by one-step-ahead out-of-sample prediction for the 500 observations at the end of the sample. To calculate the predictions a rolling forecast is used, meaning that the sample used to make the one-step-ahead predictions is equally sized for all 500 predictions. Tests are then performed to evaluate the predictive power of the forecasts: the dynamic quantile test, the Kupiec test and Christoffersen's test. The data used in the analysis are two stock indexes and one exchange rate index. The thesis concludes that the models perform well in general for the Stockholmsbörsen data. For the First North data the 1% VaR produced too high risk predictions, so the exceedance rate became too low; for the 5% VaR the predictions were more accurate. For the exchange rate data the predictions from the models were generally good as well.
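The Kupiec test mentioned above compares the observed number of VaR violations with the number expected at the chosen level; a minimal sketch of the proportion-of-failures likelihood ratio (valid when the violation count is strictly between 0 and the number of observations) could look like this:

```python
import math

def kupiec_pof(num_obs, num_violations, p):
    """Kupiec proportion-of-failures LR statistic; chi-square(1) under H0."""
    T, x = num_obs, num_violations
    phat = x / T
    ll_h0 = (T - x) * math.log(1 - p) + x * math.log(p)        # violations occur with prob p
    ll_h1 = (T - x) * math.log(1 - phat) + x * math.log(phat)  # violations at the observed rate
    return 2 * (ll_h1 - ll_h0)
```

With 500 out-of-sample days and a 1% VaR, exactly 5 violations give a statistic of zero, while a clearly excessive count such as 12 exceeds the 5% critical value 3.84.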
APA, Harvard, Vancouver, ISO, and other styles
23

Barkhagen, Mathias. "Risk-Neutral and Physical Estimation of Equity Market Volatility." Licentiate thesis, Linköpings universitet, Produktionsekonomi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-94360.

Full text
Abstract:
The overall purpose of the PhD project is to develop a framework for making optimal decisions on the equity derivatives markets. Making optimal decisions refers e.g. to how to optimally hedge an options portfolio or how to make optimal investments on the equity derivatives markets. The framework for making optimal decisions will be based on stochastic programming (SP) models, which means that it is necessary to generate high-quality scenarios of market prices at some future date as input to the models. This leads to a situation where the traditional methods, described in the literature, for modeling market prices do not provide scenarios of sufficiently high quality as input to the SP model. Thus, the main focus of this thesis is to develop methods that improve the estimation of option implied surfaces from a cross-section of observed option prices compared to the traditional methods described in the literature. The estimation is complicated by the fact that observed option prices contain a lot of noise and possibly also arbitrage. This means that in order to be able to estimate option implied surfaces which are free of arbitrage and of high quality, the noise in the input data has to be adequately handled by the estimation method. The first two papers of this thesis develop a non-parametric optimization based framework for the estimation of high-quality arbitrage-free option implied surfaces. The first paper covers the estimation of the risk-neutral density (RND) surface and the second paper the local volatility surface. Both methods provide smooth and realistic surfaces for market data. Estimation of the RND is a convex optimization problem, but the result is sensitive to the parameter choice. When the local volatility is estimated the parameter choice is much easier but the optimization problem is non-convex, even though the algorithm does not seem to get stuck in local optima. 
The SP models used to make optimal decisions on the equity derivatives markets also need generated scenarios for the underlying stock prices or index levels as input. The third paper of this thesis deals with the estimation and evaluation of existing equity market models. The third paper gives preliminary results which show that, out of the compared models, a GARCH(1,1) model with Poisson jumps provides a better fit compared to more complex models with stochastic volatility for the Swedish OMXS30 index.
The overall purpose of the PhD project is to develop a framework for making optimal decisions on the equity derivatives markets. Making optimal decisions refers, for example, to how to optimally hedge an options portfolio or how to make optimal investments on the equity derivatives markets. The framework for making optimal decisions will be based on stochastic programming (SP) models, which means that it is necessary to generate high-quality scenarios of market prices at some future date as input to the SP model. This leads to a situation where the traditional methods described in the literature for modeling market prices do not provide scenarios of sufficiently high quality to serve as input to the SP model. Consequently, the main focus of this thesis is to develop methods that, compared to the traditional methods described in the literature, improve the estimation of surfaces implied by a given set of observed option prices. The estimation is complicated by the fact that observed option prices contain a lot of noise and possibly also arbitrage. This means that in order to estimate option-implied surfaces that are arbitrage-free and of high quality, the estimation method needs to handle the noise in the input data adequately. The first two papers of the thesis develop a non-parametric optimization-based framework for the estimation of high-quality arbitrage-free option-implied surfaces. The first paper treats the estimation of the risk-neutral density (RND) surface and the second paper the estimation of the local volatility surface. Both methods give rise to smooth and realistic surfaces for market data. Estimation of the RND surface is a convex optimization problem, but the result is sensitive to the choice of parameters. 
When the local volatility surface is estimated, the parameter choice is much simpler, but the optimization problem is non-convex, even though the algorithm does not seem to get stuck in local optima. The SP models used to make optimal decisions on the equity derivatives markets also need input in the form of generated scenarios for the underlying stock prices or index levels. The third paper of the thesis treats estimation and evaluation of existing equity market models. It provides preliminary results showing that, out of the compared models, a GARCH(1,1) model with Poisson jumps gives a better description of the dynamics of the Swedish equity index OMXS30 than more complicated models containing stochastic volatility.
APA, Harvard, Vancouver, ISO, and other styles
24

Hosini, Rebin. "Detection of high-risk shops in e- commerce." Thesis, Linköpings universitet, Statistik och maskininlärning, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-150191.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Olaya, Bucaro Orlando. "Predicting risk of cyberbullying victimization using lasso regression." Thesis, Uppsala universitet, Statistiska institutionen, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-338767.

Full text
Abstract:
The increased online presence and use of technology by today's adolescents have created new places where bullying can occur. The aim of this thesis is to specify a prediction model that can accurately predict the risk of cyberbullying victimization. The data used are from a survey conducted at five secondary schools in Pereira, Colombia. A logistic regression model with random effects is used to predict cyberbullying exposure. Predictors are selected by lasso, tuned by cross-validation. Covariates included in the study comprise demographic variables, dietary habit variables, parental mediation variables, school performance variables, physical health variables, mental health variables and health risk variables such as alcohol and drug consumption. The variables retained in the final model are demographic variables, mental health variables and parental mediation variables; the variables excluded are dietary habit variables, school performance variables, physical health variables and health risk variables. The final model has an overall prediction accuracy of 88%.
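The lasso selection behind the model reduction works by soft-thresholding coefficient updates; the operator at the heart of coordinate-descent lasso (a generic sketch, not necessarily the exact solver the thesis used) is:

```python
def soft_threshold(z, lam):
    """Shrink z toward zero by lam; set it to exactly zero inside [-lam, lam]."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0
```

Coefficients whose unpenalized updates stay inside [-lam, lam] are set exactly to zero, which is how whole predictor groups (dietary habits, school performance, and so on) drop out of the final model; cross-validation picks the lam that predicts best.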
APA, Harvard, Vancouver, ISO, and other styles
26

Aas, Kjersti. "Statistical Modelling of Financial Risk." Doctoral thesis, Norwegian University of Science and Technology, Department of Mathematical Sciences, 2007. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-1780.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Gougas, Khawla. "Risk factors impact on the P&L." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-284223.

Full text
Abstract:
Profit and Loss (P&L) explain analysis is an income statement produced by the Product Control team that allows traders to trace the daily fluctuation in the value of a portfolio of trades to the root causes of the changes. This daily report provides users with a coherent breakdown of the drivers of P&L movements between two points in time with reference to a selected number of easily understandable pricing factors. P&L attribution (also called P&L explain) can be calculated in two ways: the risk-based method or the step re-evaluation method. This paper aims at understanding both methodologies from a theoretical point of view, showing the differences between the two calculation methods and how they are interdependent in the daily work of a trader, in the sense that both methods rationalize the P&L from different perspectives. The risk-based method involves calculating the trades' sensitivities (also known as the Greeks) and then using them to predict the expected change in the P&L from one period to the next, based on the actual market changes in the factors driving the transaction price over the same period and the transaction's sensitivity to those factors. The re-evaluation method, in contrast, is calculated by aggregating the impact of different valuation scenarios rather than relying on fixed sensitivities.
P&L attribution analysis and reporting provides users with a coherent breakdown of the drivers of P&L movements between two points in time with reference to a selection of easily understandable pricing factors. P&L attribution can be calculated in two ways: the sensitivity-based and the scenario-based method. This work aims to understand both methods from a theoretical perspective and shows their differences and how they depend on each other. The sensitivity method involves computing a trade's sensitivities (also known as computing the Greeks) and then using them to predict the expected change in the P&L from one period to the next, based on the actual market changes in the factors driving the transaction price over the same period and the transaction's sensitivity to those factors. The revaluation method is computed by aggregating the impact of different valuation scenarios rather than fixed sensitivities.
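The risk-based method described above amounts to a low-order Taylor prediction of the P&L from the Greeks; a minimal sketch, where the Greeks and market moves in the usage line are invented numbers for illustration:

```python
def risk_based_pnl(delta, gamma, vega, d_spot, d_vol):
    """Predicted P&L: first- and second-order spot terms plus a vega term."""
    return delta * d_spot + 0.5 * gamma * d_spot ** 2 + vega * d_vol

# Example: delta 100, gamma 5, vega 20; spot moves +2, implied vol moves +0.01
predicted = risk_based_pnl(100.0, 5.0, 20.0, 2.0, 0.01)
```

The gap between this prediction and a full revaluation of the portfolio under the actual market move (the scenario-based method) is the unexplained residual that product control monitors day to day.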
APA, Harvard, Vancouver, ISO, and other styles
28

Orrenius, Johan. "Optimal mass transport: a viable alternative to copulas in financial risk modeling?" Thesis, KTH, Matematik (Inst.), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231829.

Full text
Abstract:
Copulas as a description of joint probability distributions are today common when modeling financial risk. The optimal mass transport problem also describes dependence structures, although it is not well explored. This thesis explores the dependence structures of the entropy-regularized optimal mass transport problem. The basic copula properties are replicated for the optimal mass transport problem. Estimation of the parameters of the optimal mass transport problem is attempted using a maximum-likelihood analogy, but is only successful in recovering the general tendencies on a grid of the parameters.
Copulas as a description of joint distributions are today a common model for financial risk. The optimal mass transport problem also describes joint dependence between distributions, although it is less explored. This thesis examines the dependence structures of the entropy-regularized optimal mass transport problem. The basic properties of copulas are replicated for the optimal mass transport problem. An attempt is made to estimate the parameters of the optimal mass transport problem with a maximum-likelihood-like method, but it is only successful in estimating the general tendencies on a grid of the parameters.
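The entropy-regularized optimal transport problem the thesis studies can be solved in the discrete case with Sinkhorn iterations; a minimal sketch, with illustrative marginals and cost matrix rather than the thesis's estimation setup:

```python
import math

def sinkhorn(cost, mu, nu, eps, iters=500):
    """Entropy-regularized OT: alternate scalings until the coupling matches the marginals."""
    n, m = len(mu), len(nu)
    K = [[math.exp(-cost[i][j] / eps) for j in range(m)] for i in range(n)]
    u, v = [1.0] * n, [1.0] * m
    for _ in range(iters):
        u = [mu[i] / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [nu[j] / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    # transport plan: a coupling of mu and nu, i.e. a discrete dependence structure
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]

plan = sinkhorn([[0.0, 1.0], [1.0, 0.0]], [0.5, 0.5], [0.5, 0.5], eps=0.1)
```

As the regularization eps grows, the plan approaches the independence coupling; as eps shrinks, it concentrates on the cheapest matching, which is what gives the problem its copula-like interpretation.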
APA, Harvard, Vancouver, ISO, and other styles
29

Friedlander, Michael Arthur. "A robust non-time series approach for valuation of weather derivativesand related products." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B47147234.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Jesper, Brodin, and Kenny Nilsson. "Mitt i prick? : En utvärdering av SCB:s metod för befolkningsframskrivningar på riks- och lokal nivå." Thesis, Örebro universitet, Handelshögskolan vid Örebro universitet, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-16480.

Full text
Abstract:
In this thesis the authors evaluate the method used by Statistics Sweden (SCB) for population projections. The method works relatively well for forecasts of the total population, but problems appeared in the forecasts for the very young and the very old age groups.
APA, Harvard, Vancouver, ISO, and other styles
31

Armerin, Fredrik. "Aspects of cash flow valuation /." Doctoral thesis, Stockholm, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-76.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Tamáskovics, Nándor, Günter Meier, Sarah Braun, and Bodo Schlesinger. "Statistisches Konzept zur Risikoanalyse von Tagesbrüchen über natürlichen und künstlichen Hohlräumen." Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2017. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-228430.

Full text
Abstract:
The use of sites over abandoned mine workings or with natural cavities in the subsoil carries elevated risks that structures will be affected by unwanted deformations of the ground. A typical failure mode is the development of potholes or sinkholes, in which masses shift toward cavities in the subsoil and loosen. The displacement of masses continues until a static equilibrium is established that prevents or strongly reduces further propagation of the collapse process. Determining the failure probability at a given site is proposed following the concept of geometric probability, with the size of the resulting potholes treated as a random variable. The computed failure probabilities can be used as the basis for the risk assessment of objects requiring protection.
The use of sites over old mining regions or with natural openings in the ground involves an elevated technical risk, as constructions can be affected by unplanned deformations of the subsoil. Typical failure modes include pothole subsidence or earthfalls, where failing soil masses are displaced and loosened stepwise toward a collapsing opening in the ground. The displacement process continues until a stable static equilibrium is reached and further propagation of displacements is prevented. Determining the failure probability of pothole subsidence at a given site is recommended based on the concept of geometric probabilities, considering the subsidence volume as a probabilistic quantity. The failure probabilities can be used for a risk analysis of protected objects on sites with expected pothole subsidence.
APA, Harvard, Vancouver, ISO, and other styles
33

Hosseini, Mohamadreza. "Statistical models for agroclimate risk analysis." Thesis, University of British Columbia, 2009. http://hdl.handle.net/2429/16019.

Full text
Abstract:
In order to model the binary process of precipitation and the dichotomized temperature process, we use the conditional probability of the present given the past. We find necessary and sufficient conditions for a collection of functions to correspond to the conditional probabilities of a discrete-time categorical stochastic process X₁,X₂,···. Moreover we find parametric representations for such processes and in particular rth-order Markov chains. To dichotomize the temperature process, quantiles are often used in the literature. We propose using a two-state definition of the quantiles by considering the "left quantile" and "right quantile" functions instead of the traditional definition. This has various advantages, such as a symmetry relation between the quantiles of random variables X and -X. We show that the left (right) sample quantile tends to the left (right) distribution quantile at p ∈ [0,1] if and only if the left and right distribution quantiles are identical at p, and diverges almost surely otherwise. In order to measure the loss of estimating (or approximating) a quantile, we introduce a loss function that is invariant under strictly monotonic transformations and call it the "probability loss function". Using this loss function, we introduce measures of distance among random variables that are invariant under continuous strictly monotonic transformations. We use these distance measures to show that optimal overall fits to a random variable are not necessarily optimal in the tails. This loss function is also used to find equivariant estimators of the parameters of distribution functions. We develop an algorithm to approximate quantiles of large datasets which works by partitioning the data or using existing partitions (possibly of non-equal size). We show the deterministic precision of this algorithm and how it can be adjusted to get customized precisions. 
Then we develop a framework to optimally summarize very large datasets using quantiles, and to combine such summaries in order to draw inferences about the original dataset. Finally we show how these higher-order Markov models can be used to construct confidence intervals for the probability of frost-free periods.
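The left/right quantile pair described above can be sketched for an empirical distribution; the indexing below is one plausible reading of the definitions (smallest x with F(x) ≥ p, respectively F(x) > p), not necessarily the author's exact formulation.

```python
import math

def left_quantile(sorted_xs, p):
    """Smallest x with F(x) >= p: the infimum of the p-quantile set."""
    k = math.ceil(len(sorted_xs) * p)
    return sorted_xs[max(k - 1, 0)]

def right_quantile(sorted_xs, p):
    """Smallest x with F(x) > p: the supremum of the p-quantile set."""
    k = math.floor(len(sorted_xs) * p)
    return sorted_xs[min(k, len(sorted_xs) - 1)]
```

For xs = [1, 2, 3, 4] and p = 0.5 the empirical quantile set is the whole interval [2, 3]: the left quantile is 2 and the right quantile is 3, and the symmetry relation left_quantile(X, p) = -right_quantile(-X, 1-p) holds.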
APA, Harvard, Vancouver, ISO, and other styles
34

Boman, Victor. "A comparison of multivariate GARCH models with respect to Value at Risk." Thesis, Uppsala universitet, Statistiska institutionen, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385521.

Full text
Abstract:
Since the introduction of univariate GARCH models, the number of available models has grown rapidly and has been extended to the multivariate area. This paper compares three different multivariate GARCH models, evaluated using out-of-sample Value at Risk of different portfolios. Sector portfolios with different market capitalization are used. The models compared are the DCC, CCC and GO-GARCH models. The forecast horizons are 1-day, 5-day and 10-day ahead forecasts of the estimated VaR limit. The DCC performs best with regard to both conditional and unconditional violations of the VaR estimates.
APA, Harvard, Vancouver, ISO, and other styles
35

he, xiaofeng. "CREDIT CYCLE, CREDIT RISK AND BUSINESS CONDITIONS." NCSU, 2001. http://www.lib.ncsu.edu/theses/available/etd-20010718-110156.

Full text
Abstract:

We first present a Complex Singular Value Decomposition (CSVD) analysis of the credit cycle and explore the lead-lag relation between the credit cycle and the business cycle, then propose a Generalized Linear Model (GLM) of credit rating transition probabilities under the impact of business conditions. To detect the existence of a cyclic trend in credit conditions in the U.S. economy, all credit variables and business variables are transformed to complex values and the transformed data matrix is approximated by a first-order CSVD analysis. We show that the economy, represented by both credit conditions and business conditions, is changing recurrently but with different frequencies for different time periods. Credit variables making the greatest linear contribution to the first Principal Component can be identified as credit cycle indicators. The finding that business variables lead credit variables in an economy provides the basis to predict credit conditions by business cycle indicators. The credit rating system is a publicly available measure of the riskiness of financial securities, and a rating transition matrix quantifies the risk by permitting calculation of the probability of downgrade or default. Credit migration is observed to be influenced both by business conditions and by an issuer's own credit status. We assume the rating history for a particular institution is Markovian, and histories for different institutions are assumed to be statistically independent; in both cases the history of market conditions is known. With a simple GLM, we investigate the significance of business conditions and their two major impacts: creditworthiness deterioration/improvement and credit stability. We propose a model of transition probabilities in discrete time and a model of instantaneous transition rates in continuous time, and fit them by maximum likelihood. 
Business conditions are shown to have a significant effect: a higher likelihood of credit quality improvement and stability under good business conditions, and a higher likelihood of credit quality deterioration and drift under severe business conditions. The two business impacts are significant, and the deterioration/improvement impact is greater than the stability impact on credit rating transitions. Investment-grade rating transitions are more sensitive to long-rate risk while speculative-grade rating transitions are more sensitive to short-rate risk. Compared to a discrete model, the continuous transition model has much greater over-dispersion but is more practical.
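A rating transition matrix of the kind described can be estimated from rating histories by simple counting, which is the maximum-likelihood estimate for a homogeneous Markov chain (ignoring the business-condition covariates the thesis adds through the GLM); the toy histories below are illustrative.

```python
def estimate_transition_matrix(histories, states):
    """Row-normalized counts of observed one-step rating moves (Markov MLE)."""
    idx = {s: i for i, s in enumerate(states)}
    counts = [[0] * len(states) for _ in states]
    for h in histories:
        for a, b in zip(h, h[1:]):        # consecutive pairs = one-step transitions
            counts[idx[a]][idx[b]] += 1
    return [[c / sum(row) if sum(row) else 0.0 for c in row] for row in counts]

# Two toy issuers observed over three periods each
P = estimate_transition_matrix([["A", "A", "B"], ["A", "B", "B"]], ["A", "B"])
```

Multi-period downgrade or default probabilities then follow by taking powers of the estimated matrix, which is what makes the Markov assumption convenient.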

APA, Harvard, Vancouver, ISO, and other styles
36

Fallman, David, and Jens Wirf. "FORECASTING FOREIGN EXCHANGE VOLATILITY FOR VALUE AT RISK : CAN REALIZED VOLATILITY OUTPERFORM GARCH PREDICTIONS?" Thesis, Uppsala universitet, Statistiska institutionen, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-146571.

Full text
Abstract:
In this paper we use model-free estimates of daily exchange rate volatilities employing high-frequency intraday data, known as Realized Volatility, which are then forecasted with ARMA models and used to produce one-day-ahead Value-at-Risk predictions. The forecasting accuracy of the method is contrasted against the more widely used ARCH models based on daily squared returns. Our results indicate that the ARCH models tend to underestimate the Value-at-Risk in foreign exchange markets compared to models using Realized Volatility.
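The model-free daily volatility estimate referred to above sums squared intraday returns; a minimal sketch, with the sampling frequency and any annualization left to the application:

```python
def realized_volatility(intraday_returns):
    """Square root of the realized variance: the sum of squared intraday returns."""
    return sum(r * r for r in intraday_returns) ** 0.5

def normal_var(sigma_forecast, z=2.326):
    """One-day 99% VaR under a normal assumption, given a volatility forecast."""
    return z * sigma_forecast
```

The paper's approach forecasts the realized-volatility series with an ARMA model and plugs the forecast into the VaR quantile, instead of using a GARCH variance built from daily squared returns.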
APA, Harvard, Vancouver, ISO, and other styles
37

Johnson, David G. "Representations of uncertainty in risk analysis." Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/31941.

Full text
Abstract:
Uncertainty in situations involving risk is frequently modelled by assuming a plausible form of probability distribution for the uncertain quantities involved, and estimating the relevant parameters of that distribution based on the knowledge and judgement of informed experts or decision makers. The distributions assumed are usually uni-modal (and often bell-shaped) around some most likely value, with the Normal, Beta, Gamma and Triangular distributions being popular choices.
APA, Harvard, Vancouver, ISO, and other styles
38

Sjöstrand, Maria, and Özlem Aktaş. "Cornish-Fisher Expansion and Value-at-Risk method in application to risk management of large portfolios." Thesis, Högskolan i Halmstad, Tillämpad matematik och fysik (MPE-lab), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-16274.

Full text
Abstract:
One of the major problems faced by banks is how to manage the risk exposure in large portfolios. According to the Basel II regulation, banks have to measure the risk using Value-at-Risk at the 99% confidence level. However, the regulation does not specify how Value-at-Risk is to be calculated. The easiest way to calculate Value-at-Risk is to assume that portfolio returns are normally distributed. Although this is the most common approach, other methods also exist, and the previous crisis showed that the standard methods are unfortunately not always enough to prevent bankruptcy. This paper is devoted to comparing the classical methods of estimating risk with other methods such as the Cornish-Fisher expansion (CFVaR) and the assumption of a generalized hyperbolic distribution. To carry out this study, we estimate the risk in a large portfolio consisting of ten stocks. These stocks are chosen from the NASDAQ 100 list in order to have highly liquid stocks (blue chips), and they are taken from different sectors to make the portfolio well-diversified. To investigate the impact of dependence between the stocks in the portfolio we remove the two most correlated stocks and consider the resulting eight-stock portfolio as well. In both portfolios we put equal weight on the included stocks. The results show that for a well-diversified large portfolio none of the risk measures are violated. However, for a portfolio consisting of only one highly volatile stock we show that the classical methods are violated, but not the modern methods mentioned above.
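The Cornish-Fisher VaR mentioned above adjusts the normal quantile with the portfolio's sample skewness and excess kurtosis; a sketch using the standard fourth-order expansion, with the convention that z_alpha is the lower-tail normal quantile and VaR is reported as a positive loss:

```python
def cornish_fisher_quantile(z, skew, exkurt):
    """Cornish-Fisher adjustment of a standard normal quantile z."""
    return (z
            + (z ** 2 - 1) * skew / 6
            + (z ** 3 - 3 * z) * exkurt / 24
            - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)

def cf_var(mu, sigma, skew, exkurt, z_alpha=-2.326):
    """99% CFVaR as a positive loss number, given the return moments."""
    return -(mu + sigma * cornish_fisher_quantile(z_alpha, skew, exkurt))
```

With zero skewness and zero excess kurtosis this collapses to the classical normal VaR; negative skewness or fat tails push the estimate up, which is the point of using CFVaR for undiversified positions.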
APA, Harvard, Vancouver, ISO, and other styles
39

Nguyen, Ngoc Bien. "Adaptation via des inéqualités d'oracle dans le modèle de regression avec design aléatoire." Thesis, Aix-Marseille, 2014. http://www.theses.fr/2014AIXM4716/document.

Full text
Abstract:
From the observations Z(n) = {(Xi, Yi), i = 1, ..., n} satisfying Yi = f(Xi) + ζi, we want to reconstruct the function f. We evaluate the quality of estimation by two criteria: the Ls-risk and the uniform risk. In these two cases, the assumptions imposed on the distribution of the noise ζi will be a bounded-moment condition and a sub-Gaussian condition, respectively. Proposing a collection of kernel estimators, we construct a procedure, initiated by Goldenshluger and Lepski, for choosing an estimator within this collection, without any condition on f. We then prove that this estimator satisfies an oracle inequality, which allows us to obtain minimax and adaptive minimax estimates over anisotropic Hölder classes
From the observations Z(n) = {(Xi, Yi), i = 1, ..., n} satisfying Yi = f(Xi) + ζi, we would like to approximate the function f. This problem is considered for two loss functions, the Ls-risk and the uniform risk, where the condition imposed on the distribution of the noise ζi is of bounded moment and of sub-Gaussian type, respectively. From a proposed family of kernel estimators, we construct a procedure, initiated by Goldenshluger and Lepski, to choose from this family a final estimator, with no assumption imposed on f. We then show that this estimator satisfies an oracle inequality which implies minimax and adaptive minimax estimation over the anisotropic Hölder classes
APA, Harvard, Vancouver, ISO, and other styles
40

Nicolau, González Guillermo. "Cortocircuitos en redes AT e impactos en distribución MT." Doctoral thesis, Universitat Ramon Llull, 2012. http://hdl.handle.net/10803/83709.

Full text
Abstract:
L’extensa implantació del control digital als entorns industrials, científics, comercials, professionals i domèstics ha revelat, d’ençà dues dècades, la gran sensibilitat d’aquests dispositius davant sobtats i breus descensos de tensió al subministrament elèctric de xarxa: aturades de plantes a processos productius, re – arrancades a processadors i sistemes de telecomunicació, etc.; i la causa sol esdevenir aparentment inexplicable pels usuaris. La normalització de les conseqüències, però, pot equivaler a un dia sencer de producció nul•la. L’ínfima correlació mostrada pels fenòmens esmentats amb anomalies al sistema elèctric proper (un client pot patir sèries conseqüències, per bé que el client veí només ha percebut una oscil•lació a l’enlluernat, i tots dos comparteixen la mateixa escomesa) sumada amb l’absència contrastada d’interrupció elèctrica suposà, al començament, un major grau d’incertesa, no només pels consumidors; també per a les empreses elèctriques. Fou necessari analitzar el problema en les seves vessants “microscòpica” i “macroscòpica” per a determinar la causa eficient: registrar la forma d’ona al punt de subministrament afectat i fer l’inventari de tots els incidents al Sistema Elèctric del mateix moment. La causa: els sots de tensió produïts per incidents elèctrics a xarxes remotes respecte el subministrament. Davallades sobtades (entre el 80 i el 10% del valor nominal) i ràpides (entre 10 ms i 1 s) al valor eficaç de la tensió subministrada, sense pas per “Zero”, produïdes, principalment, per curt - circuits perfectament detectats i eliminats a xarxes d’Alta Tensió (AT), molt allunyats de la conseqüència observada. A Catalunya, hom comptabilitzen afectacions davant curt - circuits a les interconnexions amb l’Aragó, Castelló i França. 
La present Tesi Doctoral estableix: • La metodologia per a modelar el Sistema Elèctric de Potència; • La sistematització del binomi causa (curt - circuit) – efecte (sot de tensió); • La personalització estadística de risc pel sot de tensió segons comarques; • Un sistema de protecció eficaç per a limitar la durada dels sots. La metodologia ha estat enfocada a la utilització sistemàtica, tal que per a cada curt - circuit esdevingut a la xarxa AT es pugui establir, en temps real, las capçaleres de subministrament afectades pel sot de tensió, així com la magnitud i la durada del mateix. L’entorn d’aplicació triat ha estat el Sistema Elèctric de Catalunya, per bé que la metodologia i sistemàtica són exportables, de forma natural, a qualsevol altre sistema elèctric trifàsic de corrent altern.
La implantación masiva del control digital en entornos industriales, científicos, comerciales, profesionales y domésticos ha puesto de manifiesto, durante los últimos veinte años, la gran sensibilidad de los mismos ante súbitos y breves descensos de tensión en la alimentación eléctrica procedente de la red: paradas de planta en procesos productivos, re – arranques en procesadores y sistemas de telecomunicación tienen lugar; y la causa de los mismos suele ser aparentemente inexplicable para los usuarios. La normalización de las consecuencias, en ocasiones, equivale a un día de producción nula. La escasa correlación mostrada por dichos fenómenos con anomalías en el sistema eléctrico cercano (un cliente padece consecuencias serias, mientras que el cliente vecino solamente ha percibido una oscilación en el alumbrado y ambos se alimentan del mismo tramo eléctrico) sumada con la ausencia contrastada de interrupción eléctrica supuso, en los inicios, un mayor grado de incertidumbre tanto para los consumidores como para las empresas eléctricas. Fue necesario analizar el problema a nivel “microscópico” y “macroscópico” para determinar la causa eficiente: registrar la forma de onda en el punto de suministro afectado y revisar todos los incidentes habidos en el Sistema Eléctrico en dicho instante. La causa: los huecos de tensión producidos por incidentes eléctricos en redes alejadas del suministro. Descensos súbitos (entre el 80 y el 10% del valor nominal) y rápidos (entre 10 ms y 1 s) en el valor eficaz de la tensión suministrada, sin paso por “cero” de la misma, producidos, principalmente, por cortocircuitos perfectamente detectados y eliminados en redes de Alta Tensión (AT), y situados muy lejos de la consecuencia observada. En el caso de Catalunya, se han contabilizado afectaciones ante cortocircuitos en interconexiones con Aragón, Castellón de la Plana y Francia. 
La presente Tesis Doctoral establece: • La metodología para modelar el Sistema Eléctrico de Potencia; • La sistematización para el binomio causa (cortocircuito) – efecto (hueco); • La personalización del riesgo estadístico de hueco vs. comarcas; • Un sistema protectivo eficaz para limitar duración de los huecos. Dicha metodología se ha orientado a la utilización sistemática, tal que para cada cortocircuito que tenga lugar en la red AT pueda establecerse, en tiempo real, las cabeceras de suministro afectadas por hueco de tensión, la magnitud y la duración del mismo. Como entorno de aplicación, se ha utilizado el Sistema Eléctrico de Catalunya, si bien la metodología y sistematización son exportables, de forma natural, a cualquier otro sistema eléctrico trifásico de corriente alterna.
The massive introduction of digital control in industrial, scientific, commercial, professional and domestic environments has revealed, over the last twenty years, its great sensitivity to sudden and short voltage dips in the electrical power grid: shutdowns of production plants and restarts of processors and telecommunication systems take place, and their cause is often apparently inexplicable to the users. Recovering from the consequences is sometimes equivalent to a day without production. The weak correlation shown by these phenomena with anomalies in the nearby electrical system (i.e. in the same portion of a common distribution network, a customer may suffer serious consequences, while the adjacent customer has only perceived a flicker in the lighting), together with the absence of an electrical power interruption, represented, in the beginning, a great degree of uncertainty for both consumers and utilities. It was necessary to analyze the problem at the "microscopic" and "macroscopic" levels to determine the efficient cause: record the waveform at the affected plants and review all the disturbances that occurred in the Power System at the same instant of time. The cause: voltage dips produced by electrical disturbances far away from the supply. Sudden (remaining voltage between 80 and 10% of the nominal value) and fast (between 10 ms and 1 s) decreases in the supplied rms voltage, produced mainly by short-circuits perfectly detected and eliminated in High Voltage (HV) networks, located far from the observed consequence. In the case of Catalonia, impacts of short-circuits at the interconnections with Aragon, Castellón de la Plana and France have been recorded. This thesis provides: • A methodology to model the Power System; • A systematic cause-effect analysis: from short-circuit to voltage dip; • A county-by-county statistical characterization of voltage-dip risk; • A reliable protective system that limits the duration of voltage dips. 
The presented methodology is oriented towards systematic use, such that for every short-circuit that takes place in the HV network, the magnitude and duration of the voltage dips that appear in the distribution network can be established in real time. As the application framework, the Catalan Power System is used, although the methodology and systematization are naturally exportable to any other three-phase alternating-current power system.
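The dip definition quoted in the abstracts above (remaining rms voltage between 10% and 80% of nominal, lasting between 10 ms and 1 s) can be encoded directly. A minimal sketch, with the boundary handling assumed inclusive:

```python
def is_voltage_dip(rms_pu, duration_s):
    """Classify an event using the thresholds quoted in the abstract:
    remaining rms voltage between 10% and 80% of nominal (per-unit) and
    duration between 10 ms and 1 s. Below 10% of nominal the event would
    normally count as an interruption rather than a dip."""
    return 0.10 <= rms_pu <= 0.80 and 0.010 <= duration_s <= 1.0
```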
APA, Harvard, Vancouver, ISO, and other styles
41

Jiang, Jieyi Jiang. "Realistic Predictive Risk: The Role of Penalty and Covariate Diffusion in Model Selection." The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1503072235693181.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Park, Changyi. "Generalization error rates for margin-based classifiers." Connect to resource, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1124282485.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2005.
Title from first page of PDF file. Document formatted into pages; contains ix, 63 p.; also includes graphics (some col.). Includes bibliographical references (p. 60-63). Available online via OhioLINK's ETD Center
APA, Harvard, Vancouver, ISO, and other styles
43

Babuscia, Alessandra. "Statistical risk estimation for communication system design." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/76087.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2012.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 281-295).
Spacecraft are complex systems that involve many subsystems and multiple relationships among them. The design of a spacecraft is an evolutionary process that starts from requirements and evolves over time. During this process, changes can affect mass and power at component, subsystem, and system level. Each spacecraft has to respect overall constraints in terms of mass and power. The current practice in system design deals with this problem by allocating margins to individual components and to individual subsystems. However, a statistical characterization of the fluctuations in mass and power of the overall system (i.e. the spacecraft) is missing. This lack can result in a risky spacecraft design that might not fit the mission constraints and requirements, or in a conservative design that might not fully utilize the available resources. This problem is especially challenging at the initial stage of the design, when high levels of uncertainty due to lack of knowledge are unavoidable. This research proposes a statistical approach to quantify the likelihood that the design of a spacecraft would meet the mission constraints in mass and power consumption, focusing on the initial stage of the design. Due to the complexity of the problem and the different expertise required to develop a complete risk model for a spacecraft design, the scope of this research is focused on risk estimation for a specific spacecraft subsystem: the communication subsystem. The current research aims to be a "proof of concept" of a risk-based design approach, which can then be further expanded to the design of other subsystems as well as to the whole spacecraft. The approach presented in this thesis includes a baseline communication system design tool, and a statistical characterization of the design risks through a combination of historical mission data and expert opinion. 
Different statistical techniques are explored to ensure that the amount of information extracted from data and expert opinion is maximized. Specifically, for statistics based on data, Kernel Density Estimator is selected as the preferred technique to extract probability densities from a database of previous space missions' components. Expert elicitation is generated through a four-part model which quantifies experts' sensitivity to biases, and uses this measurement to compose properly the assessments from different experts. Finally, an optimization framework is developed to compare multiple possible design architectures, and to select the one that minimizes design objectives, like mass and power consumption, while minimizing the risk associated with the same metrics. Examples of missions are applied to validate the model. Results show that the statistical approach recognizes whether the initial estimate of the system is an overestimation or an underestimation, providing a valuable tool to measure the risk of a communication system at the initial state of the design. Specifically, statistics based on historical data and on expert elicitation allow the designer to size contingency properly, providing a reliable estimation of mass and power in the initial stage of the design. Thanks to this method, the communication system designers will be able to evaluate and compare different communication architectures in a risk trade-off prospective across the evolution of the design. Extensions to different subsystems and to additional metrics (like cost) make this model applicable to a wider range of problems.
by Alessandra Babuscia.
Ph.D.
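The density-extraction step mentioned in the abstract, a Kernel Density Estimator over historical mission data, can be sketched as follows. This is an illustrative fragment, not the thesis's implementation; the Silverman rule-of-thumb bandwidth is an assumption, as the abstract does not state which bandwidth rule is used.

```python
import math

def silverman_bandwidth(xs):
    """Silverman's rule-of-thumb bandwidth for a 1-D Gaussian KDE."""
    n = len(xs)
    mu = sum(xs) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / n)
    return 1.06 * sd * n ** (-0.2)

def kde(x, xs, h=None):
    """Gaussian kernel density estimate at x from samples xs
    (e.g. component masses from a database of previous missions)."""
    h = h or silverman_bandwidth(xs)
    norm = 1.0 / (len(xs) * h * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in xs)
```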
APA, Harvard, Vancouver, ISO, and other styles
44

Abbas, Sawsan. "Statistical methodologies for financial market risk management." Thesis, Lancaster University, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.547964.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Racheva-Iotova, Borjana. "An Integrated System for Market Risk, Credit Risk and Portfolio Optimization Based on Heavy-Tailed Models and Downside Risk Measures." Diss., lmu, 2010. http://nbn-resolving.de/urn:nbn:de:bvb:19-123750.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Rufer, Jiří. "Statistické modelování rizikových indikátorů firmy." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2019. http://www.nusl.cz/ntk/nusl-402655.

Full text
Abstract:
This thesis analyzes accounting and financial indicators of Rudolf Jelínek, a.s. using time series methods and interval regression analysis, and examines the development trends of the individual indicators. Based on the obtained data and analyses, it addresses the company's risks and proposes solutions.
APA, Harvard, Vancouver, ISO, and other styles
47

Herrera, Rodrigo. "Statistics of Multivariate Extremes with Applications in Risk Management." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2009. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-24962.

Full text
Abstract:
The contributions of this thesis have mainly a dual purpose: to introduce several multivariate statistical methodologies in which, in the majority of cases, only stationarity of the random variables is assumed, and to highlight some applied problems in risk management where extreme value theory may play a role. Almost every chapter is self-contained, with its own detailed introduction and short conclusion
Die Kontributionen von dieser Dissertation haben ein doppeltes Ziel: die Darstellung von vielen multivariaten statistischen Verfahren, wobei in der Mehrheit der Fälle nur Stationarität von den Zufallsvariablen angenommen wurde, und die Anwendungen in Risikomanagement in welchem Extremwerttheorie eine wichtige Rolle spielen könnte. Die Struktur der Arbeit ist eigenständig, mit einer detaillierten Einführung und kurzen Zusammenfassung in jedem Kapitel
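One elementary tool from the extreme value theory toolbox referenced here is the Hill estimator of a heavy-tail index. The sketch below is a standard textbook version, not code from the thesis:

```python
import math

def hill_tail_index(xs, k):
    """Hill estimator of the tail index alpha from the k largest observations.

    xs: positive-valued sample; k: number of upper order statistics (k < n).
    Uses the (k+1)-th largest value as the threshold.
    """
    ordered = sorted(xs, reverse=True)
    top = ordered[:k]
    threshold = ordered[k]
    gamma = sum(math.log(x / threshold) for x in top) / k  # mean log-excess
    return 1.0 / gamma
```

For a Pareto tail with index alpha, the mean log-excess above a high threshold is exactly 1/alpha, which is what makes the estimator consistent in the heavy-tailed case.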
APA, Harvard, Vancouver, ISO, and other styles
48

Keefe, Matthew James. "Statistical Monitoring and Modeling for Spatial Processes." Diss., Virginia Tech, 2017. http://hdl.handle.net/10919/76664.

Full text
Abstract:
Statistical process monitoring and hierarchical Bayesian modeling are two ways to learn more about processes of interest. In this work, we consider two main components: risk-adjusted monitoring and Bayesian hierarchical models for spatial data. Usually, if prior information about a process is known, it is important to incorporate this into the monitoring scheme. For example, when monitoring 30-day mortality rates after surgery, the pre-operative risk of patients based on health characteristics is often an indicator of how likely the surgery is to succeed. In these cases, risk-adjusted monitoring techniques are used. In this work, the practical limitations of the traditional implementation of risk-adjusted monitoring methods are discussed and an improved implementation is proposed. A method to perform spatial risk-adjustment based on exact locations of concurrent observations to account for spatial dependence is also described. Furthermore, the development of objective priors for fully Bayesian hierarchical models for areal data is explored for Gaussian responses. Collectively, these statistical methods serve as analytic tools to better monitor and model spatial processes.
Ph. D.
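Risk-adjusted monitoring of the kind described, e.g. of 30-day mortality after surgery, is commonly implemented as a risk-adjusted CUSUM in the style of Steiner et al. The sketch below illustrates that standard chart; it is not the improved implementation the dissertation proposes.

```python
import math

def risk_adjusted_cusum(outcomes, probs, odds_ratio=2.0):
    """Risk-adjusted CUSUM for binary outcomes (Steiner-style).

    outcomes: 1 = death, 0 = survival; probs: pre-operative risk p_t per
    patient; odds_ratio: alternative under which the mortality odds are
    multiplied (here, doubled). Returns the path S_t = max(0, S_{t-1} + W_t).
    """
    s, path = 0.0, []
    for y, p in zip(outcomes, probs):
        # Log-likelihood ratio weight for this patient, given its own risk p.
        if y == 1:
            w = math.log(odds_ratio / (1 - p + odds_ratio * p))
        else:
            w = math.log(1 / (1 - p + odds_ratio * p))
        s = max(0.0, s + w)
        path.append(s)
    return path
```

A signal is raised when the path crosses a control limit chosen for the desired false-alarm rate; the risk adjustment means a death for a high-risk patient moves the chart less than a death for a low-risk patient.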
APA, Harvard, Vancouver, ISO, and other styles
49

Yener, Tina. "Risk management beyond correlation." Diss., lmu, 2011. http://nbn-resolving.de/urn:nbn:de:bvb:19-142730.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Manasse, Paul Reuben. "Time-dependent stochastic models for fire risk assessment." Thesis, University of Liverpool, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.317171.

Full text
APA, Harvard, Vancouver, ISO, and other styles