Theses on the topic "Distribution généralisée de Pareto"
Consult the 50 best theses for your research on the topic "Distribution généralisée de Pareto".
Explore theses on a wide variety of disciplines and organize your bibliography correctly.
Maynadier, Jérôme. "Approches statistiques et fiabilités en dynamique des structures". Toulouse, INSA, 2003. http://www.theses.fr/2003ISAT0017.
The improvement of cyclic-symmetry structures in turboshaft engines requires an accurate evaluation of the extreme vibration levels reached by these components. The response amplitudes of cyclic-symmetry structures vary significantly as a function of small perturbations known as "mistuning". In general, mistunings are random parameters; their effects on vibration amplitudes are usually estimated from the experience of each engine manufacturer, and at present they are verified experimentally by fitting strain gauges to the parts. To anticipate changes in technology, numerical approaches are necessary. In structural dynamics, the classical approach for estimating the probability of reaching a given vibratory amplitude is the Monte Carlo method, which is efficient for the largest probabilities but extremely expensive as probabilities decrease. Since the most critical vibration amplitudes correspond to the lowest probabilities, the probabilistic methods FORM and SORM are considered first. We then develop an original method named the "separated variables method". Finally, a statistical approach based on the extreme value distribution of threshold exceedances with a Pareto law is retained to predict the tail of the distribution of the maximal amplitude of the forced responses; this law relies on a minimal number of simulations. After validating these different approaches on academic examples, the most efficient ones are applied to industrial cases. We consider a cyclic-symmetry structure represented by a reduced model; this type of simplified model is able to represent most of the configurations met in operation.
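A minimal Python sketch of the peaks-over-threshold step this abstract describes: fit a generalized Pareto distribution to exceedances and extrapolate a tail probability. The lognormal sample, the 95% threshold and all names are illustrative assumptions, not Maynadier's actual data or code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
amplitudes = rng.lognormal(mean=1.0, sigma=0.5, size=5000)  # stand-in for simulated forced responses

u = np.quantile(amplitudes, 0.95)          # threshold choice: a common 95th-percentile heuristic
excesses = amplitudes[amplitudes > u] - u  # exceedances over the threshold

# Fit the GPD to the excesses; floc=0 because excesses start at zero by construction.
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)

# Tail estimate: P(X > x) ~ P(X > u) * (1 - GPD cdf at x - u), for x > u.
p_u = (amplitudes > u).mean()
x = 2 * u
tail_prob = p_u * stats.genpareto.sf(x - u, shape, loc=0, scale=scale)
print(f"estimated P(amplitude > {x:.2f}) = {tail_prob:.2e}")
```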
Fillon, Blandine. "Développement d'un outil statistique pour évaluer les charges maximales subies par l'isolation d'une cuve de méthanier au cours de sa période d'exploitation". Thesis, Poitiers, 2014. http://www.theses.fr/2014POIT2337/document.
This thesis focuses on statistical tools for the assessment of maximal sloshing loads in LNG tanks. Depending on ship features, tank cargo and sailing conditions, a sloshing phenomenon is observed inside LNG tanks. The sloshing loads supported by the tank structure are derived from impact-pressure measurements performed on a test rig, and the pressure maxima per impact extracted from these measurements are investigated. The test duration is equivalent to 5 hours at full scale, which is not sufficient to determine pressure maxima associated with high return periods (40 years), so a probabilistic model is needed to extrapolate the pressure maxima. Usually a Weibull model is used. As we focus on extreme values from samples, fittings are also performed with the generalized extreme value distribution and the generalized Pareto distribution, using the block-maximum and peaks-over-threshold methods. The originality of this work lies in the use of an alternative measurement system, more relevant than the usual one for capturing pressure maxima, and in 480 hours of measured data available for the same test conditions. This provides a reference distribution for pressure maxima, which is used to assess the relevance of the selected probabilistic models. Particular attention is paid to assessing the quality of the fits using statistical tests and to quantifying the uncertainties on the estimated values. The proposed methodology has been implemented in a software package called Stat_R, which makes the manipulation and treatment of results easier.
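A hedged sketch of the two fitting strategies this abstract compares: block maxima with a GEV fit versus peaks over threshold with a GPD fit. The Weibull stand-in data, the block size and the return-period arithmetic are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pressures = rng.weibull(1.3, size=100_000)  # stand-in for per-impact pressure maxima

# Block-maximum method: fit a GEV to maxima of fixed-size blocks.
blocks = pressures.reshape(200, 500).max(axis=1)
gev_params = stats.genextreme.fit(blocks)

# Peaks-over-threshold method: fit a GPD to exceedances of a high threshold.
u = np.quantile(pressures, 0.99)
gpd_params = stats.genpareto.fit(pressures[pressures > u] - u, floc=0)

# Return level for a long return period from the GEV fit
# (assuming, purely for the sketch, 365 blocks per year and a 40-year horizon).
blocks_per_year, T = 365, 40
return_level = stats.genextreme.ppf(1 - 1 / (T * blocks_per_year), *gev_params)
print(return_level)
```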
Tencaliec, Patricia. "Developments in statistics applied to hydrometeorology : imputation of streamflow data and semiparametric precipitation modeling". Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAM006/document.
Precipitation and streamflow are the two most important meteorological and hydrological variables for analyzing river watersheds. They provide fundamental insights for water-resources management, design and planning, such as urban water supply, hydropower, forecasting of flood or drought events, and irrigation systems for agriculture. In this PhD thesis we approach two different problems. The first originates from the study of observed streamflow data. In order to properly characterize the overall behavior of a watershed, long datasets spanning tens of years are needed; however, the quality of the measurements decreases the further back in time we go, and blocks of data of different lengths are missing. These missing intervals represent a loss of information and can cause erroneous interpretation of summary data or unreliable scientific analysis. The method we propose for streamflow imputation is based on dynamic regression models (DRMs), more specifically a multiple linear regression with ARIMA residual modeling. Unlike previous studies that address either the inclusion of multiple explanatory variables or the modeling of the residuals from a simple linear regression, the use of DRMs allows both aspects to be taken into account. We apply this method to reconstruct the data of eight stations in the Durance watershed in the south-east of France, each containing daily streamflow measurements over a period of 107 years. With the proposed method we manage to reconstruct the data without making use of additional variables, as other models require. We compare the results of our model with those obtained from a complex approach based on analogs coupled with a hydrological model and from a nearest-neighbor approach; in the majority of cases, DRMs show better performance when reconstructing missing-value blocks of various lengths, in some cases up to 20 years long. The second problem addressed in this PhD thesis is the statistical modeling of precipitation amounts. Research in this area is currently very active, as the distribution of precipitation is heavy-tailed and there is at present no general method for modeling the entire range of data with high performance. Recently, in order to model full-range precipitation amounts, a new class of distributions called the extended generalized Pareto distribution (EGPD) was introduced, with focus on EGPD models based on parametric families. These models perform better than previously proposed distributions but lack flexibility in modeling the bulk of the distribution. We therefore aim to improve this aspect by proposing, in the second part of the thesis, two new models relying on semiparametric methods. The first is a transformed kernel estimator based on the EGPD transformation: we first transform the data with the EGPD cdf and then estimate the density of the transformed data with a nonparametric kernel density estimator. We compare the results of the proposed method with those obtained by applying EGPD on several simulated scenarios, as well as on two precipitation datasets from the south-east of France. The results show that the proposed method behaves better than parametric EGPD, the MIAE of the density being in all cases almost twice as small. The second approach is a new model from the general EGPD class: a semiparametric EGPD based on Bernstein polynomials, more specifically a sparse mixture of beta densities. Once again we compare our results with those obtained by EGPD on both simulated and real datasets; as before, the MIAE of the density is considerably reduced, the effect becoming even more obvious as the sample size increases.
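A minimal sketch of the transformed kernel estimator idea described above: transform the data with a parametric EGPD-type cdf, smooth the transformed sample with a kernel, and back-transform by the chain rule. The simplest EGPD family F(x) = H(x)^kappa with H a GPD cdf is assumed, kappa is a placeholder rather than jointly estimated, and the gamma "rain" sample is synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rain = rng.gamma(0.8, 2.0, size=2000)  # stand-in for positive precipitation amounts

xi, _, sigma = stats.genpareto.fit(rain, floc=0)  # crude GPD component of the EGPD
kappa = 1.0                                       # placeholder; estimated jointly in practice

def egpd_cdf(x):
    return stats.genpareto.cdf(x, xi, loc=0, scale=sigma) ** kappa

def egpd_pdf(x):
    h = stats.genpareto.cdf(x, xi, loc=0, scale=sigma)
    return kappa * h ** (kappa - 1) * stats.genpareto.pdf(x, xi, loc=0, scale=sigma)

# Kernel density estimate of the transformed sample on (0, 1);
# boundary bias is ignored in this sketch.
kde = stats.gaussian_kde(egpd_cdf(rain))

def density(x):
    # Chain rule: f_X(x) = g(F(x)) * F'(x), with g the KDE of the transformed data.
    return kde(egpd_cdf(x)) * egpd_pdf(x)
```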
Leblanc, Jean-Philippe. "Distribution hyperbolique généralisée et applications financières". Mémoire, Université de Sherbrooke, 2003. http://savoirs.usherbrooke.ca/handle/11143/2360.
Diamoutene, Abdoulaye. "Contribution de la Théorie des Valeurs Extrêmes à la gestion et à la santé des systèmes". Thesis, Toulouse, INPT, 2018. http://www.theses.fr/2018INPT0139/document.
The operation of a system may at any time be affected by an unforeseen incident. When such an incident has major consequences for the system's integrity and the quality of its products, it is said to occur in the context of extreme events. Researchers therefore take an increasing interest in modeling such events, with studies on the reliability of systems and the prediction of the various risks that can hinder their proper functioning. This thesis takes place in this perspective. We use Extreme Value Theory (EVT) and extreme order statistics as decision-support tools for modeling and risk management in industry and aviation. Specifically, we model the surface roughness of machined parts and the reliability of the associated cutting tool with extreme order statistics. We also carried out a modeling exercise using the Peaks-Over-Threshold (POT) approach to make predictions about potential victims of extreme accidents in American General Aviation (AGA). In addition, systems subject to environmental factors or covariates are most often modeled with proportional hazard models based on the hazard function. In such models the baseline risk function is typically the Weibull distribution, which is monotonic; yet the analysis of some systems, such as cutting tools in industry, has shown that a system can deteriorate in one phase and improve in the next. Hence, modifications have been made to the Weibull distribution to obtain non-monotonic baseline risk functions, more specifically increasing-decreasing risk functions. Despite these changes, taking extreme operating conditions into account without overestimating risks remains problematic. We have therefore proposed, starting from the standard Gumbel distribution, an increasing-decreasing risk function that accounts for extreme conditions, and we have established the corresponding mathematical proofs; an example of application in industry is also given. The thesis is organized in four chapters, together with a general introduction and a general conclusion. In the first chapter we recall some basic notions of Extreme Value Theory. The second chapter focuses on the basic concepts of survival analysis, particularly reliability analysis, and proposes an increasing-decreasing hazard function in the proportional hazard model. The third chapter deals with the use of extreme order statistics in industry, particularly for detecting defective parts, assessing the reliability of the cutting tool and modeling the best surface roughness. The last chapter focuses on predicting potential victims in AGA from historical data using the Peaks-Over-Threshold approach.
Lin, Der-Chen. "Parameter Estimation for Generalized Pareto Distribution". DigitalCommons@USU, 1988. https://digitalcommons.usu.edu/etd/6974.
Anabila, Moses A. "Skew Pareto distributions". abstract and full text PDF (free order & download UNR users only), 2008. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1453191.
Savulytė, Vaida. "Dvimačių Pareto dydžių maksimumų asimptotinė analizė". Master's thesis, Lithuanian Academic Libraries Network (LABT), 2007. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2007~D_20070816_142229-68037.
The aim of this paper is to construct two-dimensional random variables from one-dimensional ones, carry out their asymptotic analysis and study the speed of convergence. The two-dimensional distribution is constructed in two ways: with independent and with dependent components. As the Pareto distribution has been popular in financial models in recent years, it was chosen for the analysis. It is proved that in both cases, with independent and with dependent components, the limit distribution is the same; although the components of the vector are dependent, the maxima are asymptotically independent. Moreover, the errors are smaller than the approximate estimate. Although the approximate estimate in the case of independent components is smaller than in the case of dependent components, the errors behave the opposite way: they are smaller when the components are dependent than when they are independent.
Kondlo, Lwando Orbet. "Estimation of Pareto distribution functions from samples contaminated by measurement errors". Thesis, University of the Western Cape, 2010. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_6141_1297831463.
The intention is to draw more specific connections between certain deconvolution methods and to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed. Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher's information matrix and by a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodness-of-fit tests are used to evaluate the fit of the posited distribution. A bootstrap method is used to calculate the critical values of the K-S test statistic, which are not otherwise available.
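A small parametric-bootstrap sketch of the K-S critical-value idea mentioned above: because the Pareto parameters are estimated from the data, the usual K-S tables do not apply, so the null distribution of the statistic is simulated with refitting. The sample and all names are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = stats.pareto.rvs(2.5, size=300, random_state=rng)  # stand-in sample

b_hat, loc_hat, scale_hat = stats.pareto.fit(data, floc=0, fscale=1)
ks_obs = stats.kstest(data, "pareto", args=(b_hat, loc_hat, scale_hat)).statistic

# Parametric bootstrap: refit on every simulated sample so the critical value
# reflects the effect of parameter estimation.
ks_boot = []
for _ in range(999):
    sim = stats.pareto.rvs(b_hat, loc=loc_hat, scale=scale_hat, size=data.size, random_state=rng)
    b_s, l_s, s_s = stats.pareto.fit(sim, floc=0, fscale=1)
    ks_boot.append(stats.kstest(sim, "pareto", args=(b_s, l_s, s_s)).statistic)

critical_value = np.quantile(ks_boot, 0.95)
print(ks_obs, critical_value, ks_obs > critical_value)
```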
Sellami, Sami. "Comportements hydrodynamiques d'un modèle non gradient : l'exclusion simple généralisée". Rouen, 1998. http://www.theses.fr/1998ROUES083.
This thesis consists of two parts. In the first, we study the equilibrium density fluctuation field of a one-dimensional reversible nongradient model. We prove the Boltzmann-Gibbs principle for the generalized exclusion process. This principle, first introduced by Brox and Rost, is the basic step that enables us to show that our process converges in law to a generalized Ornstein-Uhlenbeck process, by applying Holley and Stroock's theory. In the second part, carried out in collaboration with C. Landim and M. Mourragui, we consider a nonlinear parabolic equation on a square with boundary conditions. Assuming that the diffusion coefficient is Lipschitz, we prove that the rescaled density field converges to the unique weak solution of the parabolic equation.
Pelletier, François. "Modélisation des rendements financiers à l'aide de la distribution de Laplace asymétrique généralisée". Thesis, Université Laval, 2014. http://www.theses.ulaval.ca/2014/30557/30557.pdf.
Classical models in finance are based on a set of hypotheses that are not always empirically verifiable. The main objective of this master's thesis is to show that the generalized asymmetric Laplace distribution is an interesting alternative to those models. To support this idea, we focus on some of its properties to develop parametric estimation, approximation and testing methods, and then work out some principles of derivatives pricing. Finally, we provide a numerical example to illustrate these concepts.
Juozulynaitė, Gintarė. "Pareto atsitiktinių dydžių geometrinis maks stabilumas". Master's thesis, Lithuanian Academic Libraries Network (LABT), 2010. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20100830_094813-81556.
In this work I analyze the geometric max stability of univariate and bivariate Pareto random variables. I prove that the univariate Pareto distribution is geometrically max stable when alpha = 1, but not when alpha ≠ 1. Using the criterion of geometric max stability for bivariate Pareto random variables, I prove that the bivariate Pareto distribution function is not geometrically max stable when the vector's components are independent (both for alpha = 1, beta = 1 and for alpha ≠ 1, beta ≠ 1), nor when the components are dependent (again for alpha = 1, beta = 1 and alpha ≠ 1, beta ≠ 1). The study of bivariate Pareto distributions yielded an unexpected result: the bivariate Pareto distribution function is not geometrically max stable when alpha = 1, beta = 1, yet the marginal Pareto distribution functions are geometrically max stable in that case.
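A simulation sketch of the univariate claim for alpha = 1, under the standard Pareto with F(x) = 1 - 1/x: the maximum of a Geometric(p) number of copies, affinely renormalized, should again be Pareto(1). The renormalization used here is an assumption of the sketch, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 0.2, 20_000

# Maximum of a Geometric(p) number of standard Pareto(alpha=1) variables.
N = rng.geometric(p, size=n)
maxima = np.array([rng.pareto(1.0, size=k).max() + 1.0 for k in N])  # numpy's pareto is Lomax, hence +1

# Geometric max stability up to an affine norming: p*(M - 1) + 1 should be Pareto(1) again.
rescaled = p * (maxima - 1.0) + 1.0
reference = rng.pareto(1.0, size=n) + 1.0
for q in (0.5, 0.9, 0.99):
    print(q, np.quantile(rescaled, q), np.quantile(reference, q))
```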
Kamanina, Polina. "On the Invariance of Size Distribution of Establishments". Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Nationalekonomi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-18532.
Hernandez, Javiera I. "Does the Pareto Distribution of Hurricane Damage Inherit its Fat Tail from a Zipf Distribution of Assets at Hazard?" FIU Digital Commons, 2014. http://digitalcommons.fiu.edu/etd/1488.
Chamberlain, Lauren. "The Power Law Distribution of Agricultural Land Size". DigitalCommons@USU, 2018. https://digitalcommons.usu.edu/etd/7400.
Höchstötter, Markus. "The Pareto stable distribution as a hypothesis for returns of stocks listed in the DAX". Hamburg: Kovač, 2006. http://www.verlagdrkovac.de/3-8300-2491-6.htm.
Ozdem, Mehmet. "Video Distribution Over IP Networks". Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/12608187/index.pdf.
Zha, Yuanyuan, Tian-Chyi J. Yeh, Walter A. Illman, Hironori Onoe, Chin Man W. Mok, Jet-Chau Wen, Shao-Yang Huang and Wenke Wang. "Incorporating geologic information into hydraulic tomography: A general framework based on geostatistical approach". AMER GEOPHYSICAL UNION, 2017. http://hdl.handle.net/10150/624351.
Giesen, Kristian [Verfasser], Jens [Akademischer Betreuer] Südekum and Joachim [Akademischer Betreuer] Prinz. "Zipf's Law for Cities and the Double-Pareto-Lognormal Distribution / Kristian Giesen. Gutachter: Joachim Prinz. Betreuer: Jens Südekum". Duisburg, 2012. http://d-nb.info/1024851893/34.
Borisevič, Jelena. "Apie geometriškai stabiliuosius maksimumo skirstinius". Master's thesis, Lithuanian Academic Libraries Network (LABT), 2004. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2004~D_20040603_162439-78402.
Ochoa, Pizzali Luis Fernando [UNESP]. "Desempenho de redes de distribuição com geradores distribuídos". Universidade Estadual Paulista (UNESP), 2006. http://hdl.handle.net/11449/100362.
Texto completoConselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Alban
Fundação de Ensino Pesquisa e Extensão de Ilha Solteira (FEPISA)
In this work, a steady-state analysis is presented that considers the assessment of technical impacts such as losses, voltage drop and short-circuit levels, among others, utilizing time-variant loads and generation within a specified horizon. The objective is to find a set of DG arrangements (configurations) that lead to the best performance of the distribution network under analysis, minimizing or maximizing each technical aspect according to the utility's concerns. Given the combinatorial nature of this problem, which requires an optimization tool able to handle multiple objectives, technical impacts are assessed simultaneously through a methodology based on the non-dominated sorting genetic algorithm (NSGA). This approach leads to a more realistic and diversified set of solutions for decision making, known as Pareto-optimal solutions.
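A minimal sketch of the non-dominated (Pareto-optimal) filtering at the heart of NSGA-style methods; the two random objectives are placeholders for, say, losses and voltage deviation, and nothing here reproduces the thesis's actual NSGA implementation.

```python
import numpy as np

def pareto_front(costs: np.ndarray) -> np.ndarray:
    """Boolean mask of non-dominated rows; all objectives are minimized."""
    mask = np.ones(costs.shape[0], dtype=bool)
    for i in range(costs.shape[0]):
        if not mask[i]:
            continue
        # Points weakly worse everywhere and strictly worse somewhere are dominated by i.
        dominated = np.all(costs >= costs[i], axis=1) & np.any(costs > costs[i], axis=1)
        mask &= ~dominated
    return mask

rng = np.random.default_rng(5)
objectives = rng.random((200, 2))      # e.g. (electrical losses, voltage deviation)
front = objectives[pareto_front(objectives)]
print(front.shape)
```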
Nikolaou, Christos. "A multi-objective genetic algorithm optimisation using variable speed pumps in water distribution systems". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/6819/.
Texto completoKeating, Karen. "Statistical analysis of pyrosequence data". Diss., Kansas State University, 2012. http://hdl.handle.net/2097/14026.
Texto completoDepartment of Statistics
Gary L. Gadbury
Since their commercial introduction in 2005, DNA sequencing technologies have become widely available and are now cost-effective tools for determining the genetic characteristics of organisms. While the biomedical applications of DNA sequencing are apparent, these technologies have been applied to many other research areas. One such area is community ecology, in which DNA sequence data are used to identify the presence and abundance of microscopic organisms that inhabit an environment. This is currently an active area of research, since it is generally believed that a change in the composition of microscopic species in a geographic area may signal a change in the overall health of the environment. An overview of DNA pyrosequencing, as implemented by the Roche/Life Science 454 platform, is presented and aspects of the process that can introduce variability in data are identified. Four ecological data sets that were generated by the 454 platform are used for illustration. Characteristics of these data include high dimensionality, a large proportion of zeros (usually in excess of 90%), and nonzero values that are strongly right-skewed. A nonparametric method to standardize these data is presented and effects of standardization on outliers and skewness are examined. Traditional statistical methods for analyzing macroscopic species abundance data are discussed, and the applicability of these methods to microscopic species data is examined. One objective that receives focus is the classification of microscopic species as either rare or common species. This is an important distinction since there is much evidence to suggest that the biological and environmental mechanisms that govern common species are distinctly different than the mechanisms that govern rare species. This indicates that the abundance patterns for common and rare species may follow different probability models, and the suitability of the Pareto distribution for rare species is examined. Techniques for classifying macroscopic species are shown to be ill-suited for microscopic species, and an alternative technique is presented. Recognizing that the structure of the data is similar to that of financial applications (such as insurance claims and the distribution of wealth), the Gini index and other statistics based on the Lorenz curve are explored as potential test statistics for distinguishing rare versus common species.
Orellana, Aragón Jorge Alberto. "A lei de Zipf e os efeitos de um tratado de livre comércio : caso da Guatemala". reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2009. http://hdl.handle.net/10183/16417.
Over the last 50 years, Central America developed one of the oldest processes of economic and regional integration on the American continent. Since the establishment of the Central American Common Market (CACM) in 1960, intra-regional trade has increased significantly under the multilateral, bilateral and regional free-trade agreements of the integration process. Today, a new perspective on the effects of international trade is offered by the New Economic Geography (NEG), which seeks to explain the evolution of the distribution of city sizes; this distribution can be represented by a Pareto distribution, from which derives the well-known empirical regularity called Zipf's law, offering an explanation of how agglomeration forces in urban centers interact in favor of economic activity and international trade. This dissertation investigates how changes in trade policy generate changes in the city-size ordering, thereby influencing the economic growth of Guatemala. For this purpose, the Pareto coefficient was estimated for the period 1921-2002; in addition to the original proposal, two non-linearities were introduced in the distribution, and the Hirschman-Herfindahl index was used as a supporting measure of the degree of urban concentration. A model of variation rates was then used to measure the impact of trade opening on economic growth over the 1960-2002 period. It should be emphasized that alterations in the sample size can lead to different interpretations. The results point to slight growth in inequality and divergence, even though the index of urban concentration shows a gradual fall from 1964, during the CACM period, up to 2002; in other words, small cities grew at a lower rate than the larger cities. For the 1973-2002 period, Gibrat's law, which states that city growth is independent of city size, can be verified. The hypotheses that urban concentration has an inverse relation with trade opening and that it is positively correlated with economic growth in the 1921-1964 period are also verified. These results suggest the future path of urban growth: the major cities would reduce their growth, while medium and small cities will grow at a more accelerated rate, driven by the growth of international trade.
Silva, Renato Rodrigues. "A distribuição generalizada de Pareto e mistura de distribuições de Gumbel no estudo da vazão e da velocidade máxima do vento em Piracicaba, SP". Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/11/11134/tde-18112008-145737/.
Extreme value theory is a branch of probability that describes the asymptotic distribution of order statistics, such as the maximum or minimum of a sequence of random variables following a distribution function F that is normally unknown. It also describes the asymptotic distribution of the excesses over a threshold of this sequence. The standard methodologies of extreme value analysis are thus the fitting of the generalized extreme value distribution to yearly maximum series, or the fitting of the generalized Pareto distribution to partial duration series. However, according to Coles et al. (2003), there is growing dissatisfaction with the use of these standard models for the prediction of extreme events. Possible causes are false assumptions about the observed data, such as the independence assumption, or the fact that the standard models should not be used in some specific situations, for example when the sample maxima arise from two or more independent populations, the first describing more frequent but less intense events and the second describing less frequent but more intense events. The two articles in this work therefore aim to present alternatives for extreme value analysis in situations where the standard models are not recommended. In the first article, the generalized Pareto distribution and the exponential distribution, a particular case of the GP, together with two declustering methods, were applied to the mean daily flow of the Piracicaba river at the Artemis station, Piracicaba, SP, and the estimates of the 5-, 10-, 50- and 100-year return levels were compared. We conclude that the interval estimates of the 50- and 100-year return levels obtained by fitting the exponential distribution are more precise than those obtained using the generalized Pareto distribution. In the second article, we propose fitting the Gumbel distribution and a Gumbel mixture to maximum wind speed data in Piracicaba, SP, selecting the best model using bootstrap hypothesis tests and the AIC and BIC selection criteria. We conclude that the Gumbel mixture is the best model for the maximum wind speed data of April and May, whereas the single Gumbel distribution gives the best fit for August and September.
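A hedged sketch of the model comparison this abstract describes: maximum likelihood for a two-component Gumbel mixture against a single Gumbel, compared here by AIC (the thesis also uses bootstrap tests and BIC). The data, starting values and parameterization are assumptions.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(6)
wind = np.concatenate([stats.gumbel_r.rvs(8, 2, size=150, random_state=rng),
                       stats.gumbel_r.rvs(15, 3, size=50, random_state=rng)])  # stand-in maxima

def nll_mixture(theta):
    w = 1.0 / (1.0 + np.exp(-theta[0]))                    # mixing weight in (0, 1)
    m1, s1, m2, s2 = theta[1], np.exp(theta[2]), theta[3], np.exp(theta[4])
    pdf = w * stats.gumbel_r.pdf(wind, m1, s1) + (1 - w) * stats.gumbel_r.pdf(wind, m2, s2)
    return -np.sum(np.log(pdf + 1e-300))

res = minimize(nll_mixture,
               x0=[0.0, np.median(wind), 0.5, np.quantile(wind, 0.9), 0.5],
               method="Nelder-Mead", options={"maxiter": 5000})

loc, scale = stats.gumbel_r.fit(wind)
nll_single = -np.sum(stats.gumbel_r.logpdf(wind, loc, scale))

print("AIC single :", 2 * 2 + 2 * nll_single)
print("AIC mixture:", 2 * 5 + 2 * res.fun)
```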
Engberg, Alexander. "An empirical comparison of extreme value modelling procedures for the estimation of high quantiles". Thesis, Uppsala universitet, Statistiska institutionen, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297063.
Cascão, Fernando Miguel Laires. "Regressão do índice de cauda : uma aplicação empírica". Master's thesis, Instituto Superior de Economia e Gestão, 2018. http://hdl.handle.net/10400.5/16662.
Texto completoNo presente trabalho é apresentada uma metodologia de estimação do índice de cauda, que assenta numa regressão exponencial do parâmetro função de variáveis explicativas. O método de estimação é o de Quase Máxima Verosimilhança baseada na função log-verosimilhança de Pareto de tipo I. A metodologia em estudo é aplicada às observações do prémio de risco do mercado acionista. Neste sentido, pretende-se explicar os valores extremos da aba esquerda da distribuição dos dados, com recurso a um conjunto de variáveis estudadas na literatura, no contexto do mercado de ações. Os resultados sugerem que as variáveis mais relevantes para explicar a variável de interesse são regressores que representam situações de crise e incerteza social, política e económica, para cada momento de tempo. Os resultados finais indicam que o prémio de risco tem uma massa de probabilidade considerável associada a valores extremos da série.
In the present work, a tail-index estimation methodology is presented, based on an exponential regression of the parameter on explanatory variables. The estimation method is quasi-maximum likelihood based on the Pareto type I log-likelihood function. The methodology is applied to observations of the equity market risk premium, the aim being to explain the extreme values in the left tail of the data distribution using a set of variables studied in the literature in the context of the stock market. The results suggest that the most relevant regressors for explaining the variable of interest are those representing situations of crisis and of social, political and economic uncertainty at each moment in time. The final results indicate that the risk premium has a considerable probability mass associated with extreme values of the series.
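A compact sketch of the estimator described above: the tail index of a Pareto type I density f(y) = a * y^-(a+1), y > 1, is made observation-specific through a = exp(x'beta), and beta is recovered by (quasi-)maximum likelihood. The simulated design and all names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
beta_true = np.array([0.5, -0.3])
alpha = np.exp(X @ beta_true)
y = (1 - rng.random(n)) ** (-1 / alpha)                # Pareto type I draws, unit scale

def neg_loglik(beta):
    a = np.exp(X @ beta)
    return -np.sum(np.log(a) - (a + 1) * np.log(y))

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print(fit.x)  # should land near beta_true
```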
Manas, Arnaud. "Essais sur le club de Paris, la loi de Gibrat et l'histoire de la Banque de France". Thesis, Aix-Marseille, 2013. http://www.theses.fr/2013AIXM1136/document.
This dissertation is made up of several papers published between 2005 and 2012 and some working papers. The first part deals with the Paris Club: two papers published in the Bulletin of the Banque de France deal with heavily indebted countries and debt buybacks (pricing the implicit contracts in the Paris Club debt buybacks). The second axis concerns Gibrat's law ("French butchers don't do Quantum Physics", Economics Letters, Vol. 103, May 2009, pp. 101-106; "The Paretian Ratio Distribution - An application to the volatility of GDP", Economics Letters, Vol. 111, May 2011, pp. 180-183; "The Laplace Illusion", Physica A, Vol. 391, August 2012, pp. 3963-3970). The third axis deals with the history of the Banque de France.
Bogner, Christina. "Analysis of flow patterns and flow mechanisms in soils". Thesis, Bayreuth, Bayreuth Center of Ecology and Environmental Research, 2009. http://d-nb.info/997214058/34.
Santos, Rosilda Sousa. "Estudo sobre algumas famílias de distribuições de probabilidades generalizadas". Universidade Federal de Campina Grande, 2012. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/1358.
Texto completoMade available in DSpace on 2018-08-06T14:18:54Z (GMT). No. of bitstreams: 1 ROSILDA SOUSA SANTOS - DISSERTAÇÃO PPGMAT 2012..pdf: 864926 bytes, checksum: 9d85b58c8bca6174ef968354411068a1 (MD5) Previous issue date: 2012-09
Capes
The purpose of this dissertation is to study the main families of generalized probability distributions. In particular, we study the Beta Pareto, generalized Beta Exponential, Beta Modified Weibull, Beta Fréchet and Kw-G distributions. For each of them we obtain expressions for the probability density function, cumulative distribution function, hazard function and moment-generating function, as well as parameter estimates by the method of maximum likelihood. Finally, we present real-data applications for each of the studied distributions.
Halimi, Abdelghafour. "Modélisation et traitement statistique d'images de microscopie confocale : application en dermatologie". Phd thesis, Toulouse, INPT, 2017. http://oatao.univ-toulouse.fr/19515/1/HALIMI_Abdleghafour.pdf.
Texto completoBanca: Rubén Augusto Romero Lázaro
Banca: Dionízio Paschoareli Júnior
Banca: Gareth Harrison
Banca: Carmen Lucia Tancredo Borges
Hitz, Adrien. "Modelling of extremes". Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:ad32f298-b140-4aae-b50e-931259714085.
Blanchet, Thomas. "Essays on the Distribution of Income and Wealth : Methods, Estimates and Theory". Thesis, Paris, EHESS, 2020. http://www.theses.fr/2020EHES0004.
This thesis covers several topics on the distribution of income and wealth. In the first chapter, we develop a new methodology to exploit tabulations of income and wealth such as those published by tax authorities. In it, we define generalized Pareto curves as the curve of inverted Pareto coefficients b(p), where b(p) is the ratio between average income or wealth above rank p and the p-th quantile Q(p) (i.e. b(p) = E[X | X > Q(p)]/Q(p)). We use them to characterize entire distributions, including places like the top where power laws are a good description, and places further down where they are not. We develop a method to flexibly recover the entire distribution based on tabulated income or wealth data, which produces smooth and realistic shapes of generalized Pareto curves. In the second chapter, we present a new approach to combine survey data with tax tabulations to correct for the underrepresentation of the rich at the top. It endogenously determines a "merging point" between the datasets before modifying weights along the entire distribution and replacing new observations beyond the survey's original support. We provide simulations of the method and applications to real data. The former demonstrate that our method improves the accuracy and precision of distributional estimates, even under extreme assumptions, and in comparison to other survey-correction methods using external data. The empirical applications show that not only can income inequality levels change, but also trends. In the third chapter, we estimate the distribution of national income in thirty-eight European countries between 1980 and 2017 by combining surveys, tax data and national accounts. We develop a unified methodology combining machine learning, nonlinear survey calibration and extreme value theory in order to produce estimates of pre-tax and post-tax income inequality, comparable across countries and consistent with macroeconomic growth rates. We find that inequality has increased in a majority of European countries, especially between 1980 and 2000. The European top 1% grew more than two times faster than the bottom 50% and captured 18% of regional income growth. In the fourth chapter, I decompose the dynamics of the wealth distribution using a simple dynamic stochastic model that separates the effects of consumption, labor income, rates of return, growth, demographics and inheritance. Based on two results of stochastic calculus, I show that this model is nonparametrically identified and can be estimated using only repeated cross-sections of the data. I estimate it using distributional national accounts for the United States since 1962. I find that, out of the 15 pp. increase in the top 1% wealth share observed since 1980, about 7 pp. can be attributed to rising labor income inequality, 6 pp. to rising returns on wealth (mostly in the form of capital gains), and 2 pp. to lower growth. Under current parameters, the top 1% wealth share would reach its steady-state value of roughly 45% by the 2040s, a level similar to that of the beginning of the 20th century. I then use the model to analyze the effect of progressive wealth taxation at the top of the distribution.
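A small numeric illustration of the generalized Pareto curve defined in the first chapter: the empirical inverted Pareto coefficient b(p). For an exact power law with tail index a, b(p) is constant and equals a/(a-1); the synthetic "income" sample below is only a stand-in.

```python
import numpy as np

def inverted_pareto_b(sample: np.ndarray, ranks) -> np.ndarray:
    """Empirical b(p) = E[X | X > Q(p)] / Q(p) at each rank p."""
    quantiles = np.quantile(sample, ranks)
    return np.array([sample[sample > q].mean() / q for q in quantiles])

rng = np.random.default_rng(8)
income = rng.pareto(2.0, size=100_000) + 1.0   # exact Pareto, a = 2, so b(p) should be ~2
print(inverted_pareto_b(income, [0.5, 0.9, 0.99]))
```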
Park, JinSoo. "Adaptive Asymmetric Slot Allocation for Heterogeneous Traffic in WCDMA/TDD Systems". Diss., Virginia Tech, 2004. http://hdl.handle.net/10919/29630.
Texto completoPh. D.
Mahadevan, Muralidharan Ananth. "Analysis of Garbage Collector Algorithms in Non-Volatile Memory Devices". The Ohio State University, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=osu1365811711.
Maire, Anthony. "Comment sélectionner les zones prioritaires pour la conservation et la restauration des communautés de poissons de rivière ? Applications aux échelles de la France et du Pas-de-Calais". Phd thesis, Toulouse, INPT, 2014. http://oatao.univ-toulouse.fr/13313/1/Maire.pdf.
Holešovský, Jan. "Metody odhadu parametrů rozdělení extrémního typu s aplikacemi". Doctoral thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2016. http://www.nusl.cz/ntk/nusl-240512.
Perrin, Yohann. "Etude de la structure partonique de l'hélium". Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00845950.
Ho, Zhen Wai Olivier. "Contributions aux algorithmes stochastiques pour le Big Data et à la théorie des valeurs extrèmes multivariés". Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCD025/document.
This thesis is divided into two parts. The first part studies models for multivariate extremes. We give a method to construct multivariate regularly varying random vectors, based on a multivariate extension of Breiman's lemma, which states that a product $RZ$ of a non-negative regularly varying random variable $R$ and a non-negative, sufficiently integrable variable $Z$ is also regularly varying. Replacing $Z$ with a random vector $\mathbf{Z}$, we show that the product $R\mathbf{Z}$ is regularly varying and we characterize its limit measure. We then show that, for specific distributions of $\mathbf{Z}$, we obtain classical max-stable models, and we extend the result to non-standard regular variation. Next, we show that the Pareto model associated with the Hüsler-Reiss max-stable model forms a full exponential family. We establish some properties of this model and give an algorithm for exact simulation. We study the properties of the maximum likelihood estimator and then extend the model to non-standard regular variation. To finish the first part, we propose a numerical study of the Hüsler-Reiss Pareto model. In the second part, we start by giving a lower bound on the smallest singular value of a matrix perturbed by appending a column. We then give a greedy algorithm for feature selection and illustrate it on a time-series dataset. Second, we show that an incoherent matrix satisfies a weakened version of the NSP property. Third, we study the problem of selecting columns of $X\in\mathbb{R}^{n\times p}$ given a coherence threshold $\mu$, i.e. finding the largest submatrix satisfying a coherence property. We formulate the problem as a linear program with a quadratic constraint on $\{0,1\}^p$, consider a relaxation on the sphere, and bound the relaxation error. Finally, we study projected stochastic gradient descent for online PCA: we show that, in expectation, the algorithm converges to a leading eigenvector, and we suggest an algorithm for step-size selection. We illustrate this algorithm with a numerical experiment.
Nilsson, Mattias. "Tail Estimation for Large Insurance Claims, an Extreme Value Approach". Thesis, Linnaeus University, School of Computer Science, Physics and Mathematics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-7826.
In this thesis, extreme value theory is used to estimate the probability that large insurance claims exceed a certain threshold. The expected claim size, given that the claim has exceeded a certain limit, is also estimated. Two different models are used for this purpose. The first model is based on maximum domain of attraction conditions; a Pareto distribution is used in the other model. Different graphical tools are used to check the validity of both models. Länsförsäkring Kronoberg provided the insurance data for the study. The conclusions drawn are that both models seem to be valid and that the results from the two models are essentially equal.
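A hedged sketch of the second quantity this abstract mentions, the expected claim size given exceedance of a limit, using the standard GPD mean-excess formula e(v) = (sigma + xi*(v - u)) / (1 - xi), valid for xi < 1. The lognormal claim data and thresholds are assumptions, not the thesis's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
claims = stats.lognorm.rvs(1.2, scale=20_000, size=5000, random_state=rng)  # stand-in claims

u = np.quantile(claims, 0.95)
xi, _, sigma = stats.genpareto.fit(claims[claims > u] - u, floc=0)

# Expected claim size given that the claim exceeds v >= u (requires xi < 1):
v = 2 * u
expected_claim = v + (sigma + xi * (v - u)) / (1 - xi)
print(expected_claim)
```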
Santos, Jeferino Manuel dos. "Aplicação da Teoria de Valores Extremos à actividade seguradora". Master's thesis, Instituto Superior de Economia e Gestão, 2003. http://hdl.handle.net/10400.5/652.
Texto completoO objectivo principal deste trabalho é realçar a importância da Teoria de Valores Extremos na actividade seguradora. São apresentados de uma forma sucinta alguns dos principais resultados ligados a esta teoria. São apresentadas algumas estatísticas que possibilitam a simplificação do processo de reconhecimento de dados de cauda pesada. A modelação da cauda é um assunto de particular interesse, são apresentados dois métodos de modelação da cauda, um pelo ajustamento de uma distribuição de Pareto Generalizada, outro pela aplicação de um método semi-paramétrico adaptativo. No fim, os resultados obtidos por cada um dos modelos são integrados como módulo num modelo de solvência.
The main purpose of this dissertation is to highlight the importance of Extreme Value Theory in the insurance sector. A short introduction to the main results of this theory is presented, together with a set of statistics that simplify the process of recognizing heavy-tailed data. Tail modelling is a subject of particular interest here: two approaches are presented, one fitting a generalized Pareto distribution, the other modelling by means of a semi-parametric adaptive method. In the last part, the results of these approaches are integrated as a module in a broader solvency model.
Dalne, Katja. "The Performance of Market Risk Models for Value at Risk and Expected Shortfall Backtesting : In the Light of the Fundamental Review of the Trading Book". Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-206168.
The global financial crisis that began in 2007 led to numerous changes in bank risk regulation. One far-reaching change, expected to be implemented in 2019, is the Fundamental Review of the Trading Book (FRTB). Among other things, it proposes the use of Expected Shortfall (ES) as the risk measure instead of the Value at Risk (VaR) used today, as well as varying liquidity horizons depending on the risk levels of the assets in question. The main difficulty in implementing the FRTB lies in the backtesting of ES. Righi and Ceretta propose a robust ES backtest based on Monte Carlo simulation; it is flexible in the sense that it assumes no specific probability distribution and can be implemented without waiting for an entire backtesting period. Implementing various standard VaR backtests, as well as the Righi-Ceretta ES backtest, gives an idea of which risk-measure models deliver the most accurate results from both a VaR and an ES backtesting perspective. In summary, a model that is acceptable from a VaR backtesting perspective is not necessarily acceptable from an ES backtesting perspective, and vice versa; overall, models acceptable from a VaR perspective are likely too conservative from an ES perspective. Considering the confidence levels proposed in the FRTB, from a VaR backtesting perspective the best-suited risk model uses a normal copula and a hybrid distribution with the generalized Pareto distribution in the tails and the empirical distribution in the center, together with GARCH filtering, whereas from an ES backtesting perspective a risk model with a univariate Student's t distribution with ν ≈ 7, together with GARCH filtering, is preferable. This means that when banks implement the FRTB they will have to compromise between a good VaR model that potentially yields overly conservative ES estimates and a model that is less good from a VaR perspective but yields more reasonable ES estimates. The thesis was carried out at SAS Institute, an American IT company that, among other things, develops risk-management software for banks and other financial institutions. This study of the FRTB represents a potential advantage for the company in contacts with clients planning to implement the framework in the near future.
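A toy Monte Carlo ES backtest in the spirit of the simulation-based approach discussed above: it compares the realized shortfall beyond a VaR forecast with its simulated null distribution. This is a simplified stand-in, not the actual Righi-Ceretta statistic, and the Student-t P&L model is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
alpha = 0.025                                                      # ES level used by the FRTB
returns = 0.01 * stats.t.rvs(df=7, size=1000, random_state=rng)    # stand-in daily P&L

var = np.quantile(returns, alpha)                 # VaR forecast
realized_shortfall = returns[returns <= var].mean()

# Null distribution of the shortfall under the assumed model, by simulation.
sims = 0.01 * stats.t.rvs(df=7, size=(2000, returns.size), random_state=rng)
null_shortfall = np.array([row[row <= var].mean() for row in sims])
p_value = (null_shortfall <= realized_shortfall).mean()
print(realized_shortfall, p_value)
```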
Keywords: risk management, financial time series, Value at Risk, Expected Shortfall, Monte Carlo simulation, GARCH modelling, copulas, hybrid distributions, generalized Pareto distribution, extreme value theory, backtesting, liquidity horizons, Basel regulatory framework.
Saaidia, Noureddine. "Sur les familles des lois de fonction de hasard unimodale : applications en fiabilité et analyse de survie". Thesis, Bordeaux 1, 2013. http://www.theses.fr/2013BOR14794/document.
In reliability and survival analysis, distributions with a unimodal or ∩-shaped hazard rate function are not numerous; they include the inverse Gaussian, log-normal, log-logistic, Birnbaum-Saunders, exponential Weibull and power generalized Weibull distributions. In this thesis, we develop modified chi-squared tests for these distributions, present a comparative study between the inverse Gaussian distribution and the other distributions, and carry out simulations. We also construct the AFT model based on the inverse Gaussian distribution, as well as redundant systems based on distributions having a unimodal hazard rate function.
Winkler, Anderson M. "Widening the applicability of permutation inference". Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:ce166876-0aa3-449e-8496-f28bf189960c.
Sadefo Kamdem, Jules. "Méthodes analytiques pour le Risque des Portefeuilles Financiers". Phd thesis, Université de Reims - Champagne Ardenne, 2004. http://tel.archives-ouvertes.fr/tel-00009187.
Bouquiaux, Christel. "Semiparametric estimation for extreme values". Doctoral thesis, Universite Libre de Bruxelles, 2005. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210910.
Brathwaite, Joy Danielle. "Value-informed space systems design and acquisition". Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/43748.
Ibn, Taarit Kaouther. "Contribution à l'identification des systèmes à retards et d'une classe de systèmes hybrides". Phd thesis, Ecole Centrale de Lille, 2010. http://tel.archives-ouvertes.fr/tel-00587336.
Číž, Bronislav. "Progresivita daně z příjmů". Master's thesis, Vysoká škola ekonomická v Praze, 2008. http://www.nusl.cz/ntk/nusl-5231.