Journal articles on the topic 'Peaks over threshold method'

Consult the top 50 journal articles for your research on the topic 'Peaks over threshold method.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Heckert, N. A., E. Simiu, and T. Whalen. "Estimates of Hurricane Wind Speeds by “Peaks Over Threshold” Method." Journal of Structural Engineering 124, no. 4 (1998): 445–49. http://dx.doi.org/10.1061/(asce)0733-9445(1998)124:4(445).

2

AL-Dhurafi, Nasr Ahmed, Nurulkamal Masseran, Zamira Hasanah Zamzuri, and Ahmad Mahir Razali. "Modeling Unhealthy Air Pollution Index Using a Peaks-Over-Threshold Method." Environmental Engineering Science 35, no. 2 (2018): 101–10. http://dx.doi.org/10.1089/ees.2017.0077.

3

Kim, S. Y., and J. Song. "Estimation of Car Insurance Loss Ratio Using the Peaks over Threshold Method." Korean Journal of Applied Statistics 25, no. 1 (2012): 101–14. http://dx.doi.org/10.5351/kjas.2012.25.1.101.

4

Ferreira, J. A., and C. Guedes Soares. "An Application of the Peaks Over Threshold Method to Predict Extremes of Significant Wave Height." Journal of Offshore Mechanics and Arctic Engineering 120, no. 3 (1998): 165–76. http://dx.doi.org/10.1115/1.2829537.

Abstract:
The paper describes an application of the Peaks Over Threshold (POT) method to significant wave height data of Figueira da Foz, Portugal. The method is briefly explained and justified. The exponential distribution is shown to be adequate for modeling the peaks of clustered excesses over a threshold of 6 m. Estimates of return values are given. The exponential character of the data is theoretically justified in the Appendix.
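
A minimal sketch of the workflow this abstract describes — a peaks-over-threshold fit with an exponential excess model and the corresponding N-year return value — is given below. The wave-height array, record length and the 6 m threshold are illustrative assumptions, not the Figueira da Foz data or the authors' code.

```python
# Minimal POT sketch with exponential excesses (synthetic, assumed data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hs = 3.0 * rng.weibull(1.5, size=5000)          # stand-in for declustered Hs peaks (m)
n_years = 20                                    # assumed record length in years

u = 6.0                                         # threshold, as in the abstract (6 m)
excesses = hs[hs > u] - u                       # excesses over the threshold
_, scale = stats.expon.fit(excesses, floc=0)    # exponential model for the excesses
lam = excesses.size / n_years                   # mean number of exceedances per year

def return_value(n):
    """n-year return value under an exponential excess model."""
    return u + scale * np.log(lam * n)

print(return_value(100))                        # e.g. a 100-year significant wave height
```
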
5

Naess, A. "Statistical Extrapolation of Extreme Value Data Based on the Peaks Over Threshold Method." Journal of Offshore Mechanics and Arctic Engineering 120, no. 2 (1998): 91–96. http://dx.doi.org/10.1115/1.2829529.

Abstract:
This paper discusses the use of the peaks over threshold method for estimating long return period design values of environmental loads. Attention is focused on the results concerning the type of asymptotic extreme value distribution for use in the extrapolation to required design values obtained by such methods, which in many cases seem to indicate that the Weibull distribution for maxima is the appropriate one. It will be shown by a closer scrutiny of the underlying estimation process that very often such a conclusion cannot in fact be substantiated.
6

Pandey, M. D. "Minimum cross-entropy method for extreme value estimation using peaks-over-threshold data." Structural Safety 23, no. 4 (2001): 345–63. http://dx.doi.org/10.1016/s0167-4730(02)00008-5.

7

Kyselý, Jan, Jan Picek, and Romana Beranová. "Estimating extremes in climate change simulations using the peaks-over-threshold method with a non-stationary threshold." Global and Planetary Change 72, no. 1-2 (2010): 55–68. http://dx.doi.org/10.1016/j.gloplacha.2010.03.006.

8

Bezak, Nejc, Mitja Brilly, and Mojca Šraj. "Comparison between the peaks-over-threshold method and the annual maximum method for flood frequency analysis." Hydrological Sciences Journal 59, no. 5 (2014): 959–77. http://dx.doi.org/10.1080/02626667.2013.831174.

9

Cerovic, Julija, and Vesna Karadzic. "Extreme value theory in emerging markets: Evidence from the Montenegrin stock exchange." Ekonomski anali 60, no. 206 (2015): 87–116. http://dx.doi.org/10.2298/eka1506087c.

Abstract:
The concept of Value at Risk (VaR) estimates the maximum loss of a financial position at a given time for a given probability. This paper considers the adequacy of the methods that are the basis of extreme value theory in the Montenegrin emerging market before and during the global financial crisis. In particular, the purpose of the paper is to investigate whether the peaks-over-threshold method outperforms the block maxima method in evaluation of Value at Risk in emerging stock markets such as the Montenegrin market. The daily return of the Montenegrin stock market index MONEX20 is analyzed for the period January 2004 - February 2014. Results of the Kupiec test show that the peaks-over-threshold method is significantly better than the block maxima method, but both methods fail to pass the Christoffersen independence test and joint test due to the lack of accuracy in exception clustering when measuring Value at Risk. Although better, the peaks-over-threshold method still cannot be treated as an accurate VaR model for the Montenegrin frontier stock market.
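
A POT-based VaR of the kind compared here is usually obtained by inverting the GPD tail estimator. The sketch below shows that calculation on a synthetic heavy-tailed loss series; the data, threshold choice and confidence level are assumptions for illustration, not the MONEX20 analysis itself.

```python
# Generic POT/GPD Value-at-Risk sketch on synthetic losses (assumed data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = 0.01 * stats.t.rvs(df=4, size=2500, random_state=rng)   # stand-in daily losses

u = np.quantile(losses, 0.95)                  # high threshold (95th percentile)
exc = losses[losses > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
zeta = exc.size / losses.size                  # empirical P(loss > u)

def var_pot(p):
    """VaR at confidence level p from the GPD tail estimator (assumes xi != 0)."""
    return u + (sigma / xi) * (((1 - p) / zeta) ** (-xi) - 1)

print(var_pot(0.99))
```
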
10

Zhao, Xu, Zhongxian Zhang, Weihu Cheng, and Pengyue Zhang. "A New Parameter Estimator for the Generalized Pareto Distribution under the Peaks over Threshold Framework." Mathematics 7, no. 5 (2019): 406. http://dx.doi.org/10.3390/math7050406.

Abstract:
Techniques used to analyze exceedances over a high threshold are in great demand for research in economics, environmental science, and other fields. The generalized Pareto distribution (GPD) has been widely used to fit observations exceeding the tail threshold in the peaks over threshold (POT) framework. Parameter estimation and threshold selection are two critical issues for threshold-based GPD inference. In this work, we propose a new GPD-based estimation approach by combining the method of moments and likelihood moment techniques based on the least squares concept, in which the shape and scale parameters of the GPD can be simultaneously estimated. To analyze extreme data, the proposed approach estimates the parameters by minimizing the sum of squared deviations between the theoretical GPD function and its expectation. Additionally, we introduce a recently developed stopping rule to choose the suitable threshold above which the GPD asymptotically fits the exceedances. Simulation studies show that the proposed approach performs better or similar to existing approaches, in terms of bias and the mean square error, in estimating the shape parameter. In addition, the performance of three threshold selection procedures is assessed by estimating the value-at-risk (VaR) of the GPD. Finally, we illustrate the utilization of the proposed method by analyzing air pollution data. In this analysis, we also provide a detailed guide regarding threshold selection.
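
The paper's own estimator is not reproduced here, but the sketch below shows a simple least-squares flavour of GPD fitting under the POT framework — minimising squared deviations between the theoretical GPD CDF and the empirical CDF of the exceedances — on synthetic excesses. The data, starting values and plotting positions are illustrative assumptions.

```python
# Least-squares-style GPD fit to synthetic excesses (a sketch, not the paper's estimator).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
exc = stats.genpareto.rvs(0.2, scale=1.0, size=500, random_state=rng)  # synthetic excesses

x = np.sort(exc)
p_emp = (np.arange(1, x.size + 1) - 0.5) / x.size        # empirical CDF (plotting positions)

def sse(theta):
    xi, log_sigma = theta
    fitted = stats.genpareto.cdf(x, xi, scale=np.exp(log_sigma))
    return np.sum((fitted - p_emp) ** 2)

res = optimize.minimize(sse, x0=[0.1, 0.0], method="Nelder-Mead")
xi_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(xi_hat, sigma_hat)                                  # shape and scale estimates
```
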
11

Campbell, Bradley, Vadim Belenky, and Vladas Pipiras. "Application of the envelope peaks over threshold (EPOT) method for probabilistic assessment of dynamic stability." Ocean Engineering 120 (July 2016): 298–304. http://dx.doi.org/10.1016/j.oceaneng.2016.03.006.

12

Li, Jiqing, Jing Huang, Xuefeng Chu, and Jay R. Lund. "An Improved Peaks-Over-Threshold Method and its Application in the Time-Varying Design Flood." Water Resources Management 35, no. 3 (2021): 933–48. http://dx.doi.org/10.1007/s11269-020-02758-3.

13

Rébillat, Marc, Ouadie Hmad, Farid Kadri, and Nazih Mechbal. "Peaks Over Threshold–based detector design for structural health monitoring: Application to aerospace structures." Structural Health Monitoring 17, no. 1 (2017): 91–107. http://dx.doi.org/10.1177/1475921716685039.

Abstract:
Structural health monitoring offers new approaches to interrogate the integrity of complex structures. The structural health monitoring process classically relies on four sequential steps: damage detection, localization, classification, and quantification. The most critical step of such process is the damage detection step since it is the first one and because performances of the following steps depend on it. A common method to design such a detector consists of relying on a statistical characterization of the damage indexes available in the healthy behavior of the structure. On the basis of this information, a decision threshold can then be computed in order to achieve a desired probability of false alarm. To determine the decision threshold corresponding to such desired probability of false alarm, the approach considered here is based on a model of the tail of the damage indexes distribution built using the Peaks Over Threshold method extracted from the extreme value theory. This approach of tail distribution estimation is interesting since it is not necessary to know the whole distribution of the damage indexes to develop a detector, but only its tail. This methodology is applied here in the context of a composite aircraft nacelle (where desired probability of false alarm is typically between 10⁻⁴ and 10⁻⁹) for different configurations of learning sample size and probability of false alarm and is compared to a more classical one which consists of modeling the entire damage indexes distribution by means of Parzen windows. Results show that given a set of data in the healthy state, the effective probability of false alarm obtained using the Peaks Over Threshold method is closer to the desired probability of false alarm than the one obtained using the Parzen-window method, which appears to be more conservative.
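
A minimal sketch of the threshold-setting step described here — fitting a GPD to the tail of healthy-state damage indexes and inverting it for a target probability of false alarm — is shown below. The damage-index array, tail fraction and target PFA are hypothetical stand-ins, not the nacelle data.

```python
# Sketch: decision threshold for a target probability of false alarm via a GPD tail fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
di_healthy = rng.lognormal(mean=0.0, sigma=0.5, size=20000)   # stand-in healthy damage indexes

u = np.quantile(di_healthy, 0.95)              # high threshold on the healthy data
exc = di_healthy[di_healthy > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
zeta = exc.size / di_healthy.size              # empirical P(DI > u)

def decision_threshold(pfa):
    """Threshold t such that P(DI > t) equals the desired PFA (pfa < zeta assumed)."""
    # P(DI > t) = zeta * SF_GPD(t - u)  ->  invert the GPD survival function
    return u + stats.genpareto.isf(pfa / zeta, xi, scale=sigma)

print(decision_threshold(1e-4))
```
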
14

Ceresetti, D., E. Ursu, J. Carreau, et al. "Evaluation of classical spatial-analysis schemes of extreme rainfall." Natural Hazards and Earth System Sciences 12, no. 11 (2012): 3229–40. http://dx.doi.org/10.5194/nhess-12-3229-2012.

Abstract:
Extreme rainfall is classically estimated using raingauge data at raingauge locations. An important related issue is to assess return levels of extreme rainfall at ungauged sites. Classical methods consist in interpolating extreme-value models. In this paper, such methods are referred to as regionalization schemes. Our goal is to evaluate three classical regionalization schemes. Each scheme consists of an extreme-value model (block maxima, peaks over threshold) taken from extreme-value theory plus a method to interpolate the parameters of the statistical model throughout the Cévennes-Vivarais region. From the interpolated parameters, the 100-yr quantile level can be estimated over this whole region. A reference regionalization scheme is made of the couple block maxima/kriging, where kriging is an optimal interpolation method. The two other schemes differ from the reference by replacing either the extreme-value model block maxima by peaks over threshold or kriging by a neural network interpolation procedure. Hyper-parameters are selected by cross-validation and the three regionalization schemes are compared by double cross-validation. Our evaluation criteria are based on the ability to interpolate the 100-yr return level both in terms of precision and spatial distribution. It turns out that the best results are obtained by the regionalization scheme combining the peaks-over-threshold method with kriging.
15

Lechner, J. A., E. Simiu, and N. A. Heckert. "Assessment of ‘peaks over threshold’ methods for estimating extreme value distribution tails." Structural Safety 12, no. 4 (1993): 305–14. http://dx.doi.org/10.1016/0167-4730(93)90059-a.

16

Gao, Siyang, Leyuan Shi, and Zhengjun Zhang. "A peak-over-threshold search method for global optimization." Automatica 89 (March 2018): 83–91. http://dx.doi.org/10.1016/j.automatica.2017.12.002.

17

Wu, Guangrun, and Wenliang Qiu. "Threshold Selection for POT Framework in the Extreme Vehicle Loads Analysis Based on Multiple Criteria." Shock and Vibration 2018 (2018): 1–9. http://dx.doi.org/10.1155/2018/4654659.

Abstract:
Extreme value of vehicle load plays an important role in bridge design and risk assessment. Peaks over threshold (POT) is a method commonly used in extreme load estimation. The selection of thresholds for the POT method is extremely crucial, but the selected optimal threshold varies under different test criteria. Therefore, a method to select the suitable threshold is developed based on multiple criteria decision analysis (MCDA) in the paper. In MCDA, Chi-Square (χ2) test, Kolmogorov–Smirnov (K–S) test, and Root Mean Square Error (RMSE) in probability distribution functions (PDF) are employed as the test criteria and the weight of these criteria is calculated using the entropy method. Finally, vehicle loads obtained from simulation and field measurement are adopted to validate the effectiveness and feasibility of the proposed method. The results indicate that the proposed MCDA is a useful and complementary tool for threshold selection in the extreme value analysis.
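
To illustrate how an entropy-weighted multi-criteria score of this kind can be assembled, the sketch below combines hypothetical χ², K–S and RMSE values for a few candidate thresholds. The numbers, candidate thresholds and normalisation details are assumptions, not the paper's data or exact weighting scheme.

```python
# Entropy-weighted scoring of candidate thresholds (illustrative numbers only).
import numpy as np

thresholds = np.array([200.0, 220.0, 240.0, 260.0])   # hypothetical candidate thresholds
# Rows = candidate thresholds, columns = (chi-square, K-S statistic, RMSE); smaller is better.
C = np.array([
    [12.4, 0.081, 0.031],
    [ 9.7, 0.064, 0.027],
    [11.2, 0.059, 0.029],
    [15.8, 0.090, 0.035],
])

P = C / C.sum(axis=0)                                  # column-wise normalisation
entropy = -(P * np.log(P)).sum(axis=0) / np.log(C.shape[0])
weights = (1.0 - entropy) / (1.0 - entropy).sum()      # more discriminating criteria get more weight

scores = (P * weights).sum(axis=1)                     # comprehensive evaluation values
print(weights, thresholds[np.argmin(scores)])          # threshold with the smallest score wins
```
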
18

Naess, A., and P. H. Clausen. "Combination of the peaks-over-threshold and bootstrapping methods for extreme value prediction." Structural Safety 23, no. 4 (2001): 315–30. http://dx.doi.org/10.1016/s0167-4730(02)00015-2.

19

Pandey, M. D., and Y. An. "The analysis of design wind speed estimates specified in the National Building Code of Canada." Canadian Journal of Civil Engineering 34, no. 4 (2007): 513–24. http://dx.doi.org/10.1139/l06-133.

Abstract:
The design wind pressures specified in the 2005 National Building Code of Canada (NBCC) have been derived from the Gumbel distribution fitted to annual maximum wind speed data collected up to the early 1990s. The statistical estimates of the annual maxima method are affected by a relatively large sampling variability, since the method considers a fairly small subset of available wind speed records. Advanced statistical methods have emerged in recent years with the purpose of reducing both sampling and model uncertainties associated with extreme quantile estimates. The two most notable methods are the peaks-over-threshold (POT) and annually r largest order statistics (r-LOS), which extend the data set by including additional maxima observed in wind speed time series. The objective of the paper is to explore the use of advanced extreme value theory for updating the design wind speed estimates specified in the Canadian building design code. The paper re-examines the NBCC method for design wind speed estimation and presents the analysis of the latest Canadian wind speed data by POT, r-LOS, and annual maxima methods. The paper concludes that the r-LOS method is an effective alternative for the estimation of extreme design wind speed. Key words: wind speed, extreme value theory, order statistics, return period, maximum likelihood method, peaks-over-threshold method, generalized extreme value distribution, Gumbel distribution, generalized Pareto distribution.
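
For comparison with the POT sketches elsewhere on this page, the snippet below shows the annual-maxima/Gumbel calculation that the abstract says underlies the NBCC values, applied to a synthetic 40-year record; the numbers are stand-ins, not the Canadian data. A POT or r-LOS analysis would instead draw on many more values from the same record, which is the sampling-variability argument the paper makes.

```python
# Gumbel fit to (synthetic) annual maximum wind speeds and a 50-year return estimate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
annual_max = stats.gumbel_r.rvs(loc=22.0, scale=3.0, size=40, random_state=rng)  # m/s, stand-in

loc, scale = stats.gumbel_r.fit(annual_max)
v50 = stats.gumbel_r.isf(1.0 / 50.0, loc, scale)   # speed exceeded on average once in 50 years
print(v50)
```
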
20

Jabbar, Ismail Abdul, and Nining Sari Ningsih. "Extreme Wave Height Analysis in Natuna Sea Using Peak-Over Threshold Method." IOP Conference Series: Earth and Environmental Science 618 (December 22, 2020): 012024. http://dx.doi.org/10.1088/1755-1315/618/1/012024.

21

Zhou, C. R., Y. F. Chen, S. H. Gu, Q. Huang, J. C. Yuan, and S. N. Yu. "A new threshold selection method for peak over for nonstationary time series." IOP Conference Series: Earth and Environmental Science 39 (August 2016): 012071. http://dx.doi.org/10.1088/1755-1315/39/1/012071.

22

Lemos, Iago, Antônio Lima, and Marcus Duarte. "thresholdmodeling: A Python package for modeling excesses over a threshold using the Peak-Over-Threshold Method and the Generalized Pareto Distribution." Journal of Open Source Software 5, no. 46 (2020): 2013. http://dx.doi.org/10.21105/joss.02013.

23

Beguería, Santiago, and Sergio M. Vicente-Serrano. "Mapping the Hazard of Extreme Rainfall by Peaks over Threshold Extreme Value Analysis and Spatial Regression Techniques." Journal of Applied Meteorology and Climatology 45, no. 1 (2006): 108–24. http://dx.doi.org/10.1175/jam2324.1.

Abstract:
The occurrence of rainfalls of high magnitude constitutes a primary natural hazard in many parts of the world, and the elaboration of maps showing the hazard of extreme rainfalls has great theoretical and practical interest. In this work a procedure based on extreme value analysis and spatial interpolation techniques is described. The result is a probability model in which the distribution parameters vary smoothly in space. This methodology is applied to the middle Ebro Valley (Spain), a climatically complex area with great contrasts because of the relief and exposure to different air masses. The database consists of 43 daily precipitation series from 1950 to 2000. Because rainfall tends to occur highly clustered in time in the area, a declustering process was applied to the data, and the series of daily cluster maxima were used hereinafter. The mean excess plot and error minimizing were used to find an optimum threshold value to retain the highest records (peaks-over-threshold approach), and a Poisson–generalized Pareto model was fitted to the resulting series. The at-site parameter estimates (location, scale, and shape) were regressed upon a set of location and relief variables, enabling the construction of a spatially explicit probability model. The advantages of this method to obtain maps of extreme precipitation hazard are discussed in depth.
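
The mean excess (mean residual life) diagnostic mentioned here is simple to compute. The sketch below evaluates it over a grid of candidate thresholds for a synthetic series of daily cluster maxima and then fits a GPD above an illustrative choice; the data and the chosen quantile are assumptions, not the Ebro Valley series.

```python
# Mean excess values over candidate thresholds, then a GPD fit above a chosen threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
rain = stats.gamma.rvs(a=0.6, scale=12.0, size=8000, random_state=rng)  # stand-in cluster maxima (mm)

candidates = np.quantile(rain, np.linspace(0.80, 0.99, 40))
mean_excess = np.array([(rain[rain > u] - u).mean() for u in candidates])

# In practice the threshold is chosen where the mean excess curve becomes roughly linear;
# here we simply take one of the upper candidates for illustration.
u = candidates[30]
xi, _, sigma = stats.genpareto.fit(rain[rain > u] - u, floc=0)
print(u, xi, sigma)
```
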
24

Kyselý, Jan, Jan Picek, and Romana Beranová. "Erratum to “Estimating extremes in climate change simulations using the peaks-over-threshold method with a non-stationary threshold” [Global and Planetary Change 72 (2010) 55-68]." Global and Planetary Change 88-89 (May 2012): 98–99. http://dx.doi.org/10.1016/j.gloplacha.2012.03.009.

25

El-Aroui, Mhamed-Ali, and Jean Diebolt. "On the use of the peaks over thresholds method for estimating out-of-sample quantiles." Computational Statistics & Data Analysis 39, no. 4 (2002): 453–75. http://dx.doi.org/10.1016/s0167-9473(01)00087-1.

26

Bačová-Mitková, Veronika, and Milan Onderka. "Analysis of Extreme Hydrological Events on the Danube Using the Peak Over Threshold Method." Journal of Hydrology and Hydromechanics 58, no. 2 (2010): 88–101. http://dx.doi.org/10.2478/v10098-010-0009-x.

Abstract:
The Peak Over Threshold (POT) method was used as an alternative technique to the traditional analysis of annual discharge maxima of the Danube River. The POT method was applied to a time series of daily discharge values covering a period of 60 years (1931-1990) at the following gauge stations: Achleiten, Kienstock, Wien, Bratislava and Nagymaros. The first part of the paper presents the use of the POT method and how it was applied to daily discharges. All mean daily discharges exceeding a defined threshold were considered in the POT analysis. Based on the POT wave independence criteria, the maximum daily discharge data were selected. Two theoretical distributions, log-normal (LN) and Log-Pearson III (LP3), were used to calculate the probability of exceeding annual maximum discharges. Performance of the POT method was compared to the theoretical distributions (LN, LP3). The influence of the data series length on the estimation of the N-year discharges by the POT method was also examined. Therefore, with regard to later regulations along the Danube channel bank, 40-, 20- and 10-year data series were chosen from the beginning of the 60-year period, and a second set of analysed data series was selected from the end of the 60-year period. Our results suggest that the POT method can provide adequate and comparable estimates of N-year discharges for more stations with short temporal coverage.
27

Luo, Y., D. Sui, H. Shi, Z. Zhou, and D. Wang. "Multivariate extreme value analysis of storm surges in SCS on peak over threshold method." Ocean Science Discussions 12, no. 6 (2015): 2783–805. http://dx.doi.org/10.5194/osd-12-2783-2015.

Abstract:
We use a novel statistical approach, MGPD, to analyze the joint probability distribution of storm surge events at two sites and present a warning method for storm surges at two adjacent positions in Beibu Gulf, using sufficiently long field data on surge levels at the two sites. The methodology also develops the procedure for applying MGPD, which includes the joint threshold and Monte Carlo simulation, to handle multivariate extreme value analysis. By comparing the simulation result with the analytic solution, it is shown that the relative error of the Monte Carlo simulation is less than 8.6 %. By running the MGPD model on long data records at Beihai and Dongfang, the simulated potential surge results can be employed in storm surge warnings for Beihai and joint extreme water level predictions for the two sites.
28

Bücher, Axel, Stanislav Volgushev, and Nan Zou. "On second order conditions in the multivariate block maxima and peak over threshold method." Journal of Multivariate Analysis 173 (September 2019): 604–19. http://dx.doi.org/10.1016/j.jmva.2019.04.011.

29

Liu, Xintian, Minghui Zhang, Haijie Wang, Jiao Luo, Jiachi Tong, and Xiaolan Wang. "Fatigue life analysis of automotive key parts based on improved peak‐over‐threshold method." Fatigue & Fracture of Engineering Materials & Structures 43, no. 8 (2020): 1824–36. http://dx.doi.org/10.1111/ffe.13235.

30

Kumar, Mukesh, Mohammed Sharif, and Sirajuddin Ahmed. "Flood estimation at Hathnikund Barrage, River Yamuna, India using the Peak-Over-Threshold method." ISH Journal of Hydraulic Engineering 26, no. 3 (2018): 291–300. http://dx.doi.org/10.1080/09715010.2018.1485119.

31

Naess, A., and P. H. Clausen. "The Impact of Data Accuracy on the POT Estimates of Long Return Period Design Values." Journal of Offshore Mechanics and Arctic Engineering 124, no. 1 (2001): 53–58. http://dx.doi.org/10.1115/1.1446077.

Abstract:
The main focus of this paper is an investigation of the effect which the accuracy of data representation may have on the results obtained by using standard peaks-over-threshold (POT) methods. It is shown that this effect may lead to a substantial shift in the resulting predictions of long return period extreme values for specific types of estimators combined with the POT method. A brief discussion of the implication for the estimation of confidence intervals on the various estimates will also be given.
32

Bee, Marco. "Dynamic value-at-risk models and the peaks-over-threshold method for market risk measurement: an empirical investigation during a financial crisis." Journal of Risk Model Validation 6, no. 2 (2012): 3–45. http://dx.doi.org/10.21314/jrmv.2012.087.

33

Durocher, Martin, Donald H. Burn, and Fahim Ashkar. "Comparison of Estimation Methods for a Nonstationary Index‐Flood Model in Flood Frequency Analysis Using Peaks Over Threshold." Water Resources Research 55, no. 11 (2019): 9398–416. http://dx.doi.org/10.1029/2019wr025305.

34

Wang, Jixin, Qingsong Liu, and Xinting Zhai. "A method for threshold selection in load extrapolation based on multiple criteria decision-making technology." International Journal of Modeling, Simulation, and Scientific Computing 08, no. 03 (2017): 1750019. http://dx.doi.org/10.1142/s1793962317500192.

Abstract:
In order to establish an accurate peak over threshold (POT) model for reasonable load extrapolation, a new threshold selection method based on multiple criteria decision making (MCDM) technology is proposed. Fitting test criteria are taken into consideration in the method. For each candidate threshold, the values of several fitting test criteria are integrated into a comprehensive evaluation value through the entropy method and MCDM technology. The threshold corresponding to the minimum comprehensive evaluation value is taken as the optimal threshold. A random simulation study is carried out to evaluate the performance of the method and to compare it with other methods from the literature. The proposed method is also applied to genuine load data. Both sets of results show that the proposed method can be seen as an additional method that complements existing threshold selection methods.
35

Necir, Abdelhakim, Abdelaziz Rassoul, and Djamel Meraghni. "POT-Based Estimation of the Renewal Function of Interoccurrence Times of Heavy-Tailed Risks." Journal of Probability and Statistics 2010 (2010): 1–14. http://dx.doi.org/10.1155/2010/965672.

Abstract:
Making use of the peaks over threshold (POT) estimation method, we propose a semiparametric estimator for the renewal function of interoccurrence times of heavy-tailed insurance claims with infinite variance. We prove that the proposed estimator is consistent and asymptotically normal, and we carry out a simulation study to compare its finite-sample behavior with respect to the nonparametric one. Our results provide actuaries with confidence bounds for the renewal function of dangerous risks.
36

Papaioannou, George, Silvia Kohnová, Tomáš Bacigál, Ján Szolgay, Kamila Hlavčová, and Athanasios Loukas. "Joint modelling of flood peaks and volumes: A copula application for the Danube River." Journal of Hydrology and Hydromechanics 64, no. 4 (2016): 382–92. http://dx.doi.org/10.1515/johh-2016-0049.

Abstract:
Flood frequency analysis is usually performed as a univariate analysis of flood peaks using a suitable theoretical probability distribution of the annual maximum flood peaks or peak over threshold values. However, other flood attributes, such as flood volume and duration, are necessary for the design of hydrotechnical projects, too. In this study, the suitability of various copula families for a bivariate analysis of peak discharges and flood volumes has been tested. Streamflow data from selected gauging stations along the whole Danube River have been used. Kendall’s rank correlation coefficient (tau) quantifies the dependence between flood peak discharge and flood volume settings. The methodology is applied to two different data samples: 1) annual maximum flood (AMF) peaks combined with annual maximum flow volumes of fixed durations at 5, 10, 15, 20, 25, 30 and 60 days, respectively (which can be regarded as a regime analysis of the dependence between the extremes of both variables in a given year), and 2) annual maximum flood (AMF) peaks with corresponding flood volumes (which is a typical choice for engineering studies). The bivariate modelling of the extracted peak discharge - flood volume couples is achieved with the use of the Ali-Mikhail-Haq (AMH), Clayton, Frank, Joe, Gumbel, Hüsler-Reiss, Galambos, Tawn, Normal, Plackett and FGM copula families. Scatterplots of the observed and simulated peak discharge - flood volume pairs and goodness-of-fit tests have been used to assess the overall applicability of the copulas as well as observing any changes in suitable models along the Danube River. The results indicate that for the second data sampling method, almost all of the considered Archimedean class copula families perform better than the other copula families selected for this study, and that for the first method, only the upper-tail-flat copulas excel (except for the AMH copula due to its inability to model stronger relationships).
37

Yang, Xia, Jing Zhang, and Wei-Xin Ren. "Threshold selection for extreme value estimation of vehicle load effect on bridges." International Journal of Distributed Sensor Networks 14, no. 2 (2018): 155014771875769. http://dx.doi.org/10.1177/1550147718757698.

Abstract:
In the design and condition assessment of bridges, the extreme vehicle load effects are necessary to be taken into consideration, which may occur during the service period of bridges. In order to obtain an accurate extrapolation of the extreme value based on limited duration, threshold selection is a critical step in the peak-over-threshold method. Overly high threshold results in little information to be used and excessively low threshold leads to large bias in parameters estimation of generalized Pareto distribution. To investigate this issue, 417 days of strain data acquired from the long-term structural health monitoring system of Taiping Lake Bridge in China are employed in this article. According to the tail distribution of the strain data induced by vehicle loads, four homothetic distributions are chosen as its parent distribution, from which lots of random samples are generated by the Monte Carlo method. For each parent distribution, the 100-yearly extreme values at different thresholds are estimated and compared with the theoretical value based on those samples. Then a simple and empirical threshold selection method is proposed and applied to estimate the weekly extreme strain due to vehicle loads on the Taiping Lake Bridge. Results show that the estimate on the basis of the threshold obtained by the proposed method is closer to the measured result than the commonly used methods. The proposed method can be an effective threshold selection tool for the extreme value estimation of vehicle load effect in future engineering practice.
38

Xu, Zhi Qiang. "Determination Method of a Proper Small-Load-Omitting Criterion Based on the Extreme Load." Advanced Materials Research 694-697 (May 2013): 278–83. http://dx.doi.org/10.4028/www.scientific.net/amr.694-697.278.

Abstract:
A crucial step to obtain a reliable fatigue life prediction is to determine a proper small-load threshold below which the high-frequency cycles at small loads or stresses, which cause little fatigue damage, are truncated from the original load time history. By taking both the peak over threshold theory and the endurance limit threshold into account, a proper small-load threshold is proposed in this paper. The optimal threshold should be high enough to remove the high-frequency small cycles and low enough to minimize the loss of fatigue damage which may be truncated by the empirical small-load-omitting threshold. Based on this proper threshold, the fatigue life prediction will be more reliable.
39

Chen, Yan-Qing, Sheng Zheng, Yan-Shan Xiao, Shu-Guang Zeng, Tuan-Hui Zhou, and Gang-Hua Lin. "Chinese Sunspot Drawings and Their Digitizations-(VI) Extreme Value Theory Applied to the Sunspot Number Series from the Purple Mountain Observatory." Atmosphere 12, no. 9 (2021): 1176. http://dx.doi.org/10.3390/atmos12091176.

Abstract:
Based on the daily sunspot number (SN) data (1954–2011) from the Purple Mountain Observatory, extreme value theory (EVT) is employed for the research of long-term solar activity. It is the first time that EVT has been applied to the Chinese SN. Two methods are used for the research of the extreme events with EVT. One method is the block maxima (BM) approach, which picks the maximum SN value of each block. The other is the peaks-over-threshold (POT) approach. After a declustering process, a threshold value (here it is 300) is set to pick the extreme values. Negative shape parameters are obtained by the two methods, respectively, indicating that there is an upper bound for the extreme SN value. Only one value of the N-year return level (RL) is estimated: N = 19 years. For N = 19 years, the RL values of SN obtained by the two methods are similar to each other; the RL values are found to be 420 for both the POT method and the BM method. Here, the trend of the 25th solar cycle is predicted to be stronger, indicating that the length of meridional forms of atmospheric circulation will be increased.
40

Higuera, Philip E., Daniel G. Gavin, Patrick J. Bartlein, and Douglas J. Hallett. "Peak detection in sediment - charcoal records: impacts of alternative data analysis methods on fire-history interpretations." International Journal of Wildland Fire 19, no. 8 (2010): 996. http://dx.doi.org/10.1071/wf09134.

Abstract:
Over the past several decades, high-resolution sediment–charcoal records have been increasingly used to reconstruct local fire history. Data analysis methods usually involve a decomposition that detrends a charcoal series and then applies a threshold value to isolate individual peaks, which are interpreted as fire episodes. Despite the proliferation of these studies, methods have evolved largely in the absence of a thorough statistical framework. We describe eight alternative decomposition models (four detrending methods used with two threshold-determination methods) and evaluate their sensitivity to a set of known parameters integrated into simulated charcoal records. Results indicate that the combination of a globally defined threshold with specific detrending methods can produce strongly biased results, depending on whether or not variance in a charcoal record is stationary through time. These biases are largely eliminated by using a locally defined threshold, which adapts to changes in variability throughout a charcoal record. Applying the alternative decomposition methods on three previously published charcoal records largely supports our conclusions from simulated records. We also present a minimum-count test for empirical records, which reduces the likelihood of false positives when charcoal counts are low. We conclude by discussing how to evaluate when peak detection methods are warranted with a given sediment–charcoal record.
41

Naess, A., O. Gaidai, and O. Karpa. "Estimation of Extreme Values by the Average Conditional Exceedance Rate Method." Journal of Probability and Statistics 2013 (2013): 1–15. http://dx.doi.org/10.1155/2013/797014.

Abstract:
This paper details a method for extreme value prediction on the basis of a sampled time series. The method is specifically designed to account for statistical dependence between the sampled data points in a precise manner. In fact, if properly used, the new method will provide statistical estimates of the exact extreme value distribution provided by the data in most cases of practical interest. It avoids the problem of having to decluster the data to ensure independence, which is a requisite component in the application of, for example, the standard peaks-over-threshold method. The proposed method also targets the use of subasymptotic data to improve prediction accuracy. The method will be demonstrated by application to both synthetic and real data. From a practical point of view, it seems to perform better than the POT and block extremes methods, and, with an appropriate modification, it is directly applicable to nonstationary time series.
42

Yang, R. X., C. Li, Y. J. Sun, et al. "Multi-peak pattern in Multi-gap RPC time-over-threshold distributions and an offline calibration method." Journal of Instrumentation 12, no. 01 (2017): C01012. http://dx.doi.org/10.1088/1748-0221/12/01/c01012.

43

Li, Hai Biao, and Xin Xia. "Improved Algorithm for Wavelet Threshold Denoising and Application in Crack Image Recognition." Advanced Materials Research 945-949 (June 2014): 1846–50. http://dx.doi.org/10.4028/www.scientific.net/amr.945-949.1846.

Abstract:
In crack image recognition, Donoho’s universal wavelet threshold de-noising method exhibits an "over-kill" phenomenon due to the lack of self-adaptability of the threshold value; hence the image may lose its edge details. To handle this problem, Donoho’s universal threshold and threshold function are improved and an adaptive determination method for the threshold coefficient is introduced in this paper. Experimental results show that the proposed method can effectively remove digital image noise and achieve better edge protection, a higher edge preservation index, better visual effects and a higher peak signal-to-noise ratio.
44

Filipova, Valeriya, Deborah Lawrence, and Thomas Skaugen. "A stochastic event-based approach for flood estimation in catchments with mixed rainfall and snowmelt flood regimes." Natural Hazards and Earth System Sciences 19, no. 1 (2019): 1–18. http://dx.doi.org/10.5194/nhess-19-1-2019.

Abstract:
The estimation of extreme floods is associated with high uncertainty, in part due to the limited length of streamflow records. Traditionally, statistical flood frequency analysis and an event-based model (PQRUT) using a single design storm have been applied in Norway. We here propose a stochastic PQRUT model, as an extension of the standard application of the event-based PQRUT model, by considering different combinations of initial conditions, rainfall and snowmelt, from which a distribution of flood peaks can be constructed. The stochastic PQRUT was applied for 20 small- and medium-sized catchments in Norway and the results give good fits to observed peak-over-threshold (POT) series. A sensitivity analysis of the method indicates (a) that the soil saturation level is less important than the rainfall input and the parameters of the PQRUT model for flood peaks with return periods higher than 100 years and (b) that excluding the snow routine can change the seasonality of the flood peaks. Estimates for the 100- and 1000-year return level based on the stochastic PQRUT model are compared with results for (a) statistical frequency analysis and (b) a standard implementation of the event-based PQRUT method. The differences in flood estimates between the stochastic PQRUT and the statistical flood frequency analysis are within 50 % in most catchments. However, the differences between the stochastic PQRUT and the standard implementation of the PQRUT model are much higher, especially in catchments with a snowmelt flood regime.
45

Alfieri, L., P. Burek, L. Feyen, and G. Forzieri. "Global warming increases the frequency of river floods in Europe." Hydrology and Earth System Sciences 19, no. 5 (2015): 2247–60. http://dx.doi.org/10.5194/hess-19-2247-2015.

Abstract:
EURO-CORDEX (Coordinated Downscaling Experiment over Europe), a new generation of downscaled climate projections, has become available for climate change impact studies in Europe. New opportunities arise in the investigation of potential effects of a warmer world on meteorological and hydrological extremes at regional scales. In this work, an ensemble of EURO-CORDEX RCP8.5 scenarios is used to drive a distributed hydrological model and assess the projected changes in flood hazard in Europe through the current century. Changes in magnitude and frequency of extreme streamflow events are investigated by statistical distribution fitting and peak over threshold analysis. A consistent method is proposed to evaluate the agreement of ensemble projections. Results indicate that the change in frequency of discharge extremes is likely to have a larger impact on the overall flood hazard as compared to the change in their magnitude. On average, in Europe, flood peaks with return periods above 100 years are projected to double in frequency within 3 decades.
46

MARINELLI, CARLO, STEFANO D'ADDONA, and SVETLOZAR T. RACHEV. "A COMPARISON OF SOME UNIVARIATE MODELS FOR VALUE-AT-RISK AND EXPECTED SHORTFALL." International Journal of Theoretical and Applied Finance 10, no. 06 (2007): 1043–75. http://dx.doi.org/10.1142/s0219024907004548.

Abstract:
We compare in a backtesting study the performance of univariate models for Value-at-Risk (VaR) and expected shortfall based on stable laws and on extreme value theory (EVT). Analyzing these different approaches, we test whether the sum–stability assumption or the max–stability assumption, that respectively imply α–stable laws and Generalized Extreme Value (GEV) distributions, is more suitable for risk management based on VaR and expected shortfall. Our numerical results indicate that α–stable models tend to outperform pure EVT-based methods (especially those obtained by the so-called block maxima method) in the estimation of Value-at-Risk, while a peaks-over-threshold method turns out to be preferable for the estimation of expected shortfall. We also find empirical evidence that some simple semiparametric EVT-based methods perform well in the estimation of VaR.
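
The POT-based expected shortfall the authors find preferable is normally derived from the same GPD tail fit as the VaR. The sketch below applies the standard formulas to synthetic heavy-tailed losses; the data, threshold and confidence level are illustrative, the shape parameter is assumed to be below one, and none of the paper's specific models or backtests are reproduced.

```python
# Sketch: POT/GPD estimates of VaR and expected shortfall on synthetic losses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
losses = 0.012 * stats.t.rvs(df=3, size=3000, random_state=rng)   # heavy-tailed stand-in losses

u = np.quantile(losses, 0.95)
exc = losses[losses > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
zeta = exc.size / losses.size                  # empirical exceedance probability

def var_es(p):
    """VaR and expected shortfall at confidence level p (valid for xi < 1)."""
    var = u + (sigma / xi) * (((1 - p) / zeta) ** (-xi) - 1)
    es = var / (1 - xi) + (sigma - xi * u) / (1 - xi)
    return var, es

print(var_es(0.99))
```
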
47

Hirschberg, Jacob, Alexandre Badoux, Brian W. McArdell, Elena Leonarduzzi, and Peter Molnar. "Evaluating methods for debris-flow prediction based on rainfall in an Alpine catchment." Natural Hazards and Earth System Sciences 21, no. 9 (2021): 2773–89. http://dx.doi.org/10.5194/nhess-21-2773-2021.

Abstract:
The prediction of debris flows is relevant because this type of natural hazard can pose a threat to humans and infrastructure. Debris-flow (and landslide) early warning systems often rely on rainfall intensity–duration (ID) thresholds. Multiple competing methods exist for the determination of such ID thresholds but have not been objectively and thoroughly compared at multiple scales, and a validation and uncertainty assessment is often missing in their formulation. As a consequence, updating, interpreting, generalizing and comparing rainfall thresholds is challenging. Using a 17-year record of rainfall and 67 debris flows in a Swiss Alpine catchment (Illgraben), we determined ID thresholds and associated uncertainties as a function of record duration. Furthermore, we compared two methods for rainfall definition based on linear regression and/or true-skill-statistic maximization. The main difference between these approaches and the well-known frequentist method is that non-triggering rainfall events were also considered for obtaining ID-threshold parameters. Depending on the method applied, the ID-threshold parameters and their uncertainties differed significantly. We found that 25 debris flows are sufficient to constrain uncertainties in ID-threshold parameters to ±30 % for our study site. We further demonstrated the change in predictive performance of the two methods if a regional landslide data set with a regional rainfall product was used instead of a local one with local rainfall measurements. Hence, an important finding is that the ideal method for ID-threshold determination depends on the available landslide and rainfall data sets. Furthermore, for the local data set we tested if the ID-threshold performance can be increased by considering other rainfall properties (e.g. antecedent rainfall, maximum intensity) in a multivariate statistical learning algorithm based on decision trees (random forest). The highest predictive power was reached when the peak 30 min rainfall intensity was added to the ID variables, while no improvement was achieved by considering antecedent rainfall for debris-flow predictions in Illgraben. Although the increase in predictive performance with the random forest model over the classical ID threshold was small, such a framework could be valuable for future studies if more predictors are available from measured or modelled data.
48

Zhang, Xujie, Martijn J. Booij, and Yue-Ping Xu. "Improved Simulation of Peak Flows under Climate Change: Postprocessing or Composite Objective Calibration?" Journal of Hydrometeorology 16, no. 5 (2015): 2187–208. http://dx.doi.org/10.1175/jhm-d-14-0218.1.

Abstract:
Climate change is expected to have large impacts on peak flows. However, there may be bias in the simulation of peak flows by hydrological models. This study aims to improve the simulation of peak flows under climate change in Lanjiang catchment, east China, by comparing two approaches: postprocessing of peak flows and composite objective calibration. Two hydrological models [Soil and Water Assessment Tool (SWAT) and modèle du Génie Rural à 4 paramètres Journalier (GR4J)] are employed to simulate the daily flows, and the peaks-over-threshold method is used to extract peak flows from the simulated daily flows. Three postprocessing methods, namely, the quantile mapping method and two generalized linear models, are set up to correct the biases in the simulated raw peak flows. A composite objective calibration of the GR4J model by taking the peak flows into account in the calibration process is also carried out. The regional climate model Providing Regional Climates for Impacts Studies (PRECIS) with boundary forcing from two GCMs (HadCM3 and ECHAM5) under greenhouse gas emission scenario A1B is applied to produce the climate data for the baseline period and the future period 2011–40. The results show that the postprocessing methods, particularly the quantile mapping method, can correct the biases in the raw peak flows effectively. The composite objective calibration also resulted in a good simulation performance of peak flows. The final estimated peak flows in the future period show an obvious increase compared with those in the baseline period, indicating there will probably be more frequent floods in Lanjiang catchment in the future.
49

Fukutome, S., M. A. Liniger, and M. Süveges. "Automatic threshold and run parameter selection: a climatology for extreme hourly precipitation in Switzerland." Theoretical and Applied Climatology 120, no. 3-4 (2014): 403–16. http://dx.doi.org/10.1007/s00704-014-1180-5.

Abstract:
Extreme value analyses of a large number of relatively short time series are in increasing demand in environmental sciences and design. Here, we present an automated procedure for the peaks-over-threshold (POT) approach to extreme value theory and use it to provide a climatology of extreme hourly precipitation in Switzerland. The POT approach fits the generalized Pareto distribution (GPD) to independent exceedances above some high threshold. To guarantee independence, the time series is pruned: exceedances separated by less than a fixed interval called the run parameter are considered a cluster, and all but the cluster maxima are discarded. We propose the automation of an existing graphical method for joint selection of threshold and run parameter. Hourly precipitation is analyzed at 59 stations of the MeteoSwiss observational network over the period 1981–2010. The four seasons are considered separately. When necessary, a simple detrending is applied. Results suggest that unnecessarily large run parameters have adverse effects on the estimation of the GPD parameters. The proposed method yields mean cluster sizes that reflect the seasonal and geographical variation of lag dependence of hourly precipitation. The climatology, as represented by the return level maps and Alpine cross-section, mirror known aspects of the Swiss climate. Unlike for daily precipitation, summer thunderstorm tracks are visible in the seasonal frequency of events, rather than in the amplitude of rare events.
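
The runs declustering described in this abstract (exceedances closer together than the run parameter form one cluster, and only cluster maxima are kept) can be written in a few lines. The sketch below applies it to a synthetic hourly precipitation series; the data, threshold quantile and run length are assumptions, and the paper's automated selection of these two tuning parameters is not reproduced.

```python
# Runs declustering of threshold exceedances (sketch on synthetic hourly data).
import numpy as np

def decluster(series, threshold, run):
    """Indices of cluster maxima: exceedances separated by fewer than `run`
    time steps are treated as one cluster."""
    idx = np.flatnonzero(series > threshold)
    if idx.size == 0:
        return idx
    breaks = np.flatnonzero(np.diff(idx) >= run) + 1   # a gap of at least `run` starts a new cluster
    clusters = np.split(idx, breaks)
    return np.array([c[np.argmax(series[c])] for c in clusters])

rng = np.random.default_rng(6)
precip = rng.gamma(shape=0.3, scale=4.0, size=24 * 365 * 30)   # stand-in hourly precipitation
u = np.quantile(precip, 0.995)
peaks = decluster(precip, threshold=u, run=24)
print(peaks.size, (precip[peaks] - u).mean())                  # cluster count and mean excess
```
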
50

Outten, S. D., and I. Esau. "Extreme winds over Europe in the ENSEMBLES regional climate models." Atmospheric Chemistry and Physics Discussions 13, no. 1 (2013): 1179–99. http://dx.doi.org/10.5194/acpd-13-1179-2013.

Abstract:
Extreme winds cause vast amounts of damage every year and represent a major concern for numerous industries including construction, afforestation, wind energy and many others. Under a changing climate, the intensity and frequency of extreme events are expected to change, and accurate predictions of these changes will be invaluable to decision makers and society as a whole. This work examines four regional climate model downscalings over Europe from the "ENSEMBLE-based Predictions of Climate Changes and their Impacts" project (ENSEMBLES), and investigates the predicted changes in the 50 yr return wind speeds and the associated uncertainties. This is accomplished by employing the peaks-over-threshold method with the use of the Generalised Pareto Distribution. The models show that for much of Europe the 50 yr return wind is projected to change by less than 2 m s⁻¹, while the uncertainties associated with the statistical estimates are larger than this. In keeping with previous works in this field, the largest source of uncertainty is found to be the inter-model spread, with some locations showing differences in the 50 yr return wind of over 20 m s⁻¹ between two different downscalings.
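
A 50 yr return wind in such a POT/GPD analysis comes from the generic return-level formula sketched below on a synthetic record; the data, threshold quantile and record length are assumptions, and declustering and the uncertainty estimation the study discusses are omitted.

```python
# Sketch: N-year return level from a GPD fitted to threshold exceedances (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_years = 20
wind = stats.weibull_min.rvs(c=2.0, scale=9.0, size=24 * 365 * n_years, random_state=rng)  # m/s

u = np.quantile(wind, 0.999)
exc = wind[wind > u] - u                     # declustering omitted for brevity
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
lam = exc.size / n_years                     # exceedances per year

def return_level(N):
    """Level exceeded on average once every N years (uses the Gumbel limit when xi ~ 0)."""
    if abs(xi) < 1e-6:
        return u + sigma * np.log(lam * N)
    return u + (sigma / xi) * ((lam * N) ** xi - 1)

print(return_level(50))
```
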