
Journal articles on the topic 'Risk-neutral probability density function'


Consult the top 50 journal articles for your research on the topic 'Risk-neutral probability density function.'


You can also download the full text of each publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Rompolis, Leonidas S., and Elias Tzavalis. "Recovering Risk Neutral Densities from Option Prices: A New Approach." Journal of Financial and Quantitative Analysis 43, no. 4 (December 2008): 1037–53. http://dx.doi.org/10.1017/s0022109000014435.

Abstract:
In this paper we present a new method of approximating the risk neutral density (RND) from option prices based on the C-type Gram-Charlier series expansion (GCSE) of a probability density function. The exponential form of this type of GCSE guarantees that it always yields positive values of the risk neutral probabilities, and it allows for stronger deviations from normality, thereby avoiding two drawbacks of the A-type GCSE used in practice. To evaluate the performance of the suggested expansion of the RND, the paper presents simulation and empirical evidence.
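To illustrate the positivity property that motivates the C-type expansion, here is a minimal sketch (with illustrative coefficients, not the authors' estimator): the density is the exponential of a polynomial, so it is positive by construction, whereas an A-type expansion (a polynomial times a Gaussian) can dip below zero.

```python
import numpy as np

# C-type (exponential) series sketch: q(x) ~ exp(sum_k c_k x^k),
# positive by construction. Coefficients are illustrative, not calibrated.
x = np.linspace(-6.0, 6.0, 2001)
c = [0.0, -0.1, -0.5, 0.02, -0.01]          # hypothetical expansion coefficients
q = np.exp(sum(ck * x**k for k, ck in enumerate(c)))
q /= q.sum() * (x[1] - x[0])                # normalise to a proper density
assert (q > 0).all()                        # positivity holds automatically
```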
2

Malhotra, Gifty, R. Srivastava, and H. C. Taneja. "Calibration of the risk-neutral density function by maximization of a two-parameter entropy." Physica A: Statistical Mechanics and its Applications 513 (January 2019): 45–54. http://dx.doi.org/10.1016/j.physa.2018.08.148.

3

Campioni, Luca, and Paolo Vestrucci. "On system failure probability density function." Reliability Engineering & System Safety 92, no. 10 (October 2007): 1321–27. http://dx.doi.org/10.1016/j.ress.2006.09.002.

4

Sinha, Sonalika, and Bandi Kamaiah. "Estimating Option-implied Risk Aversion for Indian Markets." IIM Kozhikode Society & Management Review 6, no. 1 (January 2017): 90–97. http://dx.doi.org/10.1177/2277975216677600.

Abstract:
What do nearly 1.5 lakh (150,000) observations of options data say about risk preferences of Indian investors? This paper explores a nonparametric technique to compute probability density functions (PDFs) directly from NIFTY 50 option prices in India, based on the utility preferences of the representative investor. Use of probability density functions to estimate investor expectations of the distribution of future levels of the underlying assets has gained tremendous popularity over the last decade. Studying option prices provides information about the market participants’ probability assessment of the future outcome of the underlying asset. We compare the forecast ability of the risk-neutral PDF and risk-adjusted density functions to arrive at a unique index of relative risk aversion for Indian markets. Results indicate that risk-adjusted PDFs are reasonably better forecasts of investor expectations of future levels of the underlying assets. We find that Indian investors are not neutral to risk, contrary to the theoretical assumption of risk-neutrality among investors. The computed time-series of relative risk aversion overcomes the limitations of the VIX (implied volatility index) to yield a more reliable index, particularly useful for the Indian markets. Validity of the computed index is established by comparing it with existing measures of risk, and the relationships are found to be consistent with market expectations.
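The standard identity behind such nonparametric extraction is Breeden-Litzenberger: the risk-neutral density is the discounted second strike-derivative of the call price. Below is a self-contained sketch on synthetic Black-Scholes prices, not the paper's NIFTY 50 data or its utility-based adjustment.

```python
import numpy as np
from scipy.stats import norm

# q(K) = exp(rT) * d2C/dK2, evaluated by finite differences on a strike grid.
S0, r, T, sigma = 100.0, 0.05, 0.5, 0.2     # illustrative market parameters
K = np.linspace(60.0, 160.0, 401)
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
C = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sigma * np.sqrt(T))

dK = K[1] - K[0]
q = np.exp(r * T) * np.gradient(np.gradient(C, dK), dK)
print(q.sum() * dK)                         # ~1 when the strike grid is wide enough
```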
5

Cheng, Kevin C. "A New Framework to Estimate the Risk-Neutral Probability Density Functions Embedded in Options Prices." IMF Working Papers 10, no. 181 (2010): 1. http://dx.doi.org/10.5089/9781455202157.001.

6

Monteiro, Ana Margarida, Reha H. Tütüncü, and Luís N. Vicente. "Recovering risk-neutral probability density functions from options prices using cubic splines and ensuring nonnegativity." European Journal of Operational Research 187, no. 2 (June 2008): 525–42. http://dx.doi.org/10.1016/j.ejor.2007.02.041.

7

Arnerić, Josip, and Maria Čuljak. "Predictive accuracy of option pricing models considering high-frequency data." Ekonomski vjesnik 34, no. 1 (2021): 131–44. http://dx.doi.org/10.51680/ev.34.1.10.

Abstract:
Purpose: Recently, considerable attention has been given to forecasting not only the mean and the variance, but also the entire probability density function (pdf) of the underlying asset. These forecasts can be obtained as implied moments of the future distribution originating from European call and put options. However, the predictive accuracy of option pricing models is not so well established. With this in mind, this research aims to identify the model that predicts the entire pdf most accurately when compared to the ex-post “true” density given by high-frequency data at the expiration date. Methodology: The methodological part includes two steps. In the first step, several probability density functions are estimated using different option pricing models, considering the values of major market indices with different maturities. These implied probability density functions are risk neutral. In the second step, the implied pdfs are compared against the “true” density obtained from the high-frequency data to examine which one gives the best fit out-of-sample. Results: The results support the idea that a “true” density function, although unknown, can be estimated by employing the kernel estimator within high-frequency data and adjusted for risk preferences. Conclusion: The main conclusion is that the Shimko model outperforms the Mixture Log-Normal model as well as the Edgeworth expansion model in terms of out-of-sample forecasting accuracy. This study contributes to the existing body of research by: i) establishing the benchmark of the “true” density function using high-frequency data, ii) determining the predictive accuracy of the option pricing models and iii) providing applicative results both for market participants and public authorities.
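The benchmarking step can be sketched as follows, assuming simulated stand-in data rather than the paper's high-frequency index observations: a kernel estimator gives the "true" density, against which an implied pdf is scored out-of-sample.

```python
import numpy as np
from scipy.stats import gaussian_kde, lognorm

rng = np.random.default_rng(0)
levels = rng.lognormal(mean=7.0, sigma=0.02, size=5000)      # stand-in HF sample
grid = np.linspace(levels.min(), levels.max(), 500)
true_pdf = gaussian_kde(levels)(grid)                        # kernel benchmark

implied_pdf = lognorm(s=0.025, scale=np.exp(7.0)).pdf(grid)  # stand-in model pdf
print(np.sqrt(np.mean((implied_pdf - true_pdf) ** 2)))       # RMSE score
```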
8

WENZEL, A., and M. BALDAUF. "On the one-point probability density function for the wind velocity in the neutral atmospheric surface layer." Journal of Fluid Mechanics 366 (July 10, 1998): 351–65. http://dx.doi.org/10.1017/s0022112098001487.

Abstract:
The differential equation describing the one-point joint probability density function for the wind velocity given by Lundgren (1967) in neutral turbulent flows is extended by a term which also takes into consideration the pressure–mean strain interaction. For the new equation a solution is given describing the one-point probability density function for the wind velocity fluctuations if the profile of the mean wind velocity is logarithmic. The properties of this solution are discussed to identify the differences from a Gaussian having the same first and second moments.
9

Zhou, Chengning, Ning-Cong Xiao, Ming J. Zuo, and Xiaoxu Huang. "AK-PDF: An active learning method combining kriging and probability density function for efficient reliability analysis." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 234, no. 3 (November 29, 2019): 536–49. http://dx.doi.org/10.1177/1748006x19888421.

Abstract:
An important challenge in structural reliability is to reduce the number of calls to evaluate the performance function, especially for complex implicit performance functions. To reduce the computational burden and improve the reliability analysis efficiency, a new active learning method is developed that considers the probability density function of samples based on the learning function U in an active learning reliability method combining kriging and Monte Carlo simulation. In the proposed method, the active learning function contains two parts: part A is based on function U, and part B is based on the probability density function and function U. By changing the weights of parts A and B, the sample points close to the limit-state function and those in regions with a higher probability density have more weight to be selected compared to the others. Subsequently, the kriging model can be constructed more effectively. The proposed method avoids a large number of time-consuming function evaluations, and the recommended weight is also reported. The performance of the proposed method is evaluated through three numerical examples and one engineering example. The results demonstrate the efficiency and accuracy of the proposed method.
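For context, the classical U learning function from AK-MCS-type methods scores a candidate by U(x) = |μ(x)|/σ(x), the kriging mean over the kriging standard deviation; small U means the sign of the performance function is uncertain. The pdf-weighted blend below is only an illustrative guess at the two-part idea described in the abstract, not the paper's formula.

```python
import numpy as np

def next_training_point(mu, sigma, pdf, w=0.5):
    """Pick the next sample: low U (uncertain sign) plus high sample density win."""
    U = np.abs(mu) / np.maximum(sigma, 1e-12)        # classical learning function
    score = w * U + (1 - w) * U / np.maximum(pdf / pdf.max(), 1e-12)
    return int(np.argmin(score))

mu = np.array([0.80, -0.05, 0.30])                   # kriging means of g(x)
sigma = np.array([0.20, 0.20, 0.05])                 # kriging std devs
pdf = np.array([0.10, 0.30, 0.05])                   # sample density at candidates
print(next_training_point(mu, sigma, pdf))           # -> index 1 here
```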
10

Cortesi, A. B., B. L. Smith, B. Sigg, and S. Banerjee. "Numerical investigation of the scalar probability density function distribution in neutral and stably stratified mixing layers." Physics of Fluids 13, no. 4 (April 2001): 927–50. http://dx.doi.org/10.1063/1.1352622.

11

Knecht, William R. "Developing a Probabilistic Metric of Midair Collision Risk." Transportation Research Record: Journal of the Transportation Research Board 1567, no. 1 (January 1997): 26–32. http://dx.doi.org/10.3141/1567-04.

Abstract:
The evaluation of single-aircraft-pair midair collision risk in the national airspace system is addressed. A nonlinear mathematical function is proposed to model the most significant feature of this risk. The function estimates probability of collision as a function of time to contact. Previous attempts to model collision risk have met with difficulty when faced with the challenge of integrating multiple, uncertain probability density functions for the individual components of risk. This approach sidesteps this problem by finding a psychometric function for collision-avoidance performance. This function is evaluated logically and by curve fitting to flight simulator data from commercial airline pilots in simulated conflict scenarios. Analysis of these data gives preliminary support for the appropriateness of the approach. Properly parameterized, such a function or family of functions could be projected to estimate the probability of collision associated with a planned aircraft maneuver. This is similar to the idea of the “conflict probe” currently being explored by the aviation community in its effort to institute free flight.
12

Ostroumov, Ivan, Karen Marais, Nataliia Kuzmenko, and Nicoletta Fala. "TRIPLE PROBABILITY DENSITY DISTRIBUTION MODEL IN THE TASK OF AVIATION RISK ASSESSMENT." Aviation 24, no. 2 (July 8, 2020): 57–65. http://dx.doi.org/10.3846/aviation.2020.12544.

Abstract:
The probability of an airplane deviating from its pre-planned trajectory is a core element of aviation safety analysis. We propose to use a mixture of three probability density distribution functions in the task of aviation risk assessment. The proposed model takes into account the effects of navigation system error, flight technical error, and the occurrence of rare events. The univariate Generalized Error Distribution is used as the basic component of the distribution functions, allowing the error model to range from the normal distribution to the double exponential distribution. Statistical fitting of the training sample by the proposed Triple Univariate Generalized Error Distribution (TUGED) is performed by the maximum likelihood method. The optimal set of parameters is estimated by a sequential approximation method with a defined level of accuracy. The developed density model has been used in the risk assessment of airplane lateral deviation from the runway centreline during take-off and landing phases of flight. The efficiency of the developed model is confirmed by the Chi-square test and the Akaike and Bayes information criteria. The results of TUGED fitting indicate better performance in comparison with a double probability density distribution model. The risk of an airplane veering off the runway is considered as the probability of a rare event occurrence and is estimated as an area under the TUGED.
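A sketch of the building block, assuming illustrative weights and parameters rather than the fitted TUGED values: the univariate GED density with scale α and shape β (β = 2 is Gaussian-like, β = 1 gives the double exponential), mixed over three components.

```python
import numpy as np
from scipy.special import gamma

# GED density: f(x) = beta / (2 alpha Gamma(1/beta)) * exp(-(|x - mu| / alpha)^beta)
def ged_pdf(x, mu, alpha, beta):
    return beta / (2 * alpha * gamma(1 / beta)) * np.exp(-(np.abs(x - mu) / alpha) ** beta)

def mixture_pdf(x, components, weights):
    return sum(w * ged_pdf(x, *p) for w, p in zip(weights, components))

x = np.linspace(-3, 3, 7)
print(mixture_pdf(x, components=[(0.0, 0.3, 2.0),   # illustrative narrow component
                                 (0.0, 0.8, 1.0),   # illustrative heavier-tailed component
                                 (0.0, 2.0, 1.0)],  # illustrative rare-event component
                  weights=[0.70, 0.25, 0.05]))
```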
13

Guo, Runxia, Zhile Wei, and Ye Wei. "State estimation for the electro-hydraulic actuator based on particle filter with an improved resampling technique." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 234, no. 1 (August 29, 2019): 41–51. http://dx.doi.org/10.1177/1748006x19871753.

Abstract:
State estimation for the electro-hydraulic actuator of civil aircraft is one of the most valuable but intractable issues. Recently, state estimation approaches based on particle filters have attracted wide attention. We pursue the benefits of the data-driven approach when the physical model is deficient, and put forward some improvements that are triggered by the shortcomings of the particle filter algorithm. In order to mitigate the particle degeneracy phenomenon in particle filters, a kernel function that integrates the information of the probability distribution is constructed; then, the established probability kernel function is designed to represent the probability density function of resampling, and the regularization form of the probability density function in Hilbert space is defined. Consequently, the probability density function of resampling is obtained by solving a support vector regression model. The novel resampling method based on support vector regression particle filters can maintain the diversity of particles as well as relieve the degeneracy phenomenon, and eventually makes the estimated state more realistic. The approach is simulated and applied to an electro-hydraulic actuator model. The estimation results validate the effectiveness of the proposed algorithm.
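For reference, here is the resampling step whose loss of particle diversity motivates the authors' SVR-based variant; this is the standard systematic resampling algorithm, not their regularised version.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Return indices of particles that survive systematic resampling."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n      # one stratified sweep
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(4)
w = rng.random(10)
w /= w.sum()                                           # normalised particle weights
print(systematic_resample(w, rng))
```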
14

Katul, Gabriel G. "A model for sensible heat flux probability density function for near-neutral and slightly-stable atmospheric flows." Boundary-Layer Meteorology 71, no. 1-2 (October 1994): 1–20. http://dx.doi.org/10.1007/bf00709217.

15

Gourieroux, Christian, and Joann Jasiak. "Local Likelihood Density Estimation and Value-at-Risk." Journal of Probability and Statistics 2010 (2010): 1–26. http://dx.doi.org/10.1155/2010/754851.

Abstract:
This paper presents a new nonparametric method for computing the conditional Value-at-Risk, based on a local approximation of the conditional density function in a neighborhood of a predetermined extreme value for univariate and multivariate series of portfolio returns. For illustration, the method is applied to intraday VaR estimation on portfolios of two stocks traded on the Toronto Stock Exchange. The performance of the new VaR computation method is compared to the historical simulation, variance-covariance, and J. P. Morgan methods.
16

Kriebel, David L., and Gina R. Henderson. "ASSESSING CURRENT AND FUTURE NUISANCE FLOOD FREQUENCY THROUGHOUT THE U.S. MID-ATLANTIC." Coastal Engineering Proceedings, no. 36 (December 30, 2018): 106. http://dx.doi.org/10.9753/icce.v36.risk.106.

Abstract:
Nuisance flooding, which causes public inconveniences such as frequent road closures, overwhelmed storm drains and compromised infrastructure (NOAA, 2017), has noticeably increased at multiple mid-Atlantic coastal locations in recent years. Multiple factors contribute to such flooding events; however, mean sea level rise (MSLR) is a primary driver, due to its effect on increasing the exceedance probability of a given storm leading to flooding. Preliminary results show a tendency for a weakly non-Gaussian distribution of the extreme water level probability density function at multiple gage locations, which suggests that dimensionless extreme water level peaks (relative to the mean) will also be related between locations. Implications for both current and future nuisance flood frequency based on these distributions will be discussed.
17

Schmid, W., S. Mecklenburg, and J. Joss. "Short-term risk forecasts of heavy rainfall." Water Science and Technology 45, no. 2 (January 1, 2002): 121–25. http://dx.doi.org/10.2166/wst.2002.0036.

Abstract:
Methodologies for risk forecasts of severe weather hardly exist on the scale of nowcasting (0–3 hours). Here we discuss short-term risk forecasts of heavy precipitation associated with local thunderstorms. We use COTREC/RainCast: a procedure to extrapolate radar images into the near future. An error density function is defined using the estimated error of location of the extrapolated radar patterns. The radar forecast is folded (“smeared”) with the density function, leading to a probability distribution of radar intensities. An algorithm to convert the radar intensities into values of precipitation intensity provides the desired probability (or risk) of heavy rainfall at any position within the considered window in space and time. We discuss, as an example, a flood event from summer 2000.
18

Yuanjiang, He, Li Xucheng, and John Zhang. "Some results of ruin probability for the classical risk process." Journal of Applied Mathematics and Decision Sciences 7, no. 3 (January 1, 2003): 133–46. http://dx.doi.org/10.1155/s1173912603000130.

Abstract:
The computation of ruin probability is an important problem in collective risk theory. It has applications in the fields of insurance, actuarial science, and economics. Many mathematical models have been introduced to simulate business activities, and ruin probability is studied based on these models. Two of these models are the classical risk model and the Cox model. In the classical model, the counting process is a Poisson process; in the Cox model, it is a Cox process. Thorin (1973) studied the ruin probability based on the classical model under the assumption that the random sequence follows the $\Gamma$ distribution with density function $f(x)=\frac{x^{1/\beta-1}}{\beta^{1/\beta}\Gamma(1/\beta)}e^{-x/\beta}$, $x>0$, where $\beta>1$. This paper studies the ruin probability of the classical model where the random sequence follows the $\Gamma$ distribution with density function $f(x)=\frac{\alpha^n}{\Gamma(n)}x^{n-1}e^{-\alpha x}$, $x>0$, where $\alpha>0$ and $n\ge 2$ is a positive integer. An intermediate general result is given and a complete solution is provided for $n=2$. Simulation studies for the case of $n=2$ are also provided.
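A quick Monte Carlo check of the quantity studied, under illustrative parameters: ruin probability in the classical model with Poisson arrivals and $\Gamma(n=2,\alpha)$ claims, using a finite horizon as a proxy for ultimate ruin.

```python
import numpy as np

rng = np.random.default_rng(1)
u0, c, lam, alpha, n = 10.0, 1.2, 1.0, 2.0, 2    # c > lam * n / alpha: positive loading
ruined, sims = 0, 5000
for _ in range(sims):
    t, u = 0.0, u0
    while t < 100.0:                              # finite-horizon proxy for ultimate ruin
        w = rng.exponential(1.0 / lam)            # inter-claim time
        t += w
        u += c * w - rng.gamma(n, 1.0 / alpha)    # premiums in, Gamma(n, alpha) claim out
        if u < 0.0:
            ruined += 1
            break
print(ruined / sims)                              # estimated ruin probability
```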
19

Zhang, Chunsheng, and Guojing Wang. "The joint density function of three characteristics on jump-diffusion risk process." Insurance: Mathematics and Economics 32, no. 3 (July 2003): 445–55. http://dx.doi.org/10.1016/s0167-6687(03)00133-1.

20

Du, Yijun, Chen Wang, and Yibing Du. "Inversion of option prices for implied risk-neutral probability density functions: general theory and its applications to the natural gas market." Quantitative Finance 12, no. 12 (December 2012): 1877–91. http://dx.doi.org/10.1080/14697688.2011.586355.

21

Ta, C. T. "A probability model for burst risk studies of water mains." Water Supply 2, no. 4 (September 1, 2002): 29–35. http://dx.doi.org/10.2166/ws.2002.0117.

Abstract:
A logistic model was used for burst risk studies. The model is applied to study burst risk of large (300 mm and greater) water mains in the London supply area. Many contributing factors were included and represented as geographical information system (GIS) layers. Detailed sampling techniques were discussed. Using the available data, the burst probability function was found to correlate with pipe number density, pipe diameter, soil corrosivity and number of buses.
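The logistic form can be sketched directly; the coefficients and factor scales below are illustrative stand-ins, not the fitted London values.

```python
import numpy as np

def burst_prob(pipe_density, diameter_mm, corrosivity, buses,
               b=(-4.0, 0.8, -0.002, 0.5, 0.03)):   # hypothetical coefficients
    """Logistic burst risk: P = 1 / (1 + exp(-z)), z linear in the GIS factors."""
    z = (b[0] + b[1] * pipe_density + b[2] * diameter_mm
         + b[3] * corrosivity + b[4] * buses)
    return 1.0 / (1.0 + np.exp(-z))

print(burst_prob(pipe_density=1.2, diameter_mm=400, corrosivity=2.0, buses=15))
```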
22

Owada, Keiho, Takashi Okada, Toshio Munesue, Miho Kuroda, Toru Fujioka, Yota Uno, Kaori Matsumoto, et al. "Quantitative facial expression analysis revealed the efficacy and time course of oxytocin in autism." Brain 142, no. 7 (May 16, 2019): 2127–36. http://dx.doi.org/10.1093/brain/awz126.

Abstract:
Discrepancies in efficacy between single-dose and repeated administration of oxytocin for autism spectrum disorder have led researchers to hypothesize that time-course changes in efficacy are induced by repeated administrations of the peptide hormone. However, repeatable, objective, and quantitative measurements of autism spectrum disorder’s core symptoms are lacking, making it difficult to examine potential time-course changes in efficacy. We tested this hypothesis using repeatable, objective, and quantitative measurement of the core symptoms of autism spectrum disorder. We examined videos recorded during semi-structured social interaction administered as the primary outcome in single-site exploratory (n = 18, crossover within-subjects design) and multisite confirmatory (n = 106, parallel-group design), double-blind, placebo-controlled 6-week trials of repeated intranasal administrations of oxytocin (48 IU/day) in adult males with autism spectrum disorder. The main outcomes were statistical representative values of objectively quantified facial expression intensity in a repeatable part of the Autism Diagnostic Observation Schedule: the maximum probability (i.e. mode) and the natural logarithm of mode on the probability density function of neutral facial expression and the natural logarithm of mode on the probability density function of happy expression. Our recent study revealed that increases in these indices characterize autistic facial expression, compared with neurotypical individuals. The current results revealed that oxytocin consistently and significantly decreased the increased natural logarithm of mode on the probability density function of neutral facial expression compared with placebo in exploratory (effect-size, −0.57; 95% CI, −1.27 to 0.13; P = 0.023) and confirmatory trials (−0.41; −0.62 to −0.20; P < 0.001). A significant interaction between time-course (at baseline, 2, 4, 6, and 8 weeks) and the efficacy of oxytocin on the natural logarithm of mode on the probability density function of neutral facial expression was found in the confirmatory trial (P < 0.001). Post hoc analyses revealed maximum efficacy at 2 weeks (P < 0.001, Cohen’s d = −0.78; 95% CI, −1.21 to −0.35) and deterioration of efficacy at 4 weeks (P = 0.042, Cohen’s d = −0.46; 95% CI, −0.90 to −0.01) and 6 weeks (P = 0.10, Cohen’s d = −0.35; 95% CI, −0.77 to 0.08), while efficacy was preserved at 2 weeks post-treatment (i.e. 8 weeks) (P < 0.001, Cohen’s d = −1.24; 95% CI, −1.71 to −0.78). Quantitative facial expression analyses successfully verified the positive effects of repeated oxytocin on autistic individuals’ facial expressions and demonstrated a time-course change in efficacy. The current findings support further development of an optimized regimen of oxytocin treatment.
23

Vallianatos, F. "A non-extensive approach to risk assessment." Natural Hazards and Earth System Sciences 9, no. 1 (February 19, 2009): 211–16. http://dx.doi.org/10.5194/nhess-9-211-2009.

Abstract:
We analytically estimate the risk function of natural hazards (earthquakes, rockfalls, forest fires, landslides) by means of a non-extensive approach which is based on implementing the Tsallis entropy for the estimation of the probability density function (PDF) and introducing a phenomenological exponential expression for the damage function. The result leads to a power law expression as a special case, and the b-value is given as a function of the non-extensive parameter q. A discussion of the risk function's dependence on the parameters of the hazard PDF and the damage function for various hazards is given.
24

Abedi, Mohammad, and Daniel Bartolomeo. "Entropic Dynamics of Stocks and European Options." Entropy 21, no. 8 (August 6, 2019): 765. http://dx.doi.org/10.3390/e21080765.

Abstract:
We develop an entropic framework to model the dynamics of stocks and European options. Entropic inference is an inductive inference framework equipped with proper tools to handle situations where incomplete information is available. The objective of the paper is to lay down an alternative framework for modeling dynamics. An important piece of information about the dynamics of a stock’s price is scale invariance. By imposing scale-invariant symmetry, we arrive at choosing the logarithm of the stock’s price as the proper variable to model. The dynamics of the stock log price is derived using two pieces of information: the continuity of motion and the directionality constraint. The resulting model is the same as the Geometric Brownian Motion (GBM) of the stock price, which is manifestly scale invariant. Furthermore, we obtain the dynamics of the probability density function, which is a Fokker–Planck equation. Next, we extend the model to value European options on a stock. Derivative securities ought to be priced such that there is no arbitrage. To ensure no-arbitrage pricing, we derive the risk-neutral measure by incorporating the risk-neutral information. Consequently, the Black–Scholes model and the Black–Scholes–Merton differential equation are derived.
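The endpoint of the derivation is the familiar risk-neutral valuation formula; for reference, here is the Black-Scholes call price that the entropic route recovers (inputs illustrative).

```python
import numpy as np
from scipy.stats import norm

def bs_call(S0, K, r, sigma, T):
    """Black-Scholes call: discounted risk-neutral expectation of the payoff."""
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(bs_call(S0=100.0, K=105.0, r=0.03, sigma=0.2, T=1.0))
```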
25

Ford, Stephen A., Beth Pride Ford, and Thomas H. Spreen. "Evaluation of Alternative Risk Specifications in Farm Programming Models." Agricultural and Resource Economics Review 24, no. 1 (April 1995): 25–35. http://dx.doi.org/10.1017/s1068280500003580.

Abstract:
The use of alternative probability density functions to specify risk in farm programming models is explored and compared to a traditional specification using historical data. A method is described that compares risk efficient crop mixes using stochastic dominance techniques to examine impacts of different risk specifications on farm plans. Results indicate that a traditional method using historical farm data is as efficient for risk averse producers as two other methods of incorporating risk in farm programming models when evaluated using second degree stochastic dominance. Stochastic dominance with respect to a function further discriminates among the distributions, indicating that a density function based on the historic forecasting accuracy of the futures market results in a more risk-efficient crop mix for highly risk averse producers. Results also illustrate the need to validate alternative risk specifications perceived as improvements to traditional methods.
26

CHELLATHURAI, THAMAYANTHI. "PROBABILITY DENSITY OF RECOVERY RATE GIVEN DEFAULT OF A FIRM’S DEBT AND ITS CONSTITUENT TRANCHES." International Journal of Theoretical and Applied Finance 20, no. 04 (April 27, 2017): 1750023. http://dx.doi.org/10.1142/s0219024917500236.

Abstract:
This paper derives the theoretical underpinnings behind the following observed empirical facts in credit risk modeling: The probability of default, the seniority, the thickness of the tranche, the debt cushion, and macroeconomic factors are the important determinants of the conditional probability density function of the recovery rate given default (RGD) of a firm’s debt and its tranches. In a portfolio of debt securities, the conditional probability density functions of the recovery rate given default of tranches have point probability masses near zero and one, and the expected value of the recovery rate given default increases as the seniority or debt cushion increases. The paper derives other results as well, such as the fact that the conditional probability distribution function associated with any senior tranche dominates that of any junior tranche by first-order. The standard deviation of the recovery rate given default of a senior security need not be greater than that of a junior security. It is proved that the expected value of the recovery rate given default need not increase as the proportional thickness of the tranche increases.
27

Basma, Adnan A. "Risk-reduction factor for bearing capacity of shallow foundations." Canadian Geotechnical Journal 31, no. 1 (February 1, 1994): 12–16. http://dx.doi.org/10.1139/t94-002.

Abstract:
In this paper an ultimate bearing capacity risk-reduction factor is proposed to account for the variation and randomness in soil properties. Through a first-order Taylor series expansion, the mean and variance of the ultimate bearing capacity were assessed. Consequently, the variation of the ultimate bearing capacity is expressed as a function of the variation in cohesion and internal friction angle. To develop a risk-reduction factor, several probability density functions were utilized. The asymptotic type II extreme-value distribution for maxima was found best suited to represent the ultimate bearing capacity. The results indicate that the risk-reduction factor FR decreases with an increase in the coefficient of variation of ultimate bearing capacity and a decrease in the selected probability of failure pf. For pf = 0.0001, however, FR was found to range between 0.20 and 1.0. A numerical example is presented to illustrate the use of the proposed reduction factor. Key words : bearing capacity, coefficient of variation, probability distribution, probability of failure, risk factor, shallow foundations.
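The first-order Taylor (FOSM) step can be sketched generically, assuming independent inputs and a stand-in capacity function rather than the full bearing capacity formula.

```python
import numpy as np

def fosm(f, means, stds, h=1e-5):
    """First-order second-moment propagation: mean and variance of f(X)."""
    means = np.asarray(means, dtype=float)
    mu_f, var_f = f(means), 0.0
    for i, s in enumerate(stds):
        e = np.zeros_like(means)
        e[i] = h
        dfdx = (f(means + e) - f(means - e)) / (2 * h)   # central difference
        var_f += (dfdx * s) ** 2                         # independence assumed
    return mu_f, var_f

q_ult = lambda x: 30.0 * x[0] + 18.0 * np.exp(0.1 * x[1])   # stand-in q_u(c, phi)
print(fosm(q_ult, means=[20.0, 30.0], stds=[4.0, 2.0]))
```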
28

Kim, Yuhee, Jong-chil Park, and Soobong Shin. "Development of a hybrid SHM of cable bridges based on the mixed probability density function." Journal of Civil Structural Health Monitoring 8, no. 4 (August 6, 2018): 569–83. http://dx.doi.org/10.1007/s13349-018-0298-z.

29

Dickson, David C. M. "An identity based on the generalised negative binomial distribution with applications in ruin theory." Annals of Actuarial Science 13, no. 2 (September 10, 2018): 308–19. http://dx.doi.org/10.1017/s1748499518000295.

Abstract:
In this study, we show how expressions for the probability of ultimate ruin can be obtained from the probability function of the time of ruin in a particular compound binomial risk model, and from the density of the time of ruin in a particular Sparre Andersen risk model. In each case evaluation of generalised binomial series is required, and the argument of each series has a common form. We evaluate these series by creating an identity based on the generalised negative binomial distribution. We also show how the same ideas apply to the probability function of the number of claims in a particular Sparre Andersen model.
30

Silva, Luís M., J. Marques de Sá, and Luís A. Alexandre. "The MEE Principle in Data Classification: A Perceptron-Based Analysis." Neural Computation 22, no. 10 (October 2010): 2698–728. http://dx.doi.org/10.1162/neco_a_00013.

Abstract:
This letter focuses on the issue of whether risk functionals derived from information-theoretic principles, such as Shannon or Rényi's entropies, are able to cope with the data classification problem in both the sense of attaining the risk functional minimum and implying the minimum probability of error allowed by the family of functions implemented by the classifier, here denoted by min Pe. The analysis of this so-called minimization of error entropy (MEE) principle is carried out in a single perceptron with continuous activation functions, yielding continuous error distributions. In spite of the fact that the analysis is restricted to single perceptrons, it reveals a large spectrum of behaviors that MEE can be expected to exhibit in both theory and practice. In what concerns the theoretical MEE, our study clarifies the role of the parameters controlling the perceptron activation function (of the squashing type) in often reaching the minimum probability of error. Our study also clarifies the role of the kernel density estimator of the error density in achieving the minimum probability of error in practice.
31

Zoubeidi, Toufik. "Asymptotic approximations to the Bayes posterior risk." Journal of Applied Mathematics and Stochastic Analysis 3, no. 2 (January 1, 1990): 99–116. http://dx.doi.org/10.1155/s1048953390000090.

Abstract:
Suppose that, given $\omega=(\omega_1,\omega_2)\in\mathbb{R}^2$, $X_1,X_2,\ldots$ and $Y_1,Y_2,\ldots$ are independent random variables and their respective distribution functions $G_{\omega_1}$ and $G_{\omega_2}$ belong to a one-parameter exponential family of distributions. We derive approximations to the posterior probabilities of $\omega$ lying in closed convex subsets of the parameter space under a general prior density. Using this, we then approximate the Bayes posterior risk for testing the hypotheses $H_0:\omega\in\Omega_1$ versus $H_1:\omega\in\Omega_2$ using a zero-one loss function, where $\Omega_1$ and $\Omega_2$ are disjoint closed convex subsets of the parameter space.
32

Villarroel-Lamb, Deborah Ann. "INVESTIGATING THE BEACH PREDICTIONS OF A NEW LONG-TERM NUMERICAL MORPHOLOGICAL MODEL – A CARIBBEAN CONTEXT." Coastal Engineering Proceedings 1, no. 33 (December 28, 2012): 127. http://dx.doi.org/10.9753/icce.v33.sediment.127.

Abstract:
A recently developed beach change model was investigated to assess its predictive capability with respect to shoreline change. This investigation formed part of a number of analyses being conducted to assess the capability of the numerical model. The model was firstly compared to a commonly used commercial model to assess its output on wave and sediment responses. Secondly, the beach changes were investigated to determine a likely probability density function for the shoreline responses. A number of probability density functions were compared with the results and critical deductions were made. Lastly, the new beach change model has a distinctive feature which attempts to reduce the model run-time to promote greater use. This wave-averaging feature was investigated to determine model performance as parameters were changed. It was shown that the model compares favorably to the commercial package in some aspects, but not all. The shoreline response may be best described by a single probability density function, which makes it quite suitable for quantitative risk analyses. Lastly, the wave-averaging feature can be used to reduce runtime although this requires the user to apply sound judgment in the analyses.
33

BORMETTI, GIACOMO, VALENTINA CAZZOLA, and DANILO DELPINI. "OPTION PRICING UNDER ORNSTEIN-UHLENBECK STOCHASTIC VOLATILITY: A LINEAR MODEL." International Journal of Theoretical and Applied Finance 13, no. 07 (November 2010): 1047–63. http://dx.doi.org/10.1142/s0219024910006108.

Abstract:
We consider the problem of option pricing under stochastic volatility models, focusing on the linear approximation of the two processes known as exponential Ornstein-Uhlenbeck and Stein-Stein. Indeed, we show they admit the same limit dynamics in the regime of low fluctuations of the volatility process, under which we derive the exact expression of the characteristic function associated to the risk neutral probability density. This expression allows us to compute option prices exploiting a formula derived by Lewis and Lipton. We analyze in detail the case of Plain Vanilla calls, being liquid instruments for which reliable implied volatility surfaces are available. We also compute the analytical expressions of the first four cumulants, that are crucial to implement a simple two steps calibration procedure. It has been tested against a data set of options traded on the Milan Stock Exchange. The data analysis that we present reveals a good fit with the market implied surfaces and corroborates the accuracy of the linear approximation.
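Given a closed-form characteristic function, option prices or the density itself follow by Fourier inversion. Here is a generic inversion sketch, using a plain Gaussian characteristic function as a stand-in for the model's expression.

```python
import numpy as np

mu, s = 0.0, 0.2
phi = lambda u: np.exp(1j * u * mu - 0.5 * (s * u) ** 2)   # stand-in char. function

x = np.linspace(-1.0, 1.0, 401)                  # log-return grid
u = np.linspace(-60.0, 60.0, 1201)               # integration grid
du = u[1] - u[0]
# q(x) = (1 / 2 pi) * integral of exp(-i u x) * phi(u) du
q = np.real(np.exp(-1j * np.outer(x, u)) @ phi(u)) * du / (2 * np.pi)
print(q.sum() * (x[1] - x[0]))                   # ~1: recovers a proper density
```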
34

Hale, Michael, and William Barber. "Evaluation of Vibration References with Equivalent Kurtosis and Dissimilar Amplitude Probability Densities." Journal of the IEST 56, no. 2 (October 1, 2013): 43–56. http://dx.doi.org/10.17764/jiet.56.2.c1j085562l837155.

Abstract:
The generation of Gaussian noise with a specific auto spectral density (ASD) is a well-documented process employed in drive signal generation in vibration control applications. In recent years, vibration control system vendors have introduced the ability to modify the probability density function (PDF) characteristics associated with the reference ASD, yielding a non-Gaussian drive. The specific parameter defining this process is kurtosis. This paper will discuss concerns with this practice in terms of synthesizing a time history with dissimilar PDF characteristics to that of the measured data upon which the original ASD and kurtosis characteristics were based. An example is discussed from both statistical and fatigue perspectives.
35

Kim, Kyu Rang, Mijin Kim, Ho-Seong Choe, Mae Ja Han, Hye-Rim Lee, Jae-Won Oh, and Baek-Jo Kim. "A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function." International Journal of Biometeorology 61, no. 2 (July 7, 2016): 259–72. http://dx.doi.org/10.1007/s00484-016-1208-x.

36

Hromadka, T. V., and R. J. Whitely. "Evaluating the effect of land development on sediment transport using a probability density function." Stochastic Hydrology and Hydraulics 7, no. 2 (June 1993): 102–8. http://dx.doi.org/10.1007/bf01581419.

37

Li, Shuanming, and José Garrido. "On a general class of renewal risk process: analysis of the Gerber-Shiu function." Advances in Applied Probability 37, no. 3 (September 2005): 836–56. http://dx.doi.org/10.1239/aap/1127483750.

Abstract:
We consider a compound renewal (Sparre Andersen) risk process with interclaim times that have a $K_n$ distribution (i.e. the Laplace transform of their density function is a ratio of two polynomials of degree at most $n\in\mathbb{N}$). The Laplace transform of the expected discounted penalty function at ruin is derived. This leads to a generalization of the defective renewal equations given by Willmot (1999) and Gerber and Shiu (2005). Finally, explicit results are given for rationally distributed claim severities.
38

Li, Shuanming, and José Garrido. "On a general class of renewal risk process: analysis of the Gerber-Shiu function." Advances in Applied Probability 37, no. 03 (September 2005): 836–56. http://dx.doi.org/10.1017/s0001867800000501.

Abstract:
We consider a compound renewal (Sparre Andersen) risk process with interclaim times that have a $K_n$ distribution (i.e. the Laplace transform of their density function is a ratio of two polynomials of degree at most $n\in\mathbb{N}$). The Laplace transform of the expected discounted penalty function at ruin is derived. This leads to a generalization of the defective renewal equations given by Willmot (1999) and Gerber and Shiu (2005). Finally, explicit results are given for rationally distributed claim severities.
39

Wang, Qun. "An improved method of estimating Bayes posterior probability density function in reliability data analysis." Microelectronics Reliability 29, no. 5 (January 1989): 757–60. http://dx.doi.org/10.1016/0026-2714(89)90173-x.

40

Oya, Masaru. "Relation between mechanism of soil removal from fabrics and a parameter derived from probability density functional method for washing force analysis." Textile Research Journal 89, no. 11 (August 4, 2018): 2236–46. http://dx.doi.org/10.1177/0040517518790978.

Abstract:
The probability density functional method is a washing force analysis method that combines the classical kinetic analysis of detergency with a risk calculation method using a probability density function. This paper discusses the relation between soil removal mechanisms and the value of $\sigma_{rl}$, which is one of the two parameters used in the probability density functional method. Four repetitive washing tests were conducted using test fabrics soiled with iron(III) oxide, carbon black, four kinds of water-soluble dyes and three kinds of oily dyes, and the removal (%) was analyzed with the probability density functional method. The results show that the range of $\sigma_{rl}$ varied with the removal mechanism: mechanical removal of particle soil (0.01–0.6), dissolution of water-soluble soil into water (0.3–1.4), solubilization of oily soil into surfactant micelles (1.0–2.0) and emulsification or dispersion of oily soil (≥3.0). This tendency can be used to estimate the removal mechanism of any washing system where the soil type is unknown.
41

Wang, Rui Qing, and Zi Qian Xiao. "GARCHSK Based Risk Assessment in Electric Power Industry." Applied Mechanics and Materials 345 (August 2013): 368–71. http://dx.doi.org/10.4028/www.scientific.net/amm.345.368.

Abstract:
The restructuring and deregulation of the electric power industry has heightened the importance of risk assessment. A model to estimate value-at-risk (VaR) via a GARCHSK specification is proposed, in which seasonality, heteroscedasticity, skewness, kurtosis and the relationship to system loads are jointly addressed. The impact of the probability distribution assumed for the innovations on the validation of value-at-risk estimates is analyzed for three distributions: the normal, the Student-t and the Gram-Charlier series expansion of the normal density function. The numerical example shows that the proposed model performs better in predicting one-period-ahead VaR.
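One common way to turn the conditional mean, variance, skewness and kurtosis from a GARCHSK-type model into a VaR number is the Cornish-Fisher quantile adjustment, a close relative of the Gram-Charlier density used in the paper; the inputs below are illustrative.

```python
from scipy.stats import norm

def cornish_fisher_var(mu, sigma, skew, kurt, p=0.01):
    """Skewness/kurtosis-adjusted VaR quantile (loss reported as positive)."""
    z = norm.ppf(p)
    z_cf = (z + (z**2 - 1) * skew / 6
              + (z**3 - 3 * z) * (kurt - 3) / 24
              - (2 * z**3 - 5 * z) * skew**2 / 36)
    return -(mu + sigma * z_cf)

print(cornish_fisher_var(mu=0.0, sigma=0.02, skew=-0.6, kurt=5.0))
```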
42

Zhang, Yong, Yan Zhao, Yunyun Lu, and Huajiang Ouyang. "Bayesian identification of bolted-joint parameters using measured power spectral density." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 234, no. 2 (November 29, 2019): 260–74. http://dx.doi.org/10.1177/1748006x19889146.

Abstract:
A Bayesian method for the optimal estimation of parameters that characterize a bolted joint based on measured power spectral density is proposed in this article. Due to uncertainties such as measurement noise and modelling errors, it is difficult to identify joint parameters of a bolted structure accurately with incomplete measured response data. In this article, using the Bayesian probability framework to describe the uncertainty of the joint parameters and using the power spectrum of the structural response of the single-point/multi-point excitation as measurements, the conditional probability density function of the joint parameters is established. Then, the Bayesian maximum posterior estimation is performed by an optimization method. Two simplified bolted-joint models are built in the numerical examples. First, the feasibility of the proposed method in the undamped model is proved. Then, taking advantage of multi-point excitation, the identification accuracy of the proposed method in the damped model is improved. The numerical results show that the proposed method can accurately identify the stiffness and damping characteristics of joint parameters with good robustness to noise. Finally, the joint parameters of the finite element model for an aero-engine casing are identified by the proposed method with satisfactory accuracy.
43

Chowdhury, Shovan, and Asok K. Nanda. "A new lifetime distribution with applications in inventory and insurance." International Journal of Quality & Reliability Management 35, no. 2 (February 5, 2018): 527–44. http://dx.doi.org/10.1108/ijqrm-12-2016-0227.

Abstract:
Purpose The purpose of this paper is to introduce a new probability density function having both unbounded and bounded support with a wider applicability. While the distribution with bounded support on [0, 1] has applications in insurance and inventory management, with the ability to fit risk management data on proportions better than existing bounded distributions, the same with unbounded support is used as a lifetime model and is considered an attractive alternative to some existing models in the reliability literature. Design/methodology/approach The new density function, called the modified exponential-geometric distribution, is derived from the exponential-geometric distribution introduced by Adamidis and Loukas (1998). The support of the density function is shown to be either unbounded or bounded depending on the values of one of the shape parameters. Various properties of the density function are studied in detail and the parameters are estimated through the maximum likelihood method. A number of applications related to reliability, insurance and inventory management are exhibited along with some useful data analysis. Findings A single probability distribution with both unbounded and bounded support, which does not seem to exist in the reliability literature, is introduced in this paper. The proposed density function exhibits varying shapes, including a U-shape, and the failure rate also shows increasing, decreasing and bathtub shapes. The Monte Carlo simulation shows that the estimates of the parameters are quite stable with low standard errors. The distribution with unbounded support is shown to have competitive features for lifetime modeling through analysis of two data sets. The distribution with bounded support on [0, 1] is shown to have applications in insurance and inventory management and is found to fit data on proportions related to risk management better than some existing bounded distributions. Originality/value The authors introduce an innovative probability distribution which contributes significantly to insurance and inventory management besides its remarkable statistical and reliability properties.
44

Sarabia, José María, Emilio Gómez-Déniz, Faustino Prieto, and Vanesa Jordá. "AGGREGATION OF DEPENDENT RISKS IN MIXTURES OF EXPONENTIAL DISTRIBUTIONS AND EXTENSIONS." ASTIN Bulletin 48, no. 3 (April 25, 2018): 1079–107. http://dx.doi.org/10.1017/asb.2018.13.

Abstract:
The distribution of the sum of dependent risks is a crucial aspect in actuarial sciences, risk management and many branches of applied probability. In this paper, we obtain analytic expressions for the probability density function (pdf) and the cumulative distribution function (cdf) of aggregated risks, modelled according to a mixture of exponential distributions. We first review the properties of the multivariate mixture of exponential distributions, and then obtain the analytical formulation of the pdf and the cdf of the aggregated distribution. We study in detail some specific families with Pareto (Sarabia et al., 2016), gamma, Weibull and inverse Gaussian (Whitmore and Lee, 1991) mixtures of exponentials as claims. We also discuss briefly the computation of risk measures, formulas for the ruin probability (Albrecher et al., 2011) and the collective risk model. An extension of the basic model based on mixtures of gamma distributions is proposed, which is one of the suggested directions for future research.
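The dependence mechanism can be simulated directly, under illustrative parameters: risks are exponential conditionally on a common gamma mixing variable (which yields Pareto-type margins), and the aggregate sum inherits the induced dependence.

```python
import numpy as np

rng = np.random.default_rng(2)
n_risks, n_sims = 5, 100_000
theta = rng.gamma(shape=3.0, scale=1.0, size=n_sims)          # common mixing variable
X = rng.exponential(1.0 / theta[:, None], size=(n_sims, n_risks))
S = X.sum(axis=1)                                             # aggregated risk
print(np.quantile(S, 0.995))                                  # tail quantile of the sum
```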
45

Willigen, Durk Van. "Eurofix." Journal of Navigation 42, no. 3 (September 1989): 375–81. http://dx.doi.org/10.1017/s0373463300014661.

Abstract:
Around 1992 Navstar/GPS will become fully operational. Public access is then provided by the Standard Positioning Service (SPS), at a reduced accuracy of 100 metres for 95 per cent of the fixes. The exclusively US military-operated system and the deliberately introduced degradation (SA, selective availability) of the attainable accuracy have some drawbacks for European navigation. Neither the probability density function of the error amplitude nor its power spectral density function is known. Thus, using Navstar/GPS as a sole-means precise and reliable navigational aid for high-risk transports is not possible.
46

Topolska, Katarzyna. "Determination of risk planning a route with the use of a probability model." AUTOBUSY – Technika, Eksploatacja, Systemy Transportowe 19, no. 6 (September 7, 2018): 266–70. http://dx.doi.org/10.24136/atest.2018.076.

Abstract:
This paper aims to familiarize readers with notions related to probabilistic methods used for planning routes in telematics systems. The classification task makes use of a model based on a probabilistic Bayes classifier and the probability density function. The first part of the paper describes problems with planning routes in contemporary telematics systems. The second part covers a theoretical basis of classifiers based on hard mathematical methods. If such a model is to make sense, it should account for the smaller kinds of risk related to a transport process. This paper presents a method of selecting optimal parameters in transport planning. The author draws attention to the variable reduction needed for planning, supported by a principal-component factor analysis with Varimax rotation normalized by Kaiser's method for quantitative features. The third part is devoted to the process of planning routes and the related risk.
47

Dawson, Richard, and Jim Hall. "Adaptive importance sampling for risk analysis of complex infrastructure systems." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 462, no. 2075 (May 25, 2006): 3343–62. http://dx.doi.org/10.1098/rspa.2006.1720.

Abstract:
Complex civil infrastructure systems are typically exposed to random loadings and have a large number of possible failure modes, which often exhibit spatially and temporally variable consequences. Monte Carlo (level III) reliability methods are attractive because of their flexibility and robustness, yet computational expense may be prohibitive, in which case variance reduction methods are required. In the importance sampling methodology presented here, the joint probability distribution of the loading variables is sampled according to the contribution that a given region in the joint space makes to risk, rather than according to probability of failure, which is the conventional importance sampling criterion in structural reliability analysis. Results from simulations are used to intermittently update the importance sampling density function based on the evaluations of the (initially unknown) risk function. The methodology is demonstrated on a propped cantilever beam system and then on a real coastal dike infrastructure system in the UK. The case study demonstrates that risk can be a complex function of loadings, the resistance and interactions of system components and the spatially variable damage associated with different modes of system failure. The methodology is applicable in general to Monte Carlo risk analysis of systems, but it is likely to be most beneficial where consequences of failure are a nonlinear function of load and where system simulation requires significant computational resources.
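A minimal one-dimensional sketch of the criterion, with stand-in densities and damage function (not the dike-system model, and without the adaptive updating): sample loads from a proposal centred on the damaging region and reweight by the density ratio.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
f = norm(0.0, 1.0)                          # true load distribution
g = norm(2.5, 1.0)                          # proposal shifted toward damaging loads
damage = lambda x: np.where(x > 2.0, (x - 2.0) ** 2, 0.0)

x = g.rvs(size=100_000, random_state=rng)
w = f.pdf(x) / g.pdf(x)                     # importance weights
print(np.mean(w * damage(x)))               # estimate of risk = E_f[damage(X)]
```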
48

Yoo, Yunja, and Tae-Goun Kim. "An Improved Ship Collision Risk Evaluation Method for Korea Maritime Safety Audit Considering Traffic Flow Characteristics." Journal of Marine Science and Engineering 7, no. 12 (December 7, 2019): 448. http://dx.doi.org/10.3390/jmse7120448.

Abstract:
Ship collision accidents account for the majority of marine accidents. The collision risk can be even greater in ports where the traffic density is high and terrain conditions are difficult. The proximity assessment model of the Korea Maritime Safety Audit (KMSA), which is a tool for improving maritime traffic safety, employs a normal distribution of ship traffic to calculate the ship collision risk. However, ship traffic characteristics can differ according to the characteristics of the sea area and shipping route. Therefore, this study simulates collision probabilities by estimating the best-fit distribution function of ship traffic flow in Ulsan Port, which is the largest hazardous cargo vessel handling port in Korea. A comparison of collision probability simulation results using the best-fit function and the normal distribution function reveals a difference of approximately 1.5–2.4 times for each route. Moreover, the collision probability estimates are not accurate when the normal distribution function is uniformly applied without considering the characteristics of each route. These findings can be used to improve the KMSA evaluation method for ship collision risks, particularly in hazardous port areas.
49

Zhang, Zhenhao, Wenbiao Li, and Jianyu Yang. "ANALYSIS OF STOCHASTIC PROCESS TO MODEL SAFETY RISK IN CONSTRUCTION INDUSTRY." JOURNAL OF CIVIL ENGINEERING AND MANAGEMENT 27, no. 2 (February 10, 2021): 87–99. http://dx.doi.org/10.3846/jcem.2021.14108.

Abstract:
There are many factors leading to construction safety accidents. The pattern that emerges under the influence of these factors is a statistical, random one. To reveal these random rules and study probability prediction methods for construction safety accidents, this paper uses stochastic process theory: a general stochastic process, a Markov process and a normal process are respectively used to simulate the risk-accident process. First, in the analysis based on the general stochastic process, the probability of accidents in a period of time is calculated. Then, the Markov property of the construction safety risk evolution process is illustrated, and the analytical expression of the probability density function of the first-passage time of the Markov-based risk-accident process is derived to calculate the construction safety probability. In the analysis based on the normal process, the construction safety probability formulas for a stationary normal risk process and for a non-stationary normal risk process with zero mean value are derived respectively. Finally, the number of accidents that may occur on a construction site in a period is studied macroscopically based on a Poisson process, and the probability distribution of the time interval between adjacent accidents and the time of the nth accident are calculated respectively. The results provide useful reference for the prediction and management of construction accidents.
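The Poisson-process step at the end can be made concrete with a worked example, assuming an illustrative accident rate: counts in a horizon are Poisson, gaps between accidents are exponential, and the time of the nth accident is Erlang(n, λ).

```python
from math import exp, factorial

lam, t = 0.4, 12.0                               # 0.4 accidents/month, 12-month horizon
p_k = [exp(-lam * t) * (lam * t) ** k / factorial(k) for k in range(4)]
print(p_k)                                       # P(N = 0..3 accidents in the horizon)
print(1 / lam)                                   # mean gap between adjacent accidents
print(3 / lam)                                   # mean time of the 3rd accident (Erlang)
```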
50

Al-arydah, Mo’tassem. "Population attributable risk associated with lung cancer induced by residential radon in Canada: Sensitivity to relative risk model and radon probability density function choices." Science of The Total Environment 596-597 (October 2017): 331–41. http://dx.doi.org/10.1016/j.scitotenv.2017.04.067.
