To see the other types of publications on this topic, follow the link: Metropolis-Hastings algorithm.

Journal articles on the topic 'Metropolis-Hastings algorithm'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Metropolis-Hastings algorithm.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Delmas, Jean-François, and Benjamin Jourdain. "Does Waste Recycling Really Improve the Multi-Proposal Metropolis–Hastings Algorithm? An Analysis Based on Control Variates." Journal of Applied Probability 46, no. 4 (2009): 938–59. http://dx.doi.org/10.1017/s0021900200006069.

Full text
Abstract:
The waste-recycling Monte Carlo (WRMC) algorithm introduced by physicists is a modification of the (multi-proposal) Metropolis–Hastings algorithm, which makes use of all the proposals in the empirical mean, whereas the standard (multi-proposal) Metropolis–Hastings algorithm uses only the accepted proposals. In this paper we extend the WRMC algorithm to a general control variate technique and exhibit the optimal choice of the control variate in terms of the asymptotic variance. We also give an example which shows that, in contradiction to the intuition of physicists, the WRMC algorithm can have
APA, Harvard, Vancouver, ISO, and other styles
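The abstract above contrasts the standard MCMC average, which uses only the realized states of the chain, with the waste-recycling average, which also weights each proposal by its acceptance probability. A minimal single-proposal sketch of that contrast, under my own assumptions (a Gaussian target and a symmetric random-walk proposal; not the paper's multi-proposal or control-variate setting):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Hypothetical target: standard normal, known only up to a constant.
    return -0.5 * x**2

def mh_with_waste_recycling(n_steps=50_000, step=2.5, f=lambda x: x**2):
    x = 0.0
    standard_sum = 0.0   # uses only the realized chain states
    recycled_sum = 0.0   # also uses rejected proposals, weighted by alpha
    for _ in range(n_steps):
        y = x + step * rng.standard_normal()   # symmetric proposal
        alpha = min(1.0, np.exp(log_target(y) - log_target(x)))
        recycled_sum += alpha * f(y) + (1.0 - alpha) * f(x)
        if rng.uniform() < alpha:
            x = y
        standard_sum += f(x)
    return standard_sum / n_steps, recycled_sum / n_steps

print(mh_with_waste_recycling())   # both estimates should be close to E[X^2] = 1
```

Whether the recycled average actually has lower asymptotic variance is exactly the question the paper analyses via control variates.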
2

Delmas, Jean-François, and Benjamin Jourdain. "Does Waste Recycling Really Improve the Multi-Proposal Metropolis–Hastings Algorithm? An Analysis Based on Control Variates." Journal of Applied Probability 46, no. 4 (2009): 938–59. http://dx.doi.org/10.1239/jap/1261670681.

Full text
Abstract:
The waste-recycling Monte Carlo (WRMC) algorithm introduced by physicists is a modification of the (multi-proposal) Metropolis–Hastings algorithm, which makes use of all the proposals in the empirical mean, whereas the standard (multi-proposal) Metropolis–Hastings algorithm uses only the accepted proposals. In this paper we extend the WRMC algorithm to a general control variate technique and exhibit the optimal choice of the control variate in terms of the asymptotic variance. We also give an example which shows that, in contradiction to the intuition of physicists, the WRMC algorithm can have
APA, Harvard, Vancouver, ISO, and other styles
3

Hu, Yulin, and Yayong Tang. "Metropolis-Hastings Algorithm with Delayed Acceptance and Rejection." Review of Educational Theory 2, no. 2 (2019): 7. http://dx.doi.org/10.30564/ret.v2i2.682.

Full text
Abstract:
Metropolis-Hastings algorithms are slowed down by the computation of complex target distributions. To solve this problem, one can use the delayed acceptance Metropolis-Hastings algorithm (MHDA) of Christen and Fox (2005). However, the acceptance rate of a proposed value will always be less than in the standard Metropolis-Hastings. We can fix this problem by using the Metropolis-Hastings algorithm with delayed rejection (MHDR) proposed by Tierney and Mira (1999). In this paper, we combine the ideas of MHDA and MHDR to propose a new MH algorithm, named the Metropolis-Hastings algorithm with dela
APA, Harvard, Vancouver, ISO, and other styles
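The delayed-acceptance idea of Christen and Fox (2005) summarized above screens each proposal with a cheap surrogate density before the expensive target is evaluated, then applies a second accept/reject step so the exact target stays invariant. A one-dimensional sketch with hypothetical cheap and exact densities (the paper's combined delayed-acceptance-and-rejection scheme is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_cheap(x):
    # Inexpensive surrogate of the target (hypothetical: a wider Gaussian).
    return -0.5 * (x / 1.5) ** 2

def log_exact(x):
    # "Expensive" exact target (hypothetical: standard Gaussian).
    return -0.5 * x ** 2

def delayed_acceptance_step(x, step=1.0):
    y = x + step * rng.standard_normal()     # symmetric proposal
    # Stage 1: screen with the cheap surrogate only.
    if np.log(rng.uniform()) >= log_cheap(y) - log_cheap(x):
        return x                             # early rejection, no exact evaluation
    # Stage 2: correct with the exact target so it remains invariant.
    log_a2 = (log_exact(y) - log_exact(x)) - (log_cheap(y) - log_cheap(x))
    if np.log(rng.uniform()) < log_a2:
        return y
    return x

chain = [0.0]
for _ in range(20_000):
    chain.append(delayed_acceptance_step(chain[-1]))
print(np.mean(chain), np.var(chain))   # roughly 0 and 1
```

The early rejection in stage 1 is what lowers the overall acceptance rate, the drawback the paper addresses by adding delayed rejection.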
4

Cheon, Soo-Young, and Hee-Chan Lee. "Metropolis-Hastings Expectation Maximization Algorithm for Incomplete Data." Korean Journal of Applied Statistics 25, no. 1 (2012): 183–96. http://dx.doi.org/10.5351/kjas.2012.25.1.183.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chib, Siddhartha, and Edward Greenberg. "Understanding the Metropolis-Hastings Algorithm." American Statistician 49, no. 4 (1995): 327. http://dx.doi.org/10.2307/2684568.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chib, Siddhartha, and Edward Greenberg. "Understanding the Metropolis-Hastings Algorithm." American Statistician 49, no. 4 (1995): 327–35. http://dx.doi.org/10.1080/00031305.1995.10476177.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Marnissi, Yosra, Emilie Chouzenoux, Amel Benazza-Benyahia, and Jean-Christophe Pesquet. "Majorize–Minimize Adapted Metropolis–Hastings Algorithm." IEEE Transactions on Signal Processing 68 (2020): 2356–69. http://dx.doi.org/10.1109/tsp.2020.2983150.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Lemieux, Jessica, Bettina Heim, David Poulin, Krysta Svore, and Matthias Troyer. "Efficient Quantum Walk Circuits for Metropolis-Hastings Algorithm." Quantum 4 (June 29, 2020): 287. http://dx.doi.org/10.22331/q-2020-06-29-287.

Full text
Abstract:
We present a detailed circuit implementation of Szegedy's quantization of the Metropolis-Hastings walk. This quantum walk is usually defined with respect to an oracle. We find that a direct implementation of this oracle requires costly arithmetic operations. We thus reformulate the quantum walk, circumventing its implementation altogether by closely following the classical Metropolis-Hastings walk. We also present heuristic quantum algorithms that use the quantum walk in the context of discrete optimization problems and numerically study their performances. Our numerical results indicate polyn
APA, Harvard, Vancouver, ISO, and other styles
9

Masuhr, Andreas, and Mark Trede. "Bayesian estimation of generalized partition of unity copulas." Dependence Modeling 8, no. 1 (2020): 119–31. http://dx.doi.org/10.1515/demo-2020-0007.

Full text
Abstract:
This paper proposes a Bayesian estimation algorithm to estimate Generalized Partition of Unity Copulas (GPUC), a class of nonparametric copulas recently introduced by [18]. The first approach is a random walk Metropolis-Hastings (RW-MH) algorithm, the second one is a random blocking random walk Metropolis-Hastings algorithm (RBRW-MH). Both approaches are Markov chain Monte Carlo methods and can cope with flat priors. We carry out simulation studies to determine and compare the efficiency of the algorithms. We present an empirical illustration where GPUCs are used to nonparametrically de
APA, Harvard, Vancouver, ISO, and other styles
10

Saraiva, Erlandson, Adriano Suzuki, and Luis Milan. "Bayesian Computational Methods for Sampling from the Posterior Distribution of a Bivariate Survival Model, Based on AMH Copula in the Presence of Right-Censored Data." Entropy 20, no. 9 (2018): 642. http://dx.doi.org/10.3390/e20090642.

Full text
Abstract:
In this paper, we study the performance of Bayesian computational methods to estimate the parameters of a bivariate survival model based on the Ali–Mikhail–Haq copula with marginal distributions given by Weibull distributions. The estimation procedure was based on Monte Carlo Markov Chain (MCMC) algorithms. We present three versions of the Metropolis–Hastings algorithm: Independent Metropolis–Hastings (IMH), Random Walk Metropolis (RWM) and Metropolis–Hastings with a natural-candidate generating density (MH). Since the creation of a good candidate generating density in IMH and RWM may be diffic
APA, Harvard, Vancouver, ISO, and other styles
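Of the three samplers named in the abstract, the Independent Metropolis–Hastings variant is the one whose acceptance ratio involves the candidate-generating density explicitly. A minimal sketch under my own assumptions (a hypothetical Gaussian-mixture target and a fixed, over-dispersed Gaussian proposal; not the paper's copula survival model):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def log_target(x):
    # Hypothetical unnormalized target: a two-component Gaussian mixture.
    return np.logaddexp(norm.logpdf(x, -1.0, 0.7), norm.logpdf(x, 2.0, 1.0))

# Independence proposal: a fixed candidate-generating density.
prop = norm(loc=0.0, scale=4.0)

def independent_mh_step(x):
    y = prop.rvs(random_state=rng)
    # Hastings ratio for an independence sampler: pi(y) q(x) / (pi(x) q(y)).
    log_alpha = (log_target(y) + prop.logpdf(x)) - (log_target(x) + prop.logpdf(y))
    return y if np.log(rng.uniform()) < log_alpha else x

x, samples = 0.0, []
for _ in range(20_000):
    x = independent_mh_step(x)
    samples.append(x)
print(np.mean(samples))
```

A Random Walk Metropolis step differs only in drawing y = x + noise from a symmetric kernel, in which case the proposal terms cancel from the ratio.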
11

Zuev, K. M., and L. S. Katafygiotis. "Modified Metropolis–Hastings algorithm with delayed rejection." Probabilistic Engineering Mechanics 26, no. 3 (2011): 405–12. http://dx.doi.org/10.1016/j.probengmech.2010.11.008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Skold, M., and G. O. Roberts. "Density Estimation for the Metropolis-Hastings Algorithm." Scandinavian Journal of Statistics 30, no. 4 (2003): 699–718. http://dx.doi.org/10.1111/1467-9469.00359.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Hammer, Hugo, and Håkon Tjelmeland. "Control Variates for the Metropolis-Hastings Algorithm." Scandinavian Journal of Statistics 35, no. 3 (2008): 400–414. http://dx.doi.org/10.1111/j.1467-9469.2008.00601.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Hitchcock, David B. "A History of the Metropolis–Hastings Algorithm." American Statistician 57, no. 4 (2003): 254–57. http://dx.doi.org/10.1198/0003130032413.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Seo, Young-Min, and Ki-Bum Park. "Uncertainty Analysis for Parameters of Probability Distribution in Rainfall Frequency Analysis by Bayesian MCMC and Metropolis Hastings Algorithm." Journal of the Environmental Sciences 20, no. 3 (2011): 329–40. http://dx.doi.org/10.5322/jes.2011.20.3.329.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Kamatani, K. "Ergodicity of Markov chain Monte Carlo with reversible proposal." Journal of Applied Probability 54, no. 2 (2017): 638–54. http://dx.doi.org/10.1017/jpr.2017.22.

Full text
Abstract:
We describe the ergodic properties of some Metropolis–Hastings algorithms for heavy-tailed target distributions. The results of these algorithms are usually analyzed under a subgeometric ergodic framework, but we prove that the mixed preconditioned Crank–Nicolson (MpCN) algorithm has geometric ergodicity even for heavy-tailed target distributions. This useful property comes from the fact that, under a suitable transformation, the MpCN algorithm becomes a random-walk Metropolis algorithm.
APA, Harvard, Vancouver, ISO, and other styles
17

Liang, Faming, and Ick-Hoon Jin. "A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants." Neural Computation 25, no. 8 (2013): 2199–234. http://dx.doi.org/10.1162/neco_a_00466.

Full text
Abstract:
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and e
APA, Harvard, Vancouver, ISO, and other styles
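The abstract describes replacing the unknown normalizing-constant ratio in the acceptance probability by a Monte Carlo estimate obtained from draws of the model itself. A toy sketch of that idea under assumptions of my own (a one-parameter exponential-family model whose constant is treated as unknown, a standard normal prior, and hypothetical observations; not the paper's autologistic application):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "intractable" model: g(x | theta) = exp(theta * x - x^2 / 2).
# Its normalizing constant Z(theta) is pretended unknown; we can,
# however, draw from the model (here it is simply N(theta, 1)).
def log_g(x, theta):
    return theta * x - 0.5 * x ** 2

def sample_model(theta, m):
    return rng.normal(theta, 1.0, size=m)

y = np.array([0.8, 1.3, 0.2])        # hypothetical observations
n = len(y)

def mcmh_step(theta, step=0.5, m=200):
    theta_new = theta + step * rng.standard_normal()
    # Importance-sampling estimate of Z(theta_new) / Z(theta)
    # using auxiliary draws from the model at the current theta.
    x = sample_model(theta, m)
    log_z_ratio = np.log(np.mean(np.exp(log_g(x, theta_new) - log_g(x, theta))))
    log_prior_ratio = -0.5 * (theta_new ** 2 - theta ** 2)        # N(0,1) prior
    log_lik_ratio = np.sum(log_g(y, theta_new) - log_g(y, theta)) - n * log_z_ratio
    if np.log(rng.uniform()) < log_prior_ratio + log_lik_ratio:
        return theta_new
    return theta

theta, draws = 0.0, []
for _ in range(10_000):
    theta = mcmh_step(theta)
    draws.append(theta)
print(np.mean(draws))        # approximate posterior mean of theta
```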
18

Chumbley, Justin R., Karl J. Friston, Tom Fearn, and Stefan J. Kiebel. "A Metropolis–Hastings algorithm for dynamic causal models." NeuroImage 38, no. 3 (2007): 478–87. http://dx.doi.org/10.1016/j.neuroimage.2007.07.028.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Fort, Gersende, and Eric Moulines. "V-Subgeometric ergodicity for a Hastings–Metropolis algorithm." Statistics & Probability Letters 49, no. 4 (2000): 401–10. http://dx.doi.org/10.1016/s0167-7152(00)00074-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Holden, Lars. "Geometric convergence of the Metropolis-Hastings simulation algorithm." Statistics & Probability Letters 39, no. 4 (1998): 371–77. http://dx.doi.org/10.1016/s0167-7152(98)00096-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Billera, Louis J., and Persi Diaconis. "A Geometric Interpretation of the Metropolis-Hastings Algorithm." Statistical Science 16, no. 4 (2001): 335–39. http://dx.doi.org/10.1214/ss/1015346318.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Gåsemyr, Jørund. "The Spectrum of the Independent Metropolis–Hastings Algorithm." Journal of Theoretical Probability 19, no. 1 (2006): 152–65. http://dx.doi.org/10.1007/s10959-006-0009-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Abushal, Tahani A. "Parametric inference of Akash distribution for Type-Ⅱ censoring with analyzing of relief times of patients." AIMS Mathematics 6, no. 10 (2021): 10789–801. http://dx.doi.org/10.3934/math.2021627.

Full text
Abstract:
In this paper, we consider the problem of estimating the parameter of the Akash distribution when the lifetime of the product follows Type-II censoring. The maximum likelihood estimators (MLE) are studied for estimating the unknown parameter and reliability characteristics. An approximate confidence interval for the parameter is derived under the s-normal approach to the asymptotic distribution of the MLE. The Bayesian inference procedures have been developed under the usual error loss function through Lindley's technique and the Metropolis-Hastings algorithm. The highest posterior density
APA, Harvard, Vancouver, ISO, and other styles
24

Izzatullah, Muhammad, Tristan van Leeuwen, and Daniel Peter. "Bayesian seismic inversion: a fast sampling Langevin dynamics Markov chain Monte Carlo method." Geophysical Journal International 227, no. 3 (2021): 1523–53. http://dx.doi.org/10.1093/gji/ggab287.

Full text
Abstract:
In this study, we aim to solve the seismic inversion in the Bayesian framework by generating samples from the posterior distribution. This distribution incorporates the uncertainties in the seismic data, forward model, and prior information about the subsurface model parameters; thus, we obtain more information through sampling than through a point estimate (e.g. maximum a posteriori method). Based on the numerical cost of solving the forward problem and the dimensions of the subsurface model parameters and observed data, sampling with Markov chain Monte Carlo (MCMC) algorithms can be
APA, Harvard, Vancouver, ISO, and other styles
25

Dahme, O. "RooMCMarkovChain a Metropolis–Hastings Algorithm for the Root Framework." Acta Physica Polonica B 49, no. 6 (2018): 1097. http://dx.doi.org/10.5506/aphyspolb.49.1097.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Luo, Xin, and Håkon Tjelmeland. "A multiple-try Metropolis–Hastings algorithm with tailored proposals." Computational Statistics 34, no. 3 (2019): 1109–33. http://dx.doi.org/10.1007/s00180-019-00878-y.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Müller, Christian, Fabian Weysser, Thomas Mrziglod, and Andreas Schuppert. "Markov-Chain Monte-Carlo methods and non-identifiabilities." Monte Carlo Methods and Applications 24, no. 3 (2018): 203–14. http://dx.doi.org/10.1515/mcma-2018-0018.

Full text
Abstract:
We consider the problem of sampling from high-dimensional likelihood functions with large amounts of non-identifiabilities via Markov-Chain Monte-Carlo algorithms. Non-identifiabilities are problematic for commonly used proposal densities, leading to a low effective sample size. To address this problem, we introduce a regularization method using an artificial prior, which restricts non-identifiable parts of the likelihood function. This enables us to sample the posterior using common MCMC methods more efficiently. We demonstrate this with three MCMC methods on a likelihood based on a
APA, Harvard, Vancouver, ISO, and other styles
28

Dunson, D. B., and J. E. Johndrow. "The Hastings algorithm at fifty." Biometrika 107, no. 1 (2019): 1–23. http://dx.doi.org/10.1093/biomet/asz066.

Full text
Abstract:
In a 1970 Biometrika paper, W. K. Hastings developed a broad class of Markov chain algorithms for sampling from probability distributions that are difficult to sample from directly. The algorithm draws a candidate value from a proposal distribution and accepts the candidate with a probability that can be computed using only the unnormalized density of the target distribution, allowing one to sample from distributions known only up to a constant of proportionality. The stationary distribution of the corresponding Markov chain is the target distribution one is attempting to sample from.
APA, Harvard, Vancouver, ISO, and other styles
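The summary above highlights Hastings' key generalization: the candidate may come from an essentially arbitrary proposal density, with the acceptance probability built from the unnormalized target plus a proposal-density correction. A small illustration under my own assumptions (an unnormalized Gamma target with a multiplicative log-normal proposal, chosen because it is clearly non-symmetric):

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(4)

def log_target(x):
    # Unnormalized Gamma(3, 1) density on (0, inf); the constant is never needed.
    return 2.0 * np.log(x) - x if x > 0 else -np.inf

def hastings_step(x, sigma=0.5):
    # Asymmetric multiplicative proposal: y = x * exp(sigma * N(0, 1)),
    # i.e. y | x ~ LogNormal(log x, sigma). Hastings' correction q(x|y)/q(y|x)
    # is what allows such non-symmetric candidate densities.
    y = x * np.exp(sigma * rng.standard_normal())
    log_q_forward = lognorm.logpdf(y, s=sigma, scale=x)
    log_q_backward = lognorm.logpdf(x, s=sigma, scale=y)
    log_alpha = (log_target(y) + log_q_backward) - (log_target(x) + log_q_forward)
    return y if np.log(rng.uniform()) < log_alpha else x

x, draws = 1.0, []
for _ in range(30_000):
    x = hastings_step(x)
    draws.append(x)
print(np.mean(draws))   # should be near E[Gamma(3, 1)] = 3
```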
29

Li, Qianyun, Faliu Yi, Tao Wang, Guanghua Xiao, and Faming Liang. "Lung Cancer Pathological Image Analysis Using a Hidden Potts Model." Cancer Informatics 16 (January 1, 2017): 117693511771191. http://dx.doi.org/10.1177/1176935117711910.

Full text
Abstract:
Nowadays, many biological data are acquired via images. In this article, we study the pathological images scanned from 205 patients with lung cancer with the goal to find out the relationship between the survival time and the spatial distribution of different types of cells, including lymphocyte, stroma, and tumor cells. Toward this goal, we model the spatial distribution of different types of cells using a modified Potts model for which the parameters represent interactions between different types of cells and estimate the parameters of the Potts model using the double Metropolis-Hastings alg
APA, Harvard, Vancouver, ISO, and other styles
30

El Chamie, Mahmoud, and Behçet Açıkmeşe. "Safe Metropolis–Hastings algorithm and its application to swarm control." Systems & Control Letters 111 (January 2018): 40–48. http://dx.doi.org/10.1016/j.sysconle.2017.10.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Bachoc, Francois, Achref Bachouch, and Lionel Lenôtre. "Hastings-Metropolis algorithm on Markov chains for small-probability estimation." ESAIM: Proceedings and Surveys 48 (January 2015): 276–307. http://dx.doi.org/10.1051/proc/201448013.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Chauveau, Didier, and Pierre Vandekerkhove. "Smoothness of Metropolis-Hastings algorithm and application to entropy estimation." ESAIM: Probability and Statistics 17 (2013): 419–31. http://dx.doi.org/10.1051/ps/2012004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Cai, Li. "Metropolis-Hastings Robbins-Monro Algorithm for Confirmatory Item Factor Analysis." Journal of Educational and Behavioral Statistics 35, no. 3 (2010): 307–35. http://dx.doi.org/10.3102/1076998609353115.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Geweke, John, and Hisashi Tanizaki. "Note on the Sampling Distribution for the Metropolis-Hastings Algorithm." Communications in Statistics - Theory and Methods 32, no. 4 (2003): 775–89. http://dx.doi.org/10.1081/sta-120018828.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Kumar, R. Ashok, and K. Ganesan. "Video segmentation using Metropolis Hastings Algorithm for the VCR operations." International Journal of Advanced Media and Communication 4, no. 3 (2010): 274. http://dx.doi.org/10.1504/ijamc.2010.034661.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Walker, Stephen G. "Sampling Unnormalized Probabilities: An Alternative to the Metropolis–Hastings Algorithm." SIAM Journal on Scientific Computing 36, no. 2 (2014): A482–A494. http://dx.doi.org/10.1137/130922549.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Hairer, Martin, Andrew M. Stuart, and Sebastian J. Vollmer. "Spectral gaps for a Metropolis–Hastings algorithm in infinite dimensions." Annals of Applied Probability 24, no. 6 (2014): 2455–90. http://dx.doi.org/10.1214/13-aap982.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Roberts, G. O. "A note on acceptance rate criteria for CLTS for Metropolis–Hastings algorithms." Journal of Applied Probability 36, no. 4 (1999): 1210–17. http://dx.doi.org/10.1017/s0021900200017976.

Full text
Abstract:
This paper considers positive recurrent Markov chains where the probability of remaining in the current state is arbitrarily close to 1. Specifically, conditions are given which ensure the non-existence of central limit theorems for ergodic averages of functionals of the chain. The results are motivated by applications for Metropolis–Hastings algorithms which are constructed in terms of a rejection probability (where a rejection involves remaining at the current state). Two examples for commonly used algorithms are given, for the independence sampler and the Metropolis-adjusted Langevin algori
APA, Harvard, Vancouver, ISO, and other styles
39

Roberts, G. O. "A note on acceptance rate criteria for CLTS for Metropolis–Hastings algorithms." Journal of Applied Probability 36, no. 4 (1999): 1210–17. http://dx.doi.org/10.1239/jap/1032374766.

Full text
Abstract:
This paper considers positive recurrent Markov chains where the probability of remaining in the current state is arbitrarily close to 1. Specifically, conditions are given which ensure the non-existence of central limit theorems for ergodic averages of functionals of the chain. The results are motivated by applications for Metropolis–Hastings algorithms which are constructed in terms of a rejection probability (where a rejection involves remaining at the current state). Two examples for commonly used algorithms are given, for the independence sampler and the Metropolis-adjusted Langevin algori
APA, Harvard, Vancouver, ISO, and other styles
40

Chauveau, Didier, and Pierre Vandekerkhove. "Improving Convergence of the Hastings-Metropolis Algorithm with an Adaptive Proposal." Scandinavian Journal of Statistics 29, no. 1 (2002): 13–29. http://dx.doi.org/10.1111/1467-9469.00064.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Chattopadhyay, Tanuka, Asis Kumar Chattopadhyay, and Abisa Sinha. "Modeling of the Initial Mass Function Using the Metropolis–Hastings Algorithm." Astrophysical Journal 736, no. 2 (2011): 152. http://dx.doi.org/10.1088/0004-637x/736/2/152.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Liang, Faming, Jinsu Kim, and Qifan Song. "A Bootstrap Metropolis–Hastings Algorithm for Bayesian Analysis of Big Data." Technometrics 58, no. 3 (2016): 304–18. http://dx.doi.org/10.1080/00401706.2016.1142905.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Baffoun, Hatem, Mekki Hajlaoui, and Abdeljelil Farhat. "Equation-solving estimator based on Metropolis-Hastings algorithm with delayed rejection." Communications in Statistics - Simulation and Computation 46, no. 4 (2016): 3103–11. http://dx.doi.org/10.1080/03610918.2015.1073306.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Maier, Kimberly S. "A Rasch Hierarchical Measurement Model." Journal of Educational and Behavioral Statistics 26, no. 3 (2001): 307–30. http://dx.doi.org/10.3102/10769986026003307.

Full text
Abstract:
In this article, a hierarchical measurement model is developed that enables researchers to measure a latent trait variable and model the error variance corresponding to multiple levels. The Rasch hierarchical measurement model (HMM) results when a Rasch IRT model and a one-way ANOVA with random effects are combined (Bryk & Raudenbush, 1992; Goldstein, 1987; Rasch, 1960). This model is appropriate for modeling dichotomous response strings nested within a contextual level. Examples of this type of structure include responses from students nested within schools and multiple response strin
APA, Harvard, Vancouver, ISO, and other styles
45

Müller, Christian, Holger Diedam, Thomas Mrziglod, and Andreas Schuppert. "A neural network assisted Metropolis adjusted Langevin algorithm." Monte Carlo Methods and Applications 26, no. 2 (2020): 93–111. http://dx.doi.org/10.1515/mcma-2020-2060.

Full text
Abstract:
In this paper, we derive a Markov chain Monte Carlo (MCMC) algorithm supported by a neural network. In particular, we use the neural network to substitute derivative calculations made during a Metropolis adjusted Langevin algorithm (MALA) step with inexpensive neural network evaluations. Using a complex, high-dimensional blood coagulation model and a set of measurements, we define a likelihood function on which we evaluate the new MCMC algorithm. The blood coagulation model is a dynamic model, where derivative calculations are expensive and hence limit the efficiency of derivative-base
APA, Harvard, Vancouver, ISO, and other styles
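The abstract presupposes a standard Metropolis-adjusted Langevin step, in which the proposal is a gradient-informed Gaussian and a Hastings correction accounts for its asymmetry; the paper's contribution is to replace the derivative with a neural-network surrogate, which is not reproduced here. A bare MALA sketch with an analytic gradient and a hypothetical Gaussian target:

```python
import numpy as np

rng = np.random.default_rng(5)

def log_target(x):
    return -0.5 * np.sum(x ** 2)   # hypothetical standard Gaussian target

def grad_log_target(x):
    return -x                      # the (here cheap) derivative that MALA needs

def log_q(y, x, h):
    # Unnormalized log density of the Langevin proposal y | x ~ N(x + (h/2) grad, h I);
    # the Gaussian constants cancel in the acceptance ratio.
    mean = x + 0.5 * h * grad_log_target(x)
    return -np.sum((y - mean) ** 2) / (2.0 * h)

def mala_step(x, h=0.5):
    y = x + 0.5 * h * grad_log_target(x) + np.sqrt(h) * rng.standard_normal(x.shape)
    log_alpha = (log_target(y) + log_q(x, y, h)) - (log_target(x) + log_q(y, x, h))
    return y if np.log(rng.uniform()) < log_alpha else x

x, total = np.zeros(3), np.zeros(3)
for _ in range(20_000):
    x = mala_step(x)
    total += x
print(total / 20_000)              # componentwise means near 0
```

In the paper's setting grad_log_target would be the expensive quantity approximated by the network.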
46

Henshall, John M., and Bruce Tier. "An algorithm for sampling descent graphs in large complex pedigrees efficiently." Genetical Research 81, no. 3 (2003): 205–12. http://dx.doi.org/10.1017/s0016672303006232.

Full text
Abstract:
No exact method for determining genotypic and identity-by-descent probabilities is available for large complex pedigrees. Approximate methods for such pedigrees cannot be guaranteed to be unbiased. A new method is proposed that uses the Metropolis–Hastings algorithm to sample a Markov chain of descent graphs which fit the pedigree and known genotypes. Unknown genotypes are determined from each descent graph. Genotypic probabilities are estimated as their means. The algorithm is shown to be unbiased for small complex pedigrees and feasible and consistent for moderately large complex pedigrees.
APA, Harvard, Vancouver, ISO, and other styles
47

Livingstone, Samuel. "Geometric Ergodicity of the Random Walk Metropolis with Position-Dependent Proposal Covariance." Mathematics 9, no. 4 (2021): 341. http://dx.doi.org/10.3390/math9040341.

Full text
Abstract:
We consider a Metropolis–Hastings method with proposal N(x, hG(x)^{-1}), where x is the current state, and study its ergodicity properties. We show that suitable choices of G(x) can change these ergodicity properties compared to the Random Walk Metropolis case N(x, hΣ), either for better or worse. We find that if the proposal variance is allowed to grow unboundedly in the tails of the distribution then geometric ergodicity can be established when the target distribution for the algorithm has tails that are heavier than exponential, in contrast to the Random Walk Metropolis case, but that the growth
APA, Harvard, Vancouver, ISO, and other styles
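The proposal described in the abstract is N(x, hG(x)^{-1}), so the covariance changes with the current state and the full Hastings correction is required. A one-dimensional sketch with a hypothetical choice G(x)^{-1} = 1 + x^2, which lets the proposal variance grow in the tails of a heavy-tailed target (an illustration of the mechanism only, not the paper's analysis):

```python
import numpy as np

rng = np.random.default_rng(6)

def log_target(x):
    # Heavier-than-exponential tails (hypothetical Student-t-like target).
    return -2.0 * np.log(1.0 + x ** 2)

def prop_var(x, h=1.0):
    # Position-dependent variance h * G(x)^{-1}; here G(x)^{-1} = 1 + x^2,
    # so the proposal widens in the tails, as discussed in the abstract.
    return h * (1.0 + x ** 2)

def log_q(y, x):
    v = prop_var(x)
    return -0.5 * np.log(v) - (y - x) ** 2 / (2.0 * v)

def step(x):
    y = x + np.sqrt(prop_var(x)) * rng.standard_normal()
    # The state-dependent variance makes the proposal asymmetric,
    # so the full Hastings ratio q(x|y)/q(y|x) is needed.
    log_alpha = (log_target(y) + log_q(x, y)) - (log_target(x) + log_q(y, x))
    return y if np.log(rng.uniform()) < log_alpha else x

x, draws = 0.0, []
for _ in range(30_000):
    x = step(x)
    draws.append(x)
print(np.median(draws))   # symmetric target, median near 0
```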
48

Heckman, Jonathan J., Jeffrey G. Bernstein, and Ben Vigoda. "MCMC with strings and branes: The suburban algorithm (Extended Version)." International Journal of Modern Physics A 32, no. 22 (2017): 1750133. http://dx.doi.org/10.1142/s0217751x17501330.

Full text
Abstract:
Motivated by the physics of strings and branes, we develop a class of Markov chain Monte Carlo (MCMC) algorithms involving extended objects. Starting from a collection of parallel Metropolis–Hastings (MH) samplers, we place them on an auxiliary grid, and couple them together via nearest neighbor interactions. This leads to a class of “suburban samplers” (i.e. spread out Metropolis). Coupling the samplers in this way modifies the mixing rate and speed of convergence for the Markov chain, and can in many cases allow a sampler to more easily overcome free energy barriers in a target distribution.
APA, Harvard, Vancouver, ISO, and other styles
49

Liu, Qing, David Pitt, Xibin Zhang, and Xueyuan Wu. "A Bayesian Approach to Parameter Estimation for Kernel Density Estimation via Transformations." Annals of Actuarial Science 5, no. 2 (2011): 181–93. http://dx.doi.org/10.1017/s1748499511000030.

Full text
Abstract:
In this paper, we present a Markov chain Monte Carlo (MCMC) simulation algorithm for estimating parameters in the kernel density estimation of bivariate insurance claim data via transformations. Our data set consists of two types of auto insurance claim costs and exhibits a high-level of skewness in the marginal empirical distributions. Therefore, the kernel density estimator based on original data does not perform well. However, the density of the original data can be estimated through estimating the density of the transformed data using kernels. It is well known that the performance
APA, Harvard, Vancouver, ISO, and other styles
50

Santoso, A. M., K. K. Phoon, and S. T. Quek. "Modified Metropolis–Hastings algorithm with reduced chain correlation for efficient subset simulation." Probabilistic Engineering Mechanics 26, no. 2 (2011): 331–41. http://dx.doi.org/10.1016/j.probengmech.2010.08.007.

Full text
APA, Harvard, Vancouver, ISO, and other styles