
Journal articles on the topic 'Algorithme Metropolis'


Consult the top 50 journal articles for your research on the topic 'Algorithme Metropolis.'


1

Schnetzler, Bernard. "Un algorithme dérivé de l'algorithme de Metropolis." Comptes Rendus Mathématique 355, no. 10 (2017): 1104–10. http://dx.doi.org/10.1016/j.crma.2017.10.002.

2

Chauveau, Didier, and Pierre Vandekerkhove. "Un algorithme de Hastings-Metropolis avec apprentissage séquentiel." Comptes Rendus de l'Académie des Sciences - Series I - Mathematics 329, no. 2 (1999): 173–76. http://dx.doi.org/10.1016/s0764-4442(99)80484-4.

3

Kamatani, K. "Ergodicity of Markov chain Monte Carlo with reversible proposal." Journal of Applied Probability 54, no. 2 (2017): 638–54. http://dx.doi.org/10.1017/jpr.2017.22.

Abstract:
We describe the ergodic properties of some Metropolis–Hastings algorithms for heavy-tailed target distributions. The results of these algorithms are usually analyzed under a subgeometric ergodic framework, but we prove that the mixed preconditioned Crank–Nicolson (MpCN) algorithm has geometric ergodicity even for heavy-tailed target distributions. This useful property comes from the fact that, under a suitable transformation, the MpCN algorithm becomes a random-walk Metropolis algorithm.
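As background for this and several of the entries below, the random-walk Metropolis algorithm the abstract refers to can be sketched in a few lines. This is a generic illustration, not code from the paper; the standard-normal target, step size and chain length are arbitrary choices.

```python
# A minimal random-walk Metropolis sampler for a 1-D target known up to a
# normalizing constant. Illustrative sketch only: the standard-normal
# target, step size and chain length are arbitrary choices.
import math
import random

def rw_metropolis(log_target, x0, step=1.0, n_iter=10_000, rng=None):
    """Sample from exp(log_target) using symmetric Gaussian proposals."""
    rng = rng or random.Random(0)
    x, lp = x0, log_target(x0)
    chain = []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)           # symmetric proposal
        lp_y = log_target(y)
        # Metropolis test: accept with probability min(1, pi(y)/pi(x)).
        if math.log(rng.random()) < lp_y - lp:
            x, lp = y, lp_y
        chain.append(x)
    return chain

# Usage: standard-normal target; the sample mean should sit near 0.
chain = rw_metropolis(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(chain) / len(chain)
```

Because the acceptance test uses only a log-density difference, the target needs to be known only up to a normalizing constant.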
4

Hu, Yulin, and Yayong Tang. "Metropolis-Hastings Algorithm with Delayed Acceptance and Rejection." Review of Educational Theory 2, no. 2 (2019): 7. http://dx.doi.org/10.30564/ret.v2i2.682.

Abstract:
Metropolis-Hastings algorithms are slowed down by the computation of complex target distributions. To solve this problem, one can use the delayed acceptance Metropolis-Hastings algorithm (MHDA) of Christen and Fox (2005). However, the acceptance rate of a proposed value will always be less than in the standard Metropolis-Hastings algorithm. This can be fixed by using the Metropolis-Hastings algorithm with delayed rejection (MHDR) proposed by Tierney and Mira (1999). In this paper, we combine the ideas of MHDA and MHDR to propose a new MH algorithm, named the Metropolis-Hastings algorithm with delayed acceptance and rejection (MHDAR). The new algorithm reduces the computational cost by splitting the prior or likelihood functions and increases the acceptance probability by delayed rejection at the second stage. We illustrate these accelerating features with a realistic example.
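The two-stage delayed-acceptance idea described above can be sketched as follows: a cheap surrogate log-density screens proposals before the expensive target is evaluated, and the second-stage ratio cancels the surrogate so the chain still targets the true density. The target (standard normal) and surrogate (wider normal) below are toy placeholders, not the models from the paper.

```python
# Sketch of a two-stage delayed-acceptance Metropolis-Hastings step in the
# spirit of Christen and Fox (2005). Toy densities, not the paper's models.
import math
import random

def da_mh(log_target, log_cheap, x0, step=0.5, n_iter=5_000, rng=None):
    rng = rng or random.Random(1)
    x = x0
    lc_x, lt_x = log_cheap(x), log_target(x)
    n_expensive = 0
    chain = []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)
        lc_y = log_cheap(y)
        # Stage 1: screen the proposal with the cheap surrogate.
        if math.log(rng.random()) < lc_y - lc_x:
            lt_y = log_target(y)               # expensive evaluation
            n_expensive += 1
            # Stage 2: correct with the true target. The surrogate ratio
            # cancels, so the chain still targets exp(log_target).
            if math.log(rng.random()) < (lt_y - lt_x) - (lc_y - lc_x):
                x, lc_x, lt_x = y, lc_y, lt_y
        chain.append(x)
    return chain, n_expensive

# Usage: N(0, 1) target with an N(0, 1.5^2) surrogate for screening.
chain_da, n_expensive = da_mh(lambda x: -0.5 * x * x, lambda x: -x * x / 4.5, 0.0)
mean_da = sum(chain_da) / len(chain_da)
```

Proposals rejected at stage 1 never touch the expensive density, which is where the speed-up comes from.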
5

Müller, Christian, Holger Diedam, Thomas Mrziglod, and Andreas Schuppert. "A neural network assisted Metropolis adjusted Langevin algorithm." Monte Carlo Methods and Applications 26, no. 2 (2020): 93–111. http://dx.doi.org/10.1515/mcma-2020-2060.

Abstract:
In this paper, we derive a Markov chain Monte Carlo (MCMC) algorithm supported by a neural network. In particular, we use the neural network to substitute derivative calculations made during a Metropolis adjusted Langevin algorithm (MALA) step with inexpensive neural network evaluations. Using a complex, high-dimensional blood coagulation model and a set of measurements, we define a likelihood function on which we evaluate the new MCMC algorithm. The blood coagulation model is a dynamic model, where derivative calculations are expensive and hence limit the efficiency of derivative-based MCMC algorithms. The MALA adaptation greatly reduces the time per iteration, while only slightly affecting the sample quality. We also test the new algorithm on a 2-dimensional example with a non-convex shape, a case where the MALA algorithm has a clear advantage over other state-of-the-art MCMC algorithms. To assess the impact of the new algorithm, we compare the results to previously generated results of the MALA and the random-walk Metropolis-Hastings (RWMH) algorithms.
6

Roberts, Gareth O., and Jeffrey S. Rosenthal. "Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits." Journal of Applied Probability 53, no. 2 (2016): 410–20. http://dx.doi.org/10.1017/jpr.2016.9.

Abstract:
We connect known results about diffusion limits of Markov chain Monte Carlo (MCMC) algorithms to the computer science notion of algorithm complexity. Our main result states that any weak limit of a Markov process implies a corresponding complexity bound (in an appropriate metric). We then combine this result with previously known MCMC diffusion limit results to prove that, under appropriate assumptions, the random-walk Metropolis algorithm in d dimensions takes O(d) iterations to converge to stationarity, while the Metropolis-adjusted Langevin algorithm takes O(d^(1/3)) iterations to converge to stationarity.
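The two samplers these bounds compare can be contrasted concretely: random-walk Metropolis proposes blindly around the current state, while MALA drifts the proposal along the gradient of the log-target before the accept/reject step. A 1-D MALA sketch follows; the standard-normal target and the step size eps are illustrative choices, not from the paper.

```python
# One Metropolis-adjusted Langevin (MALA) step for a 1-D target, plus a
# short chain run on a standard-normal target. Illustrative sketch.
import math
import random

def mala_step(x, log_target, grad_log_target, eps, rng):
    """One MALA step; returns the next state of the chain."""
    mu_x = x + 0.5 * eps * grad_log_target(x)       # Langevin drift from x
    y = mu_x + math.sqrt(eps) * rng.gauss(0.0, 1.0)
    mu_y = y + 0.5 * eps * grad_log_target(y)       # drift from y, for q-ratio
    log_q_xy = -((y - mu_x) ** 2) / (2 * eps)       # log q(y | x), up to a constant
    log_q_yx = -((x - mu_y) ** 2) / (2 * eps)
    log_alpha = log_target(y) - log_target(x) + log_q_yx - log_q_xy
    return y if math.log(rng.random()) < log_alpha else x

rng = random.Random(2)
x, chain_mala = 0.0, []
for _ in range(5_000):
    x = mala_step(x, lambda t: -0.5 * t * t, lambda t: -t, eps=0.5, rng=rng)
    chain_mala.append(x)
m = sum(chain_mala) / len(chain_mala)
v = sum((c - m) ** 2 for c in chain_mala) / len(chain_mala)
```

Unlike random-walk Metropolis, the proposal is asymmetric, so the q-ratio must appear in the acceptance probability.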
7

LIMA, F. W. S. "MIXED ALGORITHMS IN THE ISING MODEL ON DIRECTED BARABÁSI–ALBERT NETWORKS." International Journal of Modern Physics C 17, no. 6 (2006): 785–93. http://dx.doi.org/10.1142/s0129183106008753.

Abstract:
On directed Barabási–Albert networks with two and seven neighbours selected by each added site, the Ising model does not seem to show a spontaneous magnetisation. Instead, the decay time for flipping of the magnetisation follows an Arrhenius law for Metropolis and Glauber algorithms, but for Wolff cluster flipping the magnetisation decays exponentially with time. On these networks the magnetisation behaviour of the Ising model, with the Glauber, HeatBath, Metropolis, Wolff or Swendsen–Wang algorithm competing against Kawasaki dynamics, is studied by Monte Carlo simulations. We show that the model exhibits the phenomenon of self-organisation (= stationary equilibrium) defined in Ref. 8 when Kawasaki dynamics is not dominant in its competition with the Glauber, HeatBath and Swendsen–Wang algorithms. Only for Wolff cluster flipping does this phenomenon occur after an exponential decay of the magnetisation with time. The Metropolis results are independent of the competition. We also study the same process of competition described above, but with Kawasaki dynamics at the same temperature as the other algorithms. The results obtained are similar for the Wolff cluster flipping, Metropolis and Swendsen–Wang algorithms but different for HeatBath.
8

Liang, Faming, and Ick-Hoon Jin. "A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants." Neural Computation 25, no. 8 (2013): 2199–234. http://dx.doi.org/10.1162/neco_a_00466.

Abstract:
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals.
9

Roberts, G. O. "A note on acceptance rate criteria for CLTs for Metropolis–Hastings algorithms." Journal of Applied Probability 36, no. 4 (1999): 1210–17. http://dx.doi.org/10.1017/s0021900200017976.

Abstract:
This paper considers positive recurrent Markov chains where the probability of remaining in the current state is arbitrarily close to 1. Specifically, conditions are given which ensure the non-existence of central limit theorems for ergodic averages of functionals of the chain. The results are motivated by applications for Metropolis–Hastings algorithms which are constructed in terms of a rejection probability (where a rejection involves remaining at the current state). Two examples for commonly used algorithms are given, for the independence sampler and the Metropolis-adjusted Langevin algorithm. The examples are rather specialized, although, in both cases, the problems which arise are typical of problems commonly occurring for the particular algorithm being used.
10

Roberts, G. O. "A note on acceptance rate criteria for CLTs for Metropolis–Hastings algorithms." Journal of Applied Probability 36, no. 4 (1999): 1210–17. http://dx.doi.org/10.1239/jap/1032374766.

Abstract:
This paper considers positive recurrent Markov chains where the probability of remaining in the current state is arbitrarily close to 1. Specifically, conditions are given which ensure the non-existence of central limit theorems for ergodic averages of functionals of the chain. The results are motivated by applications for Metropolis–Hastings algorithms which are constructed in terms of a rejection probability (where a rejection involves remaining at the current state). Two examples for commonly used algorithms are given, for the independence sampler and the Metropolis-adjusted Langevin algorithm. The examples are rather specialized, although, in both cases, the problems which arise are typical of problems commonly occurring for the particular algorithm being used.
11

Masuhr, Andreas, and Mark Trede. "Bayesian estimation of generalized partition of unity copulas." Dependence Modeling 8, no. 1 (2020): 119–31. http://dx.doi.org/10.1515/demo-2020-0007.

Abstract:
This paper proposes a Bayesian estimation algorithm to estimate Generalized Partition of Unity Copulas (GPUC), a class of nonparametric copulas recently introduced by [18]. The first approach is a random walk Metropolis-Hastings (RW-MH) algorithm; the second one is a random blocking random walk Metropolis-Hastings algorithm (RBRW-MH). Both approaches are Markov chain Monte Carlo methods and can cope with flat priors. We carry out simulation studies to determine and compare the efficiency of the algorithms. We present an empirical illustration where GPUCs are used to nonparametrically describe the dependence of exchange rate changes of the crypto-currencies Bitcoin and Ethereum.
12

Shao, Wei, and Guangbao Guo. "Multiple-Try Simulated Annealing Algorithm for Global Optimization." Mathematical Problems in Engineering 2018 (July 17, 2018): 1–11. http://dx.doi.org/10.1155/2018/9248318.

Abstract:
Simulated annealing is a widely used algorithm for the computation of global optimization problems in computational chemistry and industrial engineering. However, global optimum values cannot always be reached by simulated annealing without a logarithmic cooling schedule. In this study, we propose a new stochastic optimization algorithm, i.e., simulated annealing based on the multiple-try Metropolis method, which combines simulated annealing and the multiple-try Metropolis algorithm. The proposed algorithm functions with a rapidly decreasing schedule, while guaranteeing global optimum values. Simulated and real data experiments including a mixture normal model and nonlinear Bayesian model indicate that the proposed algorithm can significantly outperform other approximated algorithms, including simulated annealing and the quasi-Newton method.
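The multiple-try Metropolis step that the proposed annealing algorithm builds on can be sketched as follows: draw several candidates, select one with probability proportional to its weight, then correct with a reference set drawn around the selected point. The annealing variant would apply the same step to the target raised to the power 1/T with a decreasing temperature T. All tuning values below are illustrative, not from the paper.

```python
# One multiple-try Metropolis (MTM) step with symmetric Gaussian proposals,
# plus a short chain run on a standard-normal target. Illustrative sketch.
import math
import random

def mtm_step(x, log_target, step, k, rng):
    """One MTM step: pick among k candidates by weight, then correct."""
    ys = [x + rng.gauss(0.0, step) for _ in range(k)]
    wy = [math.exp(log_target(y)) for y in ys]
    y = rng.choices(ys, weights=wy)[0]          # select proportional to weight
    # Reference set: k - 1 fresh points around y, plus the current state.
    xs = [y + rng.gauss(0.0, step) for _ in range(k - 1)] + [x]
    wx = [math.exp(log_target(z)) for z in xs]
    alpha = min(1.0, sum(wy) / sum(wx))         # generalized acceptance ratio
    return y if rng.random() < alpha else x

rng = random.Random(5)
x, chain_mtm = 0.0, []
for _ in range(4_000):
    x = mtm_step(x, lambda t: -0.5 * t * t, step=1.0, k=5, rng=rng)
    chain_mtm.append(x)
m_mtm = sum(chain_mtm) / len(chain_mtm)
```

Trying k candidates per iteration allows larger step sizes than plain Metropolis at a comparable acceptance rate, which is what the annealing schedule exploits.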
13

Bhanot, G. "The Metropolis algorithm." Reports on Progress in Physics 51, no. 3 (1988): 429–57. http://dx.doi.org/10.1088/0034-4885/51/3/003.

14

Beichl, I., and F. Sullivan. "The Metropolis Algorithm." Computing in Science & Engineering 2, no. 1 (2000): 65–69. http://dx.doi.org/10.1109/5992.814660.

15

Mengshoel, Ole J., and David E. Goldberg. "The Crowding Approach to Niching in Genetic Algorithms." Evolutionary Computation 16, no. 3 (2008): 315–54. http://dx.doi.org/10.1162/evco.2008.16.3.315.

Abstract:
A wide range of niching techniques have been investigated in evolutionary and genetic algorithms. In this article, we focus on niching using crowding techniques in the context of what we call local tournament algorithms. In addition to deterministic and probabilistic crowding, the family of local tournament algorithms includes the Metropolis algorithm, simulated annealing, restricted tournament selection, and parallel recombinative simulated annealing. We describe an algorithmic and analytical framework which is applicable to a wide range of crowding algorithms. As an example of utilizing this framework, we present and analyze the probabilistic crowding niching algorithm. Like the closely related deterministic crowding approach, probabilistic crowding is fast, simple, and requires no parameters beyond those of classical genetic algorithms. In probabilistic crowding, subpopulations are maintained reliably, and we show that it is possible to analyze and predict how this maintenance takes place. We also provide novel results for deterministic crowding, show how different crowding replacement rules can be combined in portfolios, and discuss population sizing. Our analysis is backed up by experiments that further increase the understanding of probabilistic crowding.
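The probabilistic crowding replacement rule discussed above can be sketched in a few lines: an offspring competes with a similar incumbent and wins with probability f(offspring) / (f(offspring) + f(incumbent)). The fitness function and toy values below are illustrative, not from the article.

```python
# Sketch of a probabilistic crowding tournament. Toy values, not from the
# article; a full algorithm would pair each offspring with its most
# similar parent before running the tournament.
import random

def probabilistic_crowding_replace(incumbent, offspring, fitness, rng):
    """Return the survivor of a probabilistic crowding tournament."""
    fo, fi = fitness(offspring), fitness(incumbent)
    # Win probability proportional to fitness (maximization assumed).
    p_offspring = fo / (fo + fi) if fo + fi > 0 else 0.5
    return offspring if rng.random() < p_offspring else incumbent

# Usage: with fitness 0.8 vs 0.2 the offspring should win about 80% of runs.
rng = random.Random(7)
wins = sum(
    probabilistic_crowding_replace(0.2, 0.8, lambda z: z, rng) == 0.8
    for _ in range(10_000)
)
```

Deterministic crowding would instead always keep the fitter of the two; the probabilistic rule trades some selection pressure for more reliable niche maintenance.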
16

Saraiva, Erlandson, Adriano Suzuki, and Luis Milan. "Bayesian Computational Methods for Sampling from the Posterior Distribution of a Bivariate Survival Model, Based on AMH Copula in the Presence of Right-Censored Data." Entropy 20, no. 9 (2018): 642. http://dx.doi.org/10.3390/e20090642.

Abstract:
In this paper, we study the performance of Bayesian computational methods to estimate the parameters of a bivariate survival model based on the Ali–Mikhail–Haq copula with marginal distributions given by Weibull distributions. The estimation procedure was based on Markov chain Monte Carlo (MCMC) algorithms. We present three versions of the Metropolis–Hastings algorithm: independent Metropolis–Hastings (IMH), random walk Metropolis (RWM) and Metropolis–Hastings with a natural-candidate generating density (MH). Since the creation of a good candidate generating density in IMH and RWM may be difficult, we also describe how to update a parameter of interest using the slice sampling (SS) method. A simulation study was carried out to compare the performances of the IMH, RWM and SS. A comparison was made using the sample root mean square error as an indicator of performance. Results obtained from the simulations show that the SS algorithm is an effective alternative to the IMH and RWM methods when simulating values from the posterior distribution, especially for small sample sizes. We also applied these methods to a real data set.
17

Haario, Heikki, Eero Saksman, and Johanna Tamminen. "An Adaptive Metropolis Algorithm." Bernoulli 7, no. 2 (2001): 223. http://dx.doi.org/10.2307/3318737.

18

Delmas, Jean-François, and Benjamin Jourdain. "Does Waste Recycling Really Improve the Multi-Proposal Metropolis–Hastings Algorithm? An Analysis Based on Control Variates." Journal of Applied Probability 46, no. 4 (2009): 938–59. http://dx.doi.org/10.1017/s0021900200006069.

Abstract:
The waste-recycling Monte Carlo (WRMC) algorithm introduced by physicists is a modification of the (multi-proposal) Metropolis–Hastings algorithm, which makes use of all the proposals in the empirical mean, whereas the standard (multi-proposal) Metropolis–Hastings algorithm uses only the accepted proposals. In this paper we extend the WRMC algorithm to a general control variate technique and exhibit the optimal choice of the control variate in terms of the asymptotic variance. We also give an example which shows that, in contradiction to the intuition of physicists, the WRMC algorithm can have an asymptotic variance larger than that of the Metropolis–Hastings algorithm. However, in the particular case of the Metropolis–Hastings algorithm called the Boltzmann algorithm, we prove that the WRMC algorithm is asymptotically better than the Metropolis–Hastings algorithm. This last property is also true for the multi-proposal Metropolis–Hastings algorithm. In this last framework we consider a linear parametric generalization of WRMC, and we propose an estimator of the explicit optimal parameter using the proposals.
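The waste-recycling idea can be sketched concretely: each iteration contributes both the proposal (weighted by its acceptance probability) and the current state (weighted by the rejection probability) to the empirical mean, instead of only the state the chain lands in. The single-proposal form below, with an illustrative target and tuning, is a toy version of what the paper analyzes.

```python
# Waste-recycling Monte Carlo estimator around a plain random-walk
# Metropolis chain. Illustrative target and tuning values.
import math
import random

def wrmc_mean(log_target, f, x0, step=1.0, n_iter=20_000, rng=None):
    rng = rng or random.Random(4)
    x, lp = x0, log_target(x0)
    total = 0.0
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)
        lp_y = log_target(y)
        a = min(1.0, math.exp(lp_y - lp))        # acceptance probability
        # Waste-recycled contribution: the rejected candidate is used too.
        total += a * f(y) + (1.0 - a) * f(x)
        if rng.random() < a:                     # ordinary Metropolis move
            x, lp = y, lp_y
    return total / n_iter

# Usage: estimate E[X^2] = 1 under a standard-normal target.
est = wrmc_mean(lambda t: -0.5 * t * t, lambda t: t * t, 0.0)
```

The paper's point is that this recycling does not automatically reduce the asymptotic variance; the control-variate view makes the comparison precise.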
19

Delmas, Jean-François, and Benjamin Jourdain. "Does Waste Recycling Really Improve the Multi-Proposal Metropolis–Hastings Algorithm? An Analysis Based on Control Variates." Journal of Applied Probability 46, no. 4 (2009): 938–59. http://dx.doi.org/10.1239/jap/1261670681.

Abstract:
The waste-recycling Monte Carlo (WRMC) algorithm introduced by physicists is a modification of the (multi-proposal) Metropolis–Hastings algorithm, which makes use of all the proposals in the empirical mean, whereas the standard (multi-proposal) Metropolis–Hastings algorithm uses only the accepted proposals. In this paper we extend the WRMC algorithm to a general control variate technique and exhibit the optimal choice of the control variate in terms of the asymptotic variance. We also give an example which shows that, in contradiction to the intuition of physicists, the WRMC algorithm can have an asymptotic variance larger than that of the Metropolis–Hastings algorithm. However, in the particular case of the Metropolis–Hastings algorithm called the Boltzmann algorithm, we prove that the WRMC algorithm is asymptotically better than the Metropolis–Hastings algorithm. This last property is also true for the multi-proposal Metropolis–Hastings algorithm. In this last framework we consider a linear parametric generalization of WRMC, and we propose an estimator of the explicit optimal parameter using the proposals.
20

BAILLIE, CLIVE F. "LATTICE SPIN MODELS AND NEW ALGORITHMS — A REVIEW OF MONTE CARLO COMPUTER SIMULATIONS." International Journal of Modern Physics C 1, no. 1 (1990): 91–117. http://dx.doi.org/10.1142/s0129183190000050.

Abstract:
We review Monte Carlo computer simulations of spin models, both discrete and continuous. We explain the phenomenon of critical slowing down, which seriously degrades the efficiency of standard local Monte Carlo algorithms such as the Metropolis algorithm near phase transitions. We then go on to describe in detail the new algorithms which ameliorate the problem of critical slowing down, and give their dynamical critical exponent values.
21

Jarner, Søren F., and Wai Kong Yuen. "Conductance bounds on the L2 convergence rate of Metropolis algorithms on unbounded state spaces." Advances in Applied Probability 36, no. 1 (2004): 243–66. http://dx.doi.org/10.1017/s0001867800012957.

Abstract:
In this paper we derive bounds on the conductance and hence on the spectral gap of a Metropolis algorithm with a monotone, log-concave target density on an interval of ℝ. We show that the minimal conductance set has measure ½ and we use this characterization to bound the conductance in terms of the conductance of the algorithm restricted to a smaller domain. Whereas previous work on conductance has resulted in good bounds for Markov chains on bounded domains, this is the first conductance bound applicable to unbounded domains. We then show how this result can be combined with the state-decomposition theorem of Madras and Randall (2002) to bound the spectral gap of Metropolis algorithms with target distributions with monotone, log-concave tails on ℝ.
22

Jarner, Søren F., and Wai Kong Yuen. "Conductance bounds on the L2 convergence rate of Metropolis algorithms on unbounded state spaces." Advances in Applied Probability 36, no. 1 (2004): 243–66. http://dx.doi.org/10.1239/aap/1077134472.

Abstract:
In this paper we derive bounds on the conductance and hence on the spectral gap of a Metropolis algorithm with a monotone, log-concave target density on an interval of ℝ. We show that the minimal conductance set has measure ½ and we use this characterization to bound the conductance in terms of the conductance of the algorithm restricted to a smaller domain. Whereas previous work on conductance has resulted in good bounds for Markov chains on bounded domains, this is the first conductance bound applicable to unbounded domains. We then show how this result can be combined with the state-decomposition theorem of Madras and Randall (2002) to bound the spectral gap of Metropolis algorithms with target distributions with monotone, log-concave tails on ℝ.
23

Pawig, S. Große, and K. Pinn. "Monte Carlo Algorithms for the Fully Frustrated XY Model." International Journal of Modern Physics C 9, no. 5 (1998): 727–36. http://dx.doi.org/10.1142/s0129183198000637.

Abstract:
We investigate local update algorithms for the fully frustrated XY model on a square lattice. In addition to the standard updating procedures like the Metropolis or heat bath algorithm we include overrelaxation sweeps, implemented through single spin updates that preserve the energy of the configuration. The dynamical critical exponent (of order two) stays more or less unchanged. However, the integrated autocorrelation times of the algorithm can be significantly reduced.
24

Jarner, Søren Fiig, and Ernst Hansen. "Geometric ergodicity of Metropolis algorithms." Stochastic Processes and their Applications 85, no. 2 (2000): 341–61. http://dx.doi.org/10.1016/s0304-4149(99)00082-4.

25

Nemeth, Christopher, Chris Sherlock, and Paul Fearnhead. "Particle Metropolis-adjusted Langevin algorithms." Biometrika 103, no. 3 (2016): 701–17. http://dx.doi.org/10.1093/biomet/asw020.

26

Eidsvik, Jo, and Håkon Tjelmeland. "On directional Metropolis–Hastings algorithms." Statistics and Computing 16, no. 1 (2006): 93–106. http://dx.doi.org/10.1007/s11222-006-5536-2.

27

Baffoun, Hatem, Mekki Hajlaoui, and Abdeljelil Farhat. "On Estimating Non-standard Discrete Distributions Using Adaptive MCMC Methods." International Journal of Statistics and Probability 7, no. 3 (2018): 1. http://dx.doi.org/10.5539/ijsp.v7n3p1.

Abstract:
In this paper, we compare empirically the performance of some adaptive MCMC methods, that is, Adaptive Metropolis (AM) algorithm, Single Component Adaptive Metropolis (SCAM) algorithm and Delayed Rejection Adaptive Metropolis (DRAM) algorithm. The context is the simulation of non-standard discrete distributions. The performance criterion used is the precision of the frequency estimator. An application to a Bayesian hypothesis testing problem shows the superiority of the DRAM algorithm over the other considered sampling schemes.
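The Adaptive Metropolis (AM) idea compared in this study tunes the proposal from the chain's own history. A 1-D sketch follows, using the running variance of past states scaled by the classic 2.38^2 factor; the target, burn-in length and jitter constant are illustrative choices, not those of the paper (SCAM adapts one coordinate at a time, and DRAM adds a delayed-rejection stage on top of this scheme).

```python
# 1-D sketch of an Adaptive Metropolis sampler: the proposal variance is
# tuned from the running variance of the chain's history. Illustrative
# target and constants.
import math
import random

def adaptive_metropolis(log_target, x0, n_iter=10_000, t0=200, rng=None):
    rng = rng or random.Random(3)
    x, lp = x0, log_target(x0)
    mean, m2, n = x0, 0.0, 1         # Welford running mean / sum of squares
    chain = []
    for t in range(n_iter):
        # Fixed variance during burn-in, adapted empirical variance after.
        var = 1.0 if t < t0 else (2.38 ** 2) * (m2 / (n - 1)) + 1e-6
        y = x + rng.gauss(0.0, math.sqrt(var))
        lp_y = log_target(y)
        if math.log(rng.random()) < lp_y - lp:
            x, lp = y, lp_y
        chain.append(x)
        n += 1                        # update running moments with new state
        d = x - mean
        mean += d / n
        m2 += d * (x - mean)
    return chain

chain_am = adaptive_metropolis(lambda x: -0.5 * x * x, 0.0)
m_am = sum(chain_am) / len(chain_am)
v_am = sum((c - m_am) ** 2 for c in chain_am) / len(chain_am)
```

The small jitter term keeps the proposal variance positive even if the chain has barely moved, which is what the original AM construction also guards against.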
28

Chib, Siddhartha, and Edward Greenberg. "Understanding the Metropolis-Hastings Algorithm." American Statistician 49, no. 4 (1995): 327. http://dx.doi.org/10.2307/2684568.

29

Chib, Siddhartha, and Edward Greenberg. "Understanding the Metropolis-Hastings Algorithm." American Statistician 49, no. 4 (1995): 327–35. http://dx.doi.org/10.1080/00031305.1995.10476177.

30

Yung, M. H., and A. Aspuru-Guzik. "A quantum-quantum Metropolis algorithm." Proceedings of the National Academy of Sciences 109, no. 3 (2012): 754–59. http://dx.doi.org/10.1073/pnas.1111758109.

31

Cheon, Soo-Young, and Hee-Chan Lee. "Metropolis-Hastings Expectation Maximization Algorithm for Incomplete Data." Korean Journal of Applied Statistics 25, no. 1 (2012): 183–96. http://dx.doi.org/10.5351/kjas.2012.25.1.183.

32

Wang, Chen, Yi Wang, Kesheng Wang, Yao Dong, and Yang Yang. "An Improved Hybrid Algorithm Based on Biogeography/Complex and Metropolis for Many-Objective Optimization." Mathematical Problems in Engineering 2017 (2017): 1–14. http://dx.doi.org/10.1155/2017/2462891.

Abstract:
It is extremely important to maintain a balance between convergence and diversity in many-objective evolutionary algorithms. The original BBO algorithm can usually guarantee convergence to the optimal solution given enough generations, and the Biogeography/Complex (BBO/Complex) algorithm uses within-subsystem migration and cross-subsystem migration to preserve the convergence and diversity of the population. However, as the number of objectives increases, the performance of the algorithm decreases significantly. In this paper, a novel method for many-objective optimization called Hmp/BBO (Hybrid Metropolis Biogeography/Complex Based Optimization) is proposed. A new decomposition method is adopted and the PBI function is put in place to improve the quality of the solution. In within-subsystem migration, inferior migrating islands are not chosen unless they pass the Metropolis criterion. With this restriction, a uniformly distributed Pareto set can be obtained. In addition, through the above-mentioned method, the algorithm's running time is kept under control. Experimental results on benchmark functions demonstrate the superiority of the proposed algorithm in comparison with five state-of-the-art designs in terms of both convergence and diversity.
33

Fort, G., E. Moulines, G. O. Roberts, and J. S. Rosenthal. "On the geometric ergodicity of hybrid samplers." Journal of Applied Probability 40, no. 1 (2003): 123–46. http://dx.doi.org/10.1017/s0021900200022300.

Abstract:
In this paper, we consider the random-scan symmetric random walk Metropolis algorithm (RSM) on ℝ^d. This algorithm performs a Metropolis step on just one coordinate at a time (as opposed to the full-dimensional symmetric random walk Metropolis algorithm, which proposes a transition on all coordinates at once). We present various sufficient conditions implying V-uniform ergodicity of the RSM when the target density decreases either subexponentially or exponentially in the tails.
34

Fort, G., E. Moulines, G. O. Roberts, and J. S. Rosenthal. "On the geometric ergodicity of hybrid samplers." Journal of Applied Probability 40, no. 1 (2003): 123–46. http://dx.doi.org/10.1239/jap/1044476831.

Abstract:
In this paper, we consider the random-scan symmetric random walk Metropolis algorithm (RSM) on ℝ^d. This algorithm performs a Metropolis step on just one coordinate at a time (as opposed to the full-dimensional symmetric random walk Metropolis algorithm, which proposes a transition on all coordinates at once). We present various sufficient conditions implying V-uniform ergodicity of the RSM when the target density decreases either subexponentially or exponentially in the tails.
35

Siltala, L., and M. Granvik. "Asteroid mass estimation with the robust adaptive Metropolis algorithm." Astronomy & Astrophysics 633 (January 2020): A46. http://dx.doi.org/10.1051/0004-6361/201935608.

Abstract:
Context. The bulk density of an asteroid informs us about its interior structure and composition. To constrain the bulk density, one needs an estimated mass of the asteroid. The mass is estimated by analyzing an asteroid’s gravitational interaction with another object, such as another asteroid during a close encounter. An estimate for the mass has typically been obtained with linearized least-squares methods, despite the fact that this family of methods is not able to properly describe non-Gaussian parameter distributions. In addition, the uncertainties reported for asteroid masses in the literature are sometimes inconsistent with each other and are suspected to be unrealistically low. Aims. We aim to present a Markov-chain Monte Carlo (MCMC) algorithm for the asteroid mass estimation problem based on asteroid-asteroid close encounters. We verify that our algorithm works correctly by applying it to synthetic data sets. We use astrometry available through the Minor Planet Center to estimate masses for a select few example cases and compare our results with results reported in the literature. Methods. Our mass-estimation method is based on the robust adaptive Metropolis algorithm that has been implemented into the OpenOrb asteroid orbit computation software. Our method has the built-in capability to analyze multiple perturbing asteroids and test asteroids simultaneously. Results. We find that our mass estimates for the synthetic data sets are fully consistent with the ground truth. The nominal masses for real example cases typically agree with the literature but tend to have greater uncertainties than what is reported in recent literature. Possible reasons for this include different astrometric data sets and weights, different test asteroids, different force models or different algorithms. For (16) Psyche, the target of NASA’s Psyche mission, our maximum likelihood mass is approximately 55% of what is reported in the literature. Such a low mass would imply that the bulk density is significantly lower than previously expected and thus disagrees with the theory of (16) Psyche being the metallic core of a protoplanet. We do, however, note that masses reported in recent literature remain within our 3-sigma limits. Conclusions. The new MCMC mass-estimation algorithm performs as expected, but a rigorous comparison with results from a least-squares algorithm with the exact same data set remains to be done. The matters of uncertainties in comparison with other algorithms and correlations of observations also warrant further investigation.
36

Müller, Christian, Fabian Weysser, Thomas Mrziglod, and Andreas Schuppert. "Markov-Chain Monte-Carlo methods and non-identifiabilities." Monte Carlo Methods and Applications 24, no. 3 (2018): 203–14. http://dx.doi.org/10.1515/mcma-2018-0018.

Abstract:
We consider the problem of sampling from high-dimensional likelihood functions with large amounts of non-identifiabilities via Markov-Chain Monte-Carlo algorithms. Non-identifiabilities are problematic for commonly used proposal densities, leading to a low effective sample size. To address this problem, we introduce a regularization method using an artificial prior, which restricts non-identifiable parts of the likelihood function. This enables us to sample the posterior using common MCMC methods more efficiently. We demonstrate this with three MCMC methods on a likelihood based on a complex, high-dimensional blood coagulation model and a single series of measurements. By using the approximation of the artificial prior for the non-identifiable directions, we obtain a sample quality criterion. Unlike other sample quality criteria, it is valid even for short chain lengths. We use the criterion to compare the following three MCMC variants: the Random Walk Metropolis Hastings, the Adaptive Metropolis Hastings and the Metropolis adjusted Langevin algorithm.
APA, Harvard, Vancouver, ISO, and other styles
37

Lenin, K. "A NOVEL HYBRIDIZED ALGORITHM FOR REDUCTION OF REAL POWER LOSS." International Journal of Research -GRANTHAALAYAH 5, no. 11 (2017): 316–24. http://dx.doi.org/10.29121/granthaalayah.v5.i11.2017.2358.

Full text
Abstract:
This paper proposes a hybridization of the Gravitational Search Algorithm with Simulated Annealing (HGS) for solving the optimal reactive power problem. The individual position update strategy in the Gravitational Search Algorithm (GSA) may damage individual positions, and the local search capability of GSA is weak. The new HGS algorithm introduces the idea of Simulated Annealing (SA) into GSA: it adopts a Metropolis-principle-based individual position update strategy to improve the particle moves, and after the gravitation operation, a Simulated Annealing step is applied to the optimal individual. To evaluate the efficiency of the proposed HGS algorithm, it has been tested on the standard IEEE 118-bus and a practical 191-bus test system and compared to standard reported algorithms. Simulation results show that HGS is superior to the other algorithms in reducing real power loss while keeping voltage profiles within limits.
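The Metropolis principle that the hybrid borrows from Simulated Annealing can be sketched on a toy objective. This is a generic illustration, not the reactive-power formulation; the cooling schedule, neighborhood, and function names are assumptions:

```python
import math
import random

def metropolis_accept(delta, temperature):
    """Metropolis criterion: always accept improvements; accept a
    worse move with probability exp(-delta / T)."""
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, n_steps=2000):
    x, c = x0, cost(x0)
    best, best_c = x, c
    t = t0
    for _ in range(n_steps):
        y = neighbor(x)
        cy = cost(y)
        if metropolis_accept(cy - c, t):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
        t *= cooling              # geometric cooling schedule
    return best, best_c

# Toy example: minimise f(x) = x^2 starting far from the optimum
random.seed(1)
best, best_c = simulated_annealing(lambda x: x * x,
                                   lambda x: x + random.uniform(-1.0, 1.0),
                                   10.0)
```

Allowing occasional uphill moves at high temperature is exactly what gives the hybrid the local-search escape capability that plain GSA lacks.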
APA, Harvard, Vancouver, ISO, and other styles
38

MALAKIS, A., S. S. MARTINOS, I. A. HADJIAGAPIOU, and A. S. PERATZAKIS. "ON THE WANG–LANDAU METHOD USING THE N-FOLD WAY." International Journal of Modern Physics C 15, no. 05 (2004): 729–40. http://dx.doi.org/10.1142/s0129183104006182.

Full text
Abstract:
We present a variation of the N-fold way algorithm, which improves efficiency when one combines the Wang–Landau method with the N-fold way. It is shown that the new version of the N-fold way algorithm has good performance when used for importance sampling and compared with the usual N-fold version or the Metropolis algorithm. The new N-fold algorithm combined with the Wang–Landau method is applied to the square Ising model using a multi-range approach. A comparative study is presented for all these algorithms, Wang–Landau and the two combined versions with the N-fold way. The role of boundary effects is discussed.
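The basic Wang–Landau iteration (without the N-fold-way refinement the paper proposes) can be sketched for a small square-lattice Ising model. The lattice size, sweep counts, and the omission of the usual histogram-flatness check are simplifying assumptions:

```python
import math
import random

def wang_landau_ising(L=4, n_stages=12, steps_per_stage=20000, seed=0):
    """Estimate ln g(E) for the L x L Ising model (periodic boundaries)
    with plain single-spin-flip Wang-Landau updates. The modification
    factor ln f is simply halved each stage; a production code would
    instead check the flatness of the visit histogram."""
    random.seed(seed)
    spins = [[1] * L for _ in range(L)]

    def site_energy(i, j):
        s = spins[i][j]
        return -s * (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                     + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])

    # Each bond is counted twice in the site sums, hence the division by 2.
    E = sum(site_energy(i, j) for i in range(L) for j in range(L)) // 2
    ln_g = {}
    ln_f = 1.0
    for _ in range(n_stages):
        for _ in range(steps_per_stage):
            i, j = random.randrange(L), random.randrange(L)
            dE = -2 * site_energy(i, j)          # energy change of flipping (i, j)
            E_new = E + dE
            # Accept with probability min(1, g(E) / g(E_new))
            if math.log(random.random()) < ln_g.get(E, 0.0) - ln_g.get(E_new, 0.0):
                spins[i][j] *= -1
                E = E_new
            ln_g[E] = ln_g.get(E, 0.0) + ln_f    # penalise the visited level
        ln_f /= 2.0
    return ln_g

ln_g = wang_landau_ising()
```

The N-fold-way idea discussed in the paper replaces the naive single-flip proposal above with class-based moves, which is what improves efficiency at the rarely visited energy levels.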
APA, Harvard, Vancouver, ISO, and other styles
39

JIANG, MINGHUI, and BINHAI ZHU. "PROTEIN FOLDING ON THE HEXAGONAL LATTICE IN THE HP MODEL." Journal of Bioinformatics and Computational Biology 03, no. 01 (2005): 19–34. http://dx.doi.org/10.1142/s0219720005000850.

Full text
Abstract:
In this paper, we introduce the 2D hexagonal lattice as a biologically meaningful alternative to the standard square lattice for the study of protein folding in the HP model. We show that the hexagonal lattice alleviates the "sharp turn" problem and models certain aspects of the protein secondary structure more realistically. We present a ⅙-approximation and a clustering heuristic for protein folding on the hexagonal lattice. In addition to these two algorithms, we also implement a Monte Carlo Metropolis algorithm and a branch-and-bound partial enumeration algorithm, and conduct experiments to compare their effectiveness.
APA, Harvard, Vancouver, ISO, and other styles
40

Stoyan, Dietrich. "SIMULATION AND CHARACTERIZATION OF RANDOM SYSTEMS OF HARD PARTICLES." Image Analysis & Stereology 21, no. 4 (2011): 41. http://dx.doi.org/10.5566/ias.v21.ps41-s48.

Full text
Abstract:
This paper surveys methods for the simulation of random systems of hard particles, namely sedimentation and collective rearrangement algorithms, molecular dynamics, and Monte Carlo methods such as the Metropolis–Hastings algorithm. Furthermore, some set-theoretic statistical characteristics are discussed: the covariance and topological descriptors such as specific connectivity numbers and Mecke's morphological functions.
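For hard particles, the Metropolis–Hastings rule degenerates to rejecting any move that creates an overlap, since the Boltzmann weight is 1 for admissible configurations and 0 otherwise. A minimal hard-disk sketch follows; the box size, radius, step size, and function names are illustrative assumptions, not from the survey:

```python
import random

def hard_disk_mc(positions, box, radius, n_steps=5000, max_disp=0.2, seed=0):
    """Metropolis Monte Carlo for hard disks in a periodic square box:
    a trial displacement is accepted iff it creates no overlap."""
    random.seed(seed)
    pos = [list(p) for p in positions]
    n = len(pos)

    def overlaps(i, x, y):
        for j in range(n):
            if j == i:
                continue
            dx = (x - pos[j][0] + box / 2) % box - box / 2   # minimum image
            dy = (y - pos[j][1] + box / 2) % box - box / 2
            if dx * dx + dy * dy < (2 * radius) ** 2:
                return True
        return False

    accepted = 0
    for _ in range(n_steps):
        i = random.randrange(n)
        x = (pos[i][0] + random.uniform(-max_disp, max_disp)) % box
        y = (pos[i][1] + random.uniform(-max_disp, max_disp)) % box
        if not overlaps(i, x, y):
            pos[i] = [x, y]
            accepted += 1
    return pos, accepted / n_steps

# 16 disks started on a dilute square grid
grid = [[1.0 + 2.0 * i, 1.0 + 2.0 * j] for i in range(4) for j in range(4)]
pos, rate = hard_disk_mc(grid, box=8.0, radius=0.5)
```

Unlike the sedimentation and rearrangement algorithms the survey also covers, this chain samples the equilibrium hard-disk ensemble rather than constructing a single dense packing.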
APA, Harvard, Vancouver, ISO, and other styles
41

Chauveau, Didier, and Pierre Vandekerkhove. "Algorithmes de Hastings–Metropolis en interaction." Comptes Rendus de l'Académie des Sciences - Series I - Mathematics 333, no. 9 (2001): 881–84. http://dx.doi.org/10.1016/s0764-4442(01)02147-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Chen, Pei-de. "Hastings-Metropolis algorithms and reference measures." Statistics & Probability Letters 38, no. 4 (1998): 323–28. http://dx.doi.org/10.1016/s0167-7152(98)00040-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Wang, Jin Hua, and Bo Yuan. "Implementation of Nonreversible Metropolis-Hastings algorithms." Journal of Physics: Conference Series 1087 (September 2018): 022004. http://dx.doi.org/10.1088/1742-6596/1087/2/022004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Cai, Bo, Renate Meyer, and François Perron. "Metropolis–Hastings algorithms with adaptive proposals." Statistics and Computing 18, no. 4 (2008): 421–33. http://dx.doi.org/10.1007/s11222-008-9051-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Ren, Ruichao, C. J. O’Keeffe, and G. Orkoulas. "Sequential Metropolis Algorithms for Fluid Simulations." International Journal of Thermophysics 28, no. 2 (2007): 520–35. http://dx.doi.org/10.1007/s10765-007-0193-z.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Jerrum, Mark, and Gregory B. Sorkin. "The Metropolis algorithm for graph bisection." Discrete Applied Mathematics 82, no. 1-3 (1998): 155–75. http://dx.doi.org/10.1016/s0166-218x(97)00133-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Owen, A. B., and S. D. Tribble. "A quasi-Monte Carlo Metropolis algorithm." Proceedings of the National Academy of Sciences 102, no. 25 (2005): 8844–49. http://dx.doi.org/10.1073/pnas.0409596102.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Diaconis, Persi, and J. W. Neuberger. "Numerical Results for the Metropolis Algorithm." Experimental Mathematics 13, no. 2 (2004): 207–13. http://dx.doi.org/10.1080/10586458.2004.10504534.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Wu, Samuel S., and Martin T. Wells. "An Extension of the Metropolis Algorithm." Communications in Statistics - Theory and Methods 34, no. 3 (2005): 585–96. http://dx.doi.org/10.1081/sta-200052116.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Marnissi, Yosra, Emilie Chouzenoux, Amel Benazza-Benyahia, and Jean-Christophe Pesquet. "Majorize–Minimize Adapted Metropolis–Hastings Algorithm." IEEE Transactions on Signal Processing 68 (2020): 2356–69. http://dx.doi.org/10.1109/tsp.2020.2983150.

Full text
APA, Harvard, Vancouver, ISO, and other styles