A selection of scholarly literature on the topic "Non-reversible Markov chain"

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Browse the lists of current articles, books, dissertations, conference abstracts, and other scholarly sources on the topic "Non-reversible Markov chain".

Next to each work in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication in .pdf format and read its abstract online, provided these are available in the metadata.

Journal articles on the topic "Non-reversible Markov chain":

1

Choi, Michael C. H., and Pierre Patie. "Analysis of non-reversible Markov chains via similarity orbits." Combinatorics, Probability and Computing 29, no. 4 (February 18, 2020): 508–36. http://dx.doi.org/10.1017/s0963548320000024.

Abstract:
In this paper we develop an in-depth analysis of non-reversible Markov chains on denumerable state space from a similarity orbit perspective. In particular, we study the class of Markov chains whose transition kernel is in the similarity orbit of a normal transition kernel, such as that of birth–death chains or reversible Markov chains. We start by identifying a set of sufficient conditions for a Markov chain to belong to the similarity orbit of a birth–death chain. As by-products, we obtain a spectral representation in terms of non-self-adjoint resolutions of identity in the sense of Dunford [21] and offer a detailed analysis on the convergence rate, separation cutoff and L2-cutoff of this class of non-reversible Markov chains. We also look into the problem of estimating the integral functionals from discrete observations for this class. In the last part of this paper we investigate a particular similarity orbit of reversible Markov kernels, which we call the pure birth orbit, and analyse various possibly non-reversible variants of classical birth–death processes in this orbit.
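
For readers less familiar with the terminology, reversibility of a finite-state chain is just the detailed-balance condition pi_i P_ij = pi_j P_ji with respect to the stationary distribution pi. The sketch below is a generic illustration of that definition (the 3-state kernel P is an arbitrary example, not taken from the paper): it computes pi numerically and tests whether the probability-flow matrix is symmetric.

```python
import numpy as np

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalised to sum to 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def is_reversible(P, tol=1e-10):
    """Check the detailed-balance condition pi_i P_ij == pi_j P_ji."""
    pi = stationary_distribution(P)
    F = pi[:, None] * P          # probability-flow matrix F_ij = pi_i P_ij
    return np.allclose(F, F.T, atol=tol)

# A 3-state chain with a persistent rotation 1 -> 2 -> 3 -> 1: non-reversible.
P = np.array([[0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8],
              [0.8, 0.1, 0.1]])
print(is_reversible(P))   # False: the flow matrix is not symmetric
```
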
2

Qin, Liang, Philipp Höllmer, and Werner Krauth. "Direction-sweep Markov chains." Journal of Physics A: Mathematical and Theoretical 55, no. 10 (February 16, 2022): 105003. http://dx.doi.org/10.1088/1751-8121/ac508a.

Abstract:
We discuss a non-reversible, lifted Markov-chain Monte Carlo (MCMC) algorithm for particle systems in which the direction of proposed displacements is changed deterministically. This algorithm sweeps through directions analogously to the popular MCMC sweep methods for particle or spin indices. Direction-sweep MCMC can be applied to a wide range of reversible or non-reversible Markov chains, such as the Metropolis algorithm or the event-chain Monte Carlo algorithm. For a single two-dimensional tethered hard-disk dipole, we consider direction-sweep MCMC in the limit where restricted equilibrium is reached among the accessible configurations for a fixed direction before incrementing it. We show rigorously that direction-sweep MCMC leaves the stationary probability distribution unchanged and that it profoundly modifies the Markov-chain trajectory. Long excursions, with persistent rotation in one direction, alternate with long sequences of rapid zigzags resulting in persistent rotation in the opposite direction in the limit of small direction increments. The mapping to a Langevin equation then yields the exact scaling of excursions while the zigzags are described through a non-linear differential equation that is solved exactly. We show that the direction-sweep algorithm can have shorter mixing times than the algorithms with random updates of directions. We point out possible applications of direction-sweep MCMC in polymer physics and in molecular simulation.
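
As a rough illustration of the direction-sweep idea, here is a minimal sketch for a single point particle in a smooth potential, not the authors' hard-disk or event-chain implementation; the potential U, the step size delta and the increment dtheta are arbitrary choices. Each fixed-direction Metropolis kernel preserves exp(-U), so cycling through directions deterministically also leaves the target invariant while making the chain non-reversible as a whole.

```python
import numpy as np

rng = np.random.default_rng(0)

def U(x):
    """Arbitrary smooth 2D potential used only for illustration."""
    return 0.5 * np.dot(x, x) + np.cos(x[0]) * np.sin(x[1])

def direction_sweep_metropolis(x0, n_steps, delta=0.5, dtheta=0.1):
    """Metropolis moves whose proposal direction is incremented deterministically.

    Each fixed-direction kernel satisfies detailed balance with respect to
    exp(-U), so their deterministic composition still leaves exp(-U) invariant,
    even though the resulting sequence of kernels is not reversible.
    """
    x, theta = np.array(x0, dtype=float), 0.0
    samples = []
    for _ in range(n_steps):
        e = np.array([np.cos(theta), np.sin(theta)])   # current sweep direction
        prop = x + rng.uniform(-delta, delta) * e      # symmetric 1D proposal
        if np.log(rng.random()) < U(x) - U(prop):      # Metropolis acceptance
            x = prop
        theta += dtheta                                 # deterministic direction sweep
        samples.append(x.copy())
    return np.array(samples)

chain = direction_sweep_metropolis([0.0, 0.0], n_steps=10_000)
print(chain.mean(axis=0))
```
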
3

Höllmer, Philipp, A. C. Maggs, and Werner Krauth. "Hard-disk dipoles and non-reversible Markov chains." Journal of Chemical Physics 156, no. 8 (February 28, 2022): 084108. http://dx.doi.org/10.1063/5.0080101.

Abstract:
We benchmark event-chain Monte Carlo (ECMC) algorithms for tethered hard-disk dipoles in two dimensions in view of application of ECMC to water models in molecular simulation. We characterize the rotation dynamics of dipoles through the integrated autocorrelation times of the polarization. The non-reversible straight, reflective, forward, and Newtonian ECMC algorithms are all event-driven and only move a single hard disk at any time. They differ only in their update rules at event times. We show that they realize considerable speedups with respect to the local reversible Metropolis algorithm with single-disk moves. We also find significant speed differences among the ECMC variants. Newtonian ECMC appears particularly well-suited for overcoming the dynamical arrest that has plagued straight ECMC for three-dimensional dipolar models with Coulomb interactions.
4

Dobson, P., I. Fursov, G. Lord, and M. Ottobre. "Reversible and non-reversible Markov chain Monte Carlo algorithms for reservoir simulation problems." Computational Geosciences 24, no. 3 (March 13, 2020): 1301–13. http://dx.doi.org/10.1007/s10596-020-09947-4.

5

Vialaret, Marie, and Florian Maire. "On the Convergence Time of Some Non-Reversible Markov Chain Monte Carlo Methods." Methodology and Computing in Applied Probability 22, no. 3 (February 15, 2020): 1349–87. http://dx.doi.org/10.1007/s11009-019-09766-w.

6

Yang, Shangze, Di Xiao, Xuesong Li, and Zhen Ma. "Markov Chain Investigation of Discretization Schemes and Computational Cost Reduction in Modeling Photon Multiple Scattering." Applied Sciences 8, no. 11 (November 19, 2018): 2288. http://dx.doi.org/10.3390/app8112288.

Abstract:
Establishing fast and reversible photon multiple scattering algorithms remains a modeling challenge for optical diagnostics and noise reduction purposes, especially when the scattering happens within the intermediate scattering regime. Previous work has proposed and verified a Markov chain approach for modeling photon multiple scattering phenomena through turbid slabs. The fidelity of the Markov chain method has been verified through detailed comparison with Monte Carlo models. However, further improvement to the Markov chain method is still required to improve its performance in studying multiple scattering. The present research discussed the efficacy of non-uniform discretization schemes and analyzed errors induced by different schemes. The current work also proposed an iterative approach as an alternative to directly carrying out matrix inversion manipulations, which would significantly reduce the computational costs. The benefits of utilizing non-uniform discretization schemes and the iterative approach were confirmed and verified by comparing the results to a Monte Carlo simulation.
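
The iterative alternative to direct matrix inversion mentioned in the abstract can be illustrated with a small fixed-point sketch (generic, not the paper's discretization; the matrix A and source vector s are random placeholders): the multiply-scattered contribution x = s + A s + A^2 s + ... solves x = s + A x and can be accumulated with matrix-vector products only, provided the spectral radius of A is below one.

```python
import numpy as np

def neumann_solve(A, s, tol=1e-12, max_iter=10_000):
    """Accumulate x = s + A s + A^2 s + ... instead of forming (I - A)^{-1} s.

    Converges whenever the spectral radius of A is below 1, and only needs
    matrix-vector products, which is the main saving over a direct inverse.
    """
    x = s.copy()
    term = s.copy()
    for _ in range(max_iter):
        term = A @ term
        x += term
        if np.linalg.norm(term) < tol:
            break
    return x

rng = np.random.default_rng(1)
n = 200
A = 0.4 * rng.random((n, n)) / n   # placeholder sub-stochastic scattering matrix
s = rng.random(n)                  # placeholder single-scattering source term

x_iter = neumann_solve(A, s)
x_direct = np.linalg.solve(np.eye(n) - A, s)
print(np.max(np.abs(x_iter - x_direct)))   # agreement up to the tolerance
```
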
7

Kijima, Masaaki. "On passage and conditional passage times for Markov chains in continuous time." Journal of Applied Probability 25, no. 2 (June 1988): 279–90. http://dx.doi.org/10.2307/3214436.

Abstract:
Let X(t) be a temporally homogeneous irreducible Markov chain in continuous time defined on . For k < i < j, let H = {k + 1, ···, j − 1} and let kTij (jTik) be the upward (downward) conditional first-passage time of X(t) from i to j(k) given no visit to . These conditional passage times are studied through first-passage times of a modified chain HX(t) constructed by making the set of states absorbing. It will be shown that the densities of kTij and jTik for any birth-death process are unimodal and the modes kmij (jmik) of the unimodal densities are non-increasing (non-decreasing) with respect to i. Some distribution properties of kTij and jTik for a time-reversible Markov chain are presented. Symmetry among kTij, jTik, and is also discussed, where , and are conditional passage times of the reversed process of X(t).
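
The absorbing-state construction used in this paper to study passage times can be illustrated numerically for mean first-passage times of a small birth–death chain (a generic sketch with arbitrary rates, not the paper's analysis): with generator Q and the target state made absorbing, the expected hitting times t from the remaining states solve Q_sub t = -1.

```python
import numpy as np

def bd_generator(birth, death):
    """Generator of a finite birth-death chain with the given rates."""
    n = len(birth) + 1
    Q = np.zeros((n, n))
    for i in range(n - 1):
        Q[i, i + 1] = birth[i]      # rate up from state i
        Q[i + 1, i] = death[i]      # rate down from state i + 1
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def mean_first_passage_times(Q, target):
    """Expected hitting times of `target`, via the absorbing modification.

    Delete the row and column of the target state; the remaining times solve
    Q_sub @ t = -1 (standard first-step analysis).
    """
    keep = [i for i in range(Q.shape[0]) if i != target]
    Q_sub = Q[np.ix_(keep, keep)]
    t = np.linalg.solve(Q_sub, -np.ones(len(keep)))
    return dict(zip(keep, t))

# Arbitrary illustrative rates for states 0..4.
Q = bd_generator(birth=[1.0, 1.2, 0.8, 0.5], death=[0.7, 0.9, 1.1, 0.6])
print(mean_first_passage_times(Q, target=4))
```
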
8

Kijima, Masaaki. "On passage and conditional passage times for Markov chains in continuous time." Journal of Applied Probability 25, no. 02 (June 1988): 279–90. http://dx.doi.org/10.1017/s0021900200040924.

Abstract:
Let X(t) be a temporally homogeneous irreducible Markov chain in continuous time defined on . For k < i < j, let H = {k + 1, ···, j − 1} and let kTij (jTik) be the upward (downward) conditional first-passage time of X(t) from i to j(k) given no visit to . These conditional passage times are studied through first-passage times of a modified chain HX(t) constructed by making the set of states absorbing. It will be shown that the densities of kTij and jTik for any birth-death process are unimodal and the modes kmij (jmik) of the unimodal densities are non-increasing (non-decreasing) with respect to i. Some distribution properties of kTij and jTik for a time-reversible Markov chain are presented. Symmetry among kTij, jTik, and is also discussed, where , and are conditional passage times of the reversed process of X(t).
9

Vermolen, Fred, and Ilkka Pölönen. "Uncertainty quantification on a spatial Markov-chain model for the progression of skin cancer." Journal of Mathematical Biology 80, no. 3 (December 19, 2019): 545–73. http://dx.doi.org/10.1007/s00285-019-01367-y.

Abstract:
A spatial Markov-chain model is formulated for the progression of skin cancer. The model is based on the division of the computational domain into nodal points, which can be in a binary state: either in the ‘cancer’ state or in the ‘non-cancer’ state. The model assigns probabilities for the non-reversible transition from the ‘non-cancer’ state to the ‘cancer’ state that depend on the states of the neighbouring nodes. The likelihood of transition further depends on the life burden intensity of the UV-rays that the skin is exposed to. The probabilistic nature of the process and the uncertainty in the input data are assessed by the use of Monte Carlo simulations. A good fit between experiments on mice and our model has been obtained.
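
A minimal sketch of this kind of spatial, non-reversible Markov-chain model follows; it is a generic probabilistic cellular automaton with arbitrary grid size, base rate and neighbour coupling, not the authors' calibrated model. Nodes flip irreversibly from the ‘non-cancer’ state 0 to the ‘cancer’ state 1 with a probability that grows with the number of affected neighbours, and the simulation is repeated to quantify uncertainty.

```python
import numpy as np

rng = np.random.default_rng(2)

def step(state, base_rate=0.001, coupling=0.05):
    """One non-reversible update: 0 -> 1 transitions only, never 1 -> 0."""
    # Number of affected von Neumann neighbours (periodic boundaries for simplicity).
    nbrs = (np.roll(state, 1, 0) + np.roll(state, -1, 0)
            + np.roll(state, 1, 1) + np.roll(state, -1, 1))
    p_flip = base_rate + coupling * nbrs
    flips = (rng.random(state.shape) < p_flip) & (state == 0)
    return np.where(flips, 1, state)

def simulate(n=50, n_steps=200):
    state = np.zeros((n, n), dtype=int)
    state[n // 2, n // 2] = 1          # a single initial lesion
    fractions = []
    for _ in range(n_steps):
        state = step(state)
        fractions.append(state.mean())
    return np.array(fractions)

# Monte Carlo repetitions to quantify the spread of the final affected fraction.
runs = np.array([simulate() for _ in range(20)])
print(runs[:, -1].mean(), runs[:, -1].std())
```
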
10

Tran, Ha, and Kourosh Khoshelham. "Procedural Reconstruction of 3D Indoor Models from Lidar Data Using Reversible Jump Markov Chain Monte Carlo." Remote Sensing 12, no. 5 (March 5, 2020): 838. http://dx.doi.org/10.3390/rs12050838.

Abstract:
Automated reconstruction of Building Information Models (BIMs) from point clouds has been an intensive and challenging research topic for decades. Traditionally, 3D models of indoor environments are reconstructed purely by data-driven methods, which are susceptible to erroneous and incomplete data. Procedural-based methods such as the shape grammar are more robust to uncertainty and incompleteness of the data as they exploit the regularity and repetition of structural elements and architectural design principles in the reconstruction. Nevertheless, these methods are often limited to simple architectural styles: the so-called Manhattan design. In this paper, we propose a new method based on a combination of a shape grammar and a data-driven process for procedural modelling of indoor environments from a point cloud. The core idea behind the integration is to apply a stochastic process based on reversible jump Markov Chain Monte Carlo (rjMCMC) to guide the automated application of grammar rules in the derivation of a 3D indoor model. Experiments on synthetic and real data sets show the applicability of the method to efficiently generate 3D indoor models of both Manhattan and non-Manhattan environments with high accuracy, completeness, and correctness.

Dissertations and theses on the topic "Non-reversible Markov chain":

1

Xu, Jason Qian. "Markov Chain Monte Carlo and Non-Reversible Methods." Thesis, The University of Arizona, 2012. http://hdl.handle.net/10150/244823.

Abstract:
The bulk of Markov chain Monte Carlo applications make use of reversible chains, relying on the Metropolis-Hastings algorithm or similar methods. While reversible chains have the advantage of being relatively easy to analyze, it has been shown that non-reversible chains may outperform them in various scenarios. Neal proposes an algorithm that transforms a general reversible chain into a non-reversible chain with a construction that does not increase the asymptotic variance. These modified chains work to avoid diffusive backtracking behavior which causes Markov chains to be trapped in one position for too long. In this paper, we provide an introduction to MCMC, and discuss the Metropolis algorithm and Neal’s algorithm. We introduce a decaying memory algorithm inspired by Neal’s idea, and then analyze and compare the performance of these chains on several examples.
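
The diffusive backtracking mentioned here can be seen on a toy example. The sketch below is a generic lifted (non-reversible) walk on a ring, not Neal's actual construction: the walker carries a direction variable and reverses it only when a move is rejected, so it circulates through the state space instead of diffusing, while still leaving the target distribution invariant.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
pi = np.exp(np.sin(2 * np.pi * np.arange(n) / n))   # arbitrary smooth target on a ring
pi /= pi.sum()

def reversible_walk(n_steps):
    """Plain Metropolis random walk: proposes +1 or -1 uniformly at random."""
    x, path = 0, []
    for _ in range(n_steps):
        y = (x + rng.choice((-1, 1))) % n
        if rng.random() < min(1.0, pi[y] / pi[x]):
            x = y
        path.append(x)
    return np.array(path)

def lifted_walk(n_steps):
    """Non-reversible lifted walk: keeps a direction, flips it only on rejection."""
    x, sigma, path = 0, 1, []
    for _ in range(n_steps):
        y = (x + sigma) % n
        if rng.random() < min(1.0, pi[y] / pi[x]):
            x = y                 # accepted: keep moving in the same direction
        else:
            sigma = -sigma        # rejected: stay put and reverse direction
        path.append(x)
    return np.array(path)

# Both chains target pi, but the lifted one circulates instead of diffusing,
# so it typically visits the whole ring far sooner.
for name, walk in (("reversible", reversible_walk), ("lifted", lifted_walk)):
    path = walk(5_000)
    print(name, "distinct states visited:", len(np.unique(path)))
```
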
2

Enfroy, Aurélien. "Contributions à la conception, l'étude et la mise en œuvre de méthodes de Monte Carlo par chaîne de Markov appliquées à l'inférence bayésienne." Thesis, Institut polytechnique de Paris, 2022. http://www.theses.fr/2022IPPAS012.

Abstract:
This thesis focuses on the analysis and design of Markov chain Monte Carlo (MCMC) methods used in high-dimensional sampling. It consists of three parts. The first part introduces a new class of Markov chains and MCMC methods. These methods improve existing MCMC samplers by using samples that target a restriction of the original target distribution to a domain chosen by the user. This procedure gives rise to a new chain that takes advantage of the convergence properties of the two underlying processes. In addition to showing that this chain always targets the original measure, we also establish ergodicity properties under weak assumptions on the Markov kernels involved. The second part of the thesis focuses on discretizations of the underdamped Langevin diffusion. As this diffusion cannot in general be computed explicitly, it is classical to consider discretizations. The thesis establishes, for a large class of discretizations, a minorization condition that is uniform in the time step. With additional assumptions on the potential, this shows that the discretizations converge geometrically, in V-norm, to their unique invariant probability measure. The last part studies the unadjusted Langevin algorithm in the case where the gradient of the potential is known only up to a uniformly bounded error. This part provides bounds, in V-norm and in Wasserstein distance, between the iterates of the algorithm with the exact gradient and those with the approximate gradient. To do this, an auxiliary Markov chain that bounds the difference is introduced. It is established that this auxiliary chain converges in distribution to a so-called sticky process, already studied in the literature for the continuous-time version of this problem.
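
The unadjusted Langevin algorithm studied in the last part is the Euler-Maruyama discretization of the Langevin diffusion; a minimal sketch (with an arbitrary one-dimensional double-well potential for illustration, not the setting of the thesis) is:

```python
import numpy as np

rng = np.random.default_rng(4)

def grad_U(x):
    """Gradient of an arbitrary 1D double-well potential U(x) = (x^2 - 1)^2 / 4."""
    return x * (x**2 - 1.0)

def ula(x0, h=0.01, n_steps=100_000):
    """Unadjusted Langevin algorithm: Euler-Maruyama step of the Langevin SDE.

    x_{k+1} = x_k - h * grad_U(x_k) + sqrt(2 h) * xi_k,   xi_k ~ N(0, 1).
    No Metropolis correction is applied, so the chain targets exp(-U) only
    up to a discretization bias that vanishes as h -> 0.
    """
    x = float(x0)
    samples = np.empty(n_steps)
    for k in range(n_steps):
        x = x - h * grad_U(x) + np.sqrt(2.0 * h) * rng.standard_normal()
        samples[k] = x
    return samples

samples = ula(0.0)
print(samples.mean(), samples.var())   # roughly symmetric: mean near 0
```
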
3

Huguet, Guillaume. "Étude d’algorithmes de simulation par chaînes de Markov non réversibles." Thesis, 2020. http://hdl.handle.net/1866/24345.

Abstract:
Markov chain Monte Carlo (MCMC) methods commonly use chains that satisfy the detailed balance condition; such chains are called reversible. Most of the theory developed for MCMC revolves around these particular chains, notably the theorems of Peskun (1973) and Tierney (1998), which order the asymptotic variances of two estimators produced by two different reversible chains. In this thesis we are interested in non-reversible chains, that is, chains that do not satisfy the detailed balance condition. Empirically, such chains often produce estimators with a lower asymptotic variance and/or faster convergence. We present two algorithms that simulate non-reversible chains, the Guided Random Walk (GRW) of Gustafson (1998) and the Discrete Bouncy Particle Sampler (DBPS) of Sherlock and Thiery (2017). For both algorithms, we compare experimentally the asymptotic variance of an estimator with that obtained using the Metropolis-Hastings algorithm. We then present a recent theoretical framework introduced by Andrieu and Livingstone (2019) for ordering the asymptotic variances of a certain class of non-reversible chains, together with their analysis of the GRW. We show that the DBPS fits within this framework and analyse the asymptotic variance of the resulting estimators, showing that the asymptotic variance can decrease when additional proposals are added to the algorithm. Finally, we propose two modifications of the DBPS and present empirical results for a modified DBPS. Throughout the thesis we are interested in chains built from deterministic proposals: we show a general construction of the delayed-rejection algorithm using deterministic proposals and a possible non-reversible counterpart within the framework of Andrieu and Livingstone (2019).
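
Empirical comparisons of asymptotic variances such as those reported in this thesis are often made with a batch-means estimator. The sketch below is a generic utility of that kind (the sqrt(n) batch-size rule and the AR(1) test chain are arbitrary choices, not taken from the thesis):

```python
import numpy as np

def batch_means_asymptotic_variance(fx):
    """Batch-means estimate of the asymptotic variance of the ergodic average.

    Splits f(X_1), ..., f(X_n) into roughly sqrt(n) batches and rescales the
    sample variance of the batch means; a standard, if crude, MCMC diagnostic.
    """
    fx = np.asarray(fx, dtype=float)
    n = len(fx)
    b = int(np.floor(np.sqrt(n)))          # batch size ~ sqrt(n)
    m = n // b                             # number of full batches
    means = fx[: m * b].reshape(m, b).mean(axis=1)
    return b * means.var(ddof=1)

# Toy check: an AR(1) chain whose asymptotic variance is (1 + rho) / (1 - rho).
rng = np.random.default_rng(5)
rho, n = 0.8, 200_000
x = np.empty(n)
x[0] = 0.0
for k in range(1, n):
    x[k] = rho * x[k - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
print(batch_means_asymptotic_variance(x))   # should be close to 9
```

To compare two samplers targeting the same distribution, apply the estimator to f evaluated along each chain and compare the resulting values.
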

Books on the topic "Non-reversible Markov chain":

1

Cheng, Russell. Finite Mixture Models. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198505044.003.0017.

Abstract:
Fitting a finite mixture model when the number of components, k, is unknown can be carried out using the maximum likelihood (ML) method though it is non-standard. Two well-known Bayesian Markov chain Monte Carlo (MCMC) methods are reviewed and compared with ML: the reversible jump method and one using an approximating Dirichlet process. Another Bayesian method, to be called MAPIS, is examined that first obtains point estimates for the component parameters by the maximum a posteriori method for different k and then estimates posterior distributions, including that for k, using importance sampling. MAPIS is compared with ML and the MCMC methods. The MCMC methods produce multimodal posterior parameter distributions in overfitted models. This results in the posterior distribution of k being biased towards high k. It is shown that MAPIS does not suffer from this problem. A simple numerical example is discussed.
2

Cheng, Russell. Finite Mixture Examples; MAPIS Details. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198505044.003.0018.

Abstract:
Two detailed numerical examples are given in this chapter illustrating and comparing mainly the reversible jump Markov chain Monte Carlo (RJMCMC) and the maximum a posteriori/importance sampling (MAPIS) methods. The numerical examples are the well-known galaxy data set with sample size 82, and the Hidalgo stamp issues thickness data with sample size 485. A comparison is made of the estimates obtained by the RJMCMC and MAPIS methods for (i) the posterior k-distribution of the number of components, k, (ii) the predictive finite mixture distribution itself, and (iii) the posterior distributions of the component parameters and weights. The estimates obtained by MAPIS are shown to be more satisfactory and meaningful. Details are given of the practical implementation of MAPIS for five non-normal mixture models, namely: the extreme value, gamma, inverse Gaussian, lognormal, and Weibull. Mathematical details are also given of the acceptance-rejection importance sampling used in MAPIS.

Book chapters on the topic "Non-reversible Markov chain":

1

Fulman, Jason. "Stein’s method and non-reversible Markov chains." In Institute of Mathematical Statistics Lecture Notes - Monograph Series, 66–74. Beachwood, Ohio, USA: Institute of Mathematical Statistics, 2004. http://dx.doi.org/10.1214/lnms/1196283800.


Conference papers on the topic "Non-reversible Markov chain":

1

Gao, Chenjun, Jingjing He, and Xuefei Guan. "A Novel Probability of Detection Assessment Considering Model Uncertainty for Lamb Wave Detection." In 2021 48th Annual Review of Progress in Quantitative Nondestructive Evaluation. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/qnde2021-74014.

Abstract:
Uncertainty in Non-Destructive Evaluation (NDE) arises from many sources, e.g., manufacturing variability, environmental noise, and inadequate measurement devices. The reliability of the NDE measurements is typically quantified by the probability of detection (POD). With the advent and technical developments of the simulation method and computer science, efforts have been devoted to generating and estimating the POD curve for Lamb wave damage detection. However, few studies have been reported on the POD evaluation considering model selection uncertainty. This paper presents a novel POD assessment method incorporating model selection uncertainty for Lamb wave damage detection. By treating the flaw quantification model as a discrete uncertain variable, a hierarchical probabilistic model for Lamb wave POD is formulated in the Bayesian framework. Uncertainties from the model choice, model parameters, and other variables can be explicitly incorporated using the proposed method. The Bayes factor is used to evaluate the performance of models. The posterior distributions of model parameters and the model fusion results are calculated through the Bayesian update using the reversible jump Markov chain Monte Carlo method. A fatigue problem with naturally developed cracks is used to demonstrate the proposed method.
