
Journal articles on the topic 'Markov chains non-homogeneous'

Consult the top 50 journal articles for your research on the topic 'Markov chains non-homogeneous.'

1

Dey, Agnish, and Arunava Mukherjea. "Collapsing of non-homogeneous Markov chains." Statistics & Probability Letters 84 (January 2014): 140–48. http://dx.doi.org/10.1016/j.spl.2013.10.002.

2

Pawłowski, Janusz. "Poisson theorem for non-homogeneous Markov chains." Journal of Applied Probability 26, no. 3 (September 1989): 637–42. http://dx.doi.org/10.2307/3214421.

3

Řezníček, Jan, Martin Kohlík, and Hana Kubátová. "Non-homogeneous hierarchical Continuous Time Markov Chains." Microprocessors and Microsystems 78 (October 2020): 103206. http://dx.doi.org/10.1016/j.micpro.2020.103206.

4

Pawłowski, Janusz. "Poisson theorem for non-homogeneous Markov chains." Journal of Applied Probability 26, no. 03 (September 1989): 637–42. http://dx.doi.org/10.1017/s0021900200038237.

5

Tang, Ying, Weiguo Yang, and Yue Zhang. "THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS." Probability in the Engineering and Informational Sciences 33, no. 2 (April 2, 2018): 161–71. http://dx.doi.org/10.1017/s0269964818000074.

Abstract:
In this paper, we study the strong limit theorem for the relative entropy density rates between two finite asymptotically circular Markov chains. First, we prove some lemmas on which the main result is based. Then, we establish two strong limit theorems for non-homogeneous Markov chains. Finally, we obtain the main result of this paper. As corollaries, we obtain the strong limit theorem for the relative entropy density rates between two finite non-homogeneous Markov chains. We also prove that the relative entropy density rates between two finite non-homogeneous Markov chains are uniformly integrable under some conditions.
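The quantity studied above, the relative entropy density between the laws of two Markov chains along a sample path, can be made concrete with a short sketch. The transition matrices and the path below are invented for illustration; the paper's results concern the n → ∞ limit of exactly this kind of path average.

```python
import math

# Sample relative entropy density between two Markov chain laws P and Q
# along a path x_0, ..., x_n:
#   (1/n) * sum_t log( P[x_t][x_{t+1}] / Q[x_t][x_{t+1}] ).
# The matrices and the path are hypothetical examples.

P = [[0.5, 0.5], [0.5, 0.5]]
Q = [[0.75, 0.25], [0.5, 0.5]]

def relative_entropy_density(path):
    n = len(path) - 1
    s = sum(math.log(P[a][b] / Q[a][b]) for a, b in zip(path, path[1:]))
    return s / n

val = relative_entropy_density([0, 1, 0, 1])
```

Here the three transitions contribute log 2, 0, and log 2, so the density is 2·log 2 / 3.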
6

Hadjiloucas, Demetris. "Stochastic matrix-valued cocycles and non-homogeneous Markov chains." Discrete & Continuous Dynamical Systems - A 17, no. 4 (2007): 731–38. http://dx.doi.org/10.3934/dcds.2007.17.731.

7

Zeifman, A. I. "Quasi-ergodicity for non-homogeneous continuous-time Markov chains." Journal of Applied Probability 26, no. 3 (September 1989): 643–48. http://dx.doi.org/10.2307/3214422.

Abstract:
We consider a non-homogeneous continuous-time Markov chain X(t) with countable state space. Definitions of uniform and strong quasi-ergodicity are introduced. The forward Kolmogorov system for X(t) is considered as a differential equation in the space of sequences l1. Sufficient conditions for uniform quasi-ergodicity are deduced from this equation. We consider conditions of uniform and strong ergodicity in the case of proportional intensities.
8

Zeifman, A. I. "Quasi-ergodicity for non-homogeneous continuous-time Markov chains." Journal of Applied Probability 26, no. 03 (September 1989): 643–48. http://dx.doi.org/10.1017/s0021900200038249.

Abstract:
We consider a non-homogeneous continuous-time Markov chain X(t) with countable state space. Definitions of uniform and strong quasi-ergodicity are introduced. The forward Kolmogorov system for X(t) is considered as a differential equation in the space of sequences l1. Sufficient conditions for uniform quasi-ergodicity are deduced from this equation. We consider conditions of uniform and strong ergodicity in the case of proportional intensities.
9

Bisbas, Antonis. "A Cassels–Schmidt theorem for non-homogeneous Markov chains." Bulletin des Sciences Mathématiques 129, no. 1 (January 2005): 25–37. http://dx.doi.org/10.1016/j.bulsci.2004.06.002.

10

Vassiliou, P. C. G. "On the periodicity of non-homogeneous Markov chains and systems." Linear Algebra and its Applications 471 (April 2015): 654–84. http://dx.doi.org/10.1016/j.laa.2015.01.017.

11

Platis, Agapios, Nikolaos Limnios, and Marc Le Du. "Dependability analysis of systems modeled by non-homogeneous Markov chains." Reliability Engineering & System Safety 61, no. 3 (September 1998): 235–49. http://dx.doi.org/10.1016/s0951-8320(97)00073-2.

12

Golomozyĭ, V. V., and M. V. Kartashov. "Maximal coupling and stability of discrete non-homogeneous Markov chains." Theory of Probability and Mathematical Statistics 91 (February 3, 2016): 17–27. http://dx.doi.org/10.1090/tpms/963.

13

Chattopadhyay, Rita. "Non-homogeneous Markov chains with a finite state space and a Doeblin type theorem." International Journal of Mathematics and Mathematical Sciences 18, no. 2 (1995): 365–70. http://dx.doi.org/10.1155/s0161171295000457.

Abstract:
Doeblin [1] considered some classes of finite state nonhomogeneous Markov chains and studied their asymptotic behavior. Later Cohn [2] considered another class of such Markov chains (not covered earlier) and obtained Doeblin type results. Though this paper does not present the “best possible” results, the method of proof will be of interest to the reader. It is elementary and based on Hajnal's results on products of nonnegative matrices.
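The Hajnal-style argument on products of nonnegative matrices behind such Doeblin-type theorems can be sketched directly: when every factor of a product of stochastic matrices has entries bounded away from zero, the rows of P1 P2 ... Pn draw together (weak ergodicity). The 2×2 family below is a hypothetical example, not taken from the paper.

```python
# Sketch of weak ergodicity for products of stochastic matrices under a
# Doeblin-type positivity condition. The matrices are illustrative only.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def P(n):
    # A non-homogeneous family of 2x2 stochastic matrices with entries
    # uniformly bounded away from zero.
    eps = 0.2 if n % 2 == 0 else 0.3
    return [[1.0 - eps, eps], [eps, 1.0 - eps]]

prod = P(1)
for n in range(2, 41):
    prod = matmul(prod, P(n))

# Maximum difference between the two rows of the 40-fold product.
row_gap = max(abs(prod[0][j] - prod[1][j]) for j in range(2))
```

Each factor contracts the row difference by |1 − 2ε|, so after 40 factors the rows are numerically indistinguishable.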
14

Sung, Minje. "Bayesian Hierarchical Mixed Effects Analysis of Time Non-Homogeneous Markov Chains." Korean Journal of Applied Statistics 27, no. 2 (April 30, 2014): 263–75. http://dx.doi.org/10.5351/kjas.2014.27.2.263.

15

Sethuraman, Sunder, and Srinivasa Varadhan. "A Martingale Proof of Dobrushin's Theorem for Non-Homogeneous Markov Chains." Electronic Journal of Probability 10 (2005): 1221–35. http://dx.doi.org/10.1214/ejp.v10-283.

16

Platis, A., N. Limnios, and M. Le Du. "Performability of electric-power systems modeled by non-homogeneous Markov chains." IEEE Transactions on Reliability 45, no. 4 (1996): 605–10. http://dx.doi.org/10.1109/24.556582.

17

Fleischer, I., and A. Joffe. "Ratio ergodicity for non-homogeneous Markov Chains in general state spaces." Journal of Theoretical Probability 8, no. 1 (January 1995): 31–37. http://dx.doi.org/10.1007/bf02213452.

18

Peligrad, Magda. "Central limit theorem for triangular arrays of non-homogeneous Markov chains." Probability Theory and Related Fields 154, no. 3-4 (June 3, 2011): 409–28. http://dx.doi.org/10.1007/s00440-011-0371-6.

19

Niemiro, Wojciech. "Tail events of simulated annealing Markov chains." Journal of Applied Probability 32, no. 4 (December 1995): 867–76. http://dx.doi.org/10.2307/3215200.

Abstract:
We consider non-homogeneous Markov chains generated by the simulated annealing algorithm. We classify states according to asymptotic properties of trajectories. We identify recurrent and transient states. The set of recurrent states is partitioned into disjoint classes of asymptotically communicating states. These classes correspond to atoms of the tail sigma-field. The results are valid under the weak reversibility assumption of Hajek.
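The non-homogeneous chain generated by simulated annealing can be sketched in a few lines. The energy function, cooling schedule, and step count below are hypothetical choices for illustration; they do not reproduce Hajek's weak reversibility setting.

```python
import math, random

# Sketch of the simulated-annealing Markov chain on the state space
# {0, ..., 9} with a unimodal energy f and a slowly decreasing temperature.
# The chain is non-homogeneous because temp(n) changes with the step index.

def f(x):
    return (x - 3) ** 2            # global minimum at x = 3

def temp(n):
    return 2.0 / math.log(n + 2)   # logarithmic-style cooling (illustrative)

random.seed(0)
x, best = 9, 9
for n in range(5000):
    y = min(9, max(0, x + random.choice((-1, 1))))   # propose a neighbour
    delta = f(y) - f(x)
    # Metropolis acceptance with the time-dependent temperature.
    if delta <= 0 or random.random() < math.exp(-delta / temp(n)):
        x = y
    if f(x) < f(best):
        best = x
```

Because f is unimodal, downhill proposals are always accepted and the chain locates the minimizer; at late times the low temperature makes uphill excursions rare.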
20

Niemiro, Wojciech. "Tail events of simulated annealing Markov chains." Journal of Applied Probability 32, no. 04 (December 1995): 867–76. http://dx.doi.org/10.1017/s0021900200103341.

Abstract:
We consider non-homogeneous Markov chains generated by the simulated annealing algorithm. We classify states according to asymptotic properties of trajectories. We identify recurrent and transient states. The set of recurrent states is partitioned into disjoint classes of asymptotically communicating states. These classes correspond to atoms of the tail sigma-field. The results are valid under the weak reversibility assumption of Hajek.
21

Sung, Minje, Refik Soyer, and Nguyen Nhan. "Bayesian analysis of non-homogeneous Markov chains: Application to mental health data." Statistics in Medicine 26, no. 15 (2007): 3000–3017. http://dx.doi.org/10.1002/sim.2775.

22

Georgiou, Andreas C., Alexandra Papadopoulou, Pavlos Kolias, Haris Palikrousis, and Evanthia Farmakioti. "On State Occupancies, First Passage Times and Duration in Non-Homogeneous Semi-Markov Chains." Mathematics 9, no. 15 (July 24, 2021): 1745. http://dx.doi.org/10.3390/math9151745.

Abstract:
Semi-Markov processes generalize the Markov chains framework by utilizing abstract sojourn time distributions. They are widely known for offering enhanced accuracy in modeling stochastic phenomena. The aim of this paper is to provide closed analytic forms for three types of probabilities which describe attributes of considerable research interest in semi-Markov modeling: (a) the number of transitions to a state through time (Occupancy), (b) the number of transitions or the amount of time required to observe the first passage to a state (First passage time) and (c) the number of transitions or the amount of time required after a state is entered before the first real transition is made to another state (Duration). The non-homogeneous in time recursive relations of the above probabilities are developed and a description of the corresponding geometric transforms is produced. By applying appropriate properties, the closed analytic forms of the above probabilities are provided. Finally, data from human DNA sequences are used to illustrate the theoretical results of the paper.
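The recursive structure of the first-passage probabilities described above can be sketched for a tiny discrete-time semi-Markov chain. The 3-state kernel below is invented, with short deterministic sojourn times so the answer is easy to check by hand; it is not the paper's closed analytic form.

```python
# First-passage time probabilities f_ij(n) for a discrete-time semi-Markov
# chain via the standard recursion
#   f_ij(n) = q_ij(n) + sum_{k != j} sum_{m < n} q_ik(m) * f_kj(n - m),
# where q_ij(m) is the probability of jumping to j exactly m steps after
# entering i. The kernel below is a hypothetical example.

MAX = 6
# Sparse kernel: q[(i, j)][m]. State 0 jumps after 1 step to 1 or 2
# (probability 1/2 each); state 2 jumps to 1 after exactly 2 steps.
q = {(0, 1): {1: 0.5}, (0, 2): {1: 0.5}, (2, 1): {2: 1.0}}

def kernel(i, j, m):
    return q.get((i, j), {}).get(m, 0.0)

def first_passage(i, j, n, states=(0, 1, 2)):
    total = kernel(i, j, n)
    for k in states:
        if k == j:
            continue
        for m in range(1, n):
            qm = kernel(i, k, m)
            if qm:
                total += qm * first_passage(k, j, n - m)
    return total

f = [first_passage(0, 1, n) for n in range(1, MAX + 1)]
```

By hand: the passage 0 → 1 takes 1 step directly (probability 0.5) or 3 steps via state 2 (probability 0.5), which the recursion reproduces.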
23

Esquível, Manuel L., Gracinda R. Guerreiro, Matilde C. Oliveira, and Pedro Corte Real. "Calibration of Transition Intensities for a Multistate Model: Application to Long-Term Care." Risks 9, no. 2 (February 8, 2021): 37. http://dx.doi.org/10.3390/risks9020037.

Abstract:
We consider a non-homogeneous continuous time Markov chain model for Long-Term Care with five states: the autonomous state, three dependent states of light, moderate and severe dependence levels and the death state. For a general approach, we allow for non null intensities for all the returns from higher dependence levels to all lesser dependencies in the multi-state model. Using data from the 2015 Portuguese National Network of Continuous Care database, as the main research contribution of this paper, we propose a method to calibrate transition intensities with the one step transition probabilities estimated from data. This allows us to use non-homogeneous continuous time Markov chains for modeling Long-Term Care. We solve numerically the Kolmogorov forward differential equations in order to obtain continuous time transition probabilities. We assess the quality of the calibration using the Portuguese life expectancies. Based on reasonable monthly costs for each dependence state we compute, by Monte Carlo simulation, trajectories of the Markov chain process and derive relevant information for model validation and premium calculation.
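The numerical step described in this abstract, solving the Kolmogorov forward equations to obtain continuous-time transition probabilities, can be sketched with a toy three-state model (autonomous, dependent, dead). The time-dependent intensities below are invented for illustration, not the calibrated Portuguese values.

```python
# Toy sketch of the Kolmogorov forward equations P'(t) = P(t) Q(t) for a
# multi-state Long-Term Care style model with an absorbing death state.
# All intensities are hypothetical.

def q(t):
    lam = 0.10 + 0.01 * t       # onset of dependence, growing with time
    mu0 = 0.02 + 0.005 * t      # mortality from autonomy
    mu1 = 0.05 + 0.010 * t      # mortality from dependence
    rec = 0.03                  # recovery intensity
    return [[-(lam + mu0), lam, mu0],
            [rec, -(rec + mu1), mu1],
            [0.0, 0.0, 0.0]]    # death is absorbing

def forward(t0, t1, dt=1e-3):
    # Euler-integrate P'(t) = P(t) Q(t) from the identity matrix at t0.
    n = 3
    P = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    t = t0
    for _ in range(int(round((t1 - t0) / dt))):
        Q = q(t)
        P = [[P[i][j] + dt * sum(P[i][k] * Q[k][j] for k in range(n))
              for j in range(n)] for i in range(n)]
        t += dt
    return P

P = forward(0.0, 10.0)   # 10-year transition probability matrix
```

Row sums stay at 1 because each row of Q(t) sums to zero, and the death column of P grows monotonically since death is absorbing.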
24

Johnson, Jean, and Dean Isaacson. "Conditions for strong ergodicity using intensity matrices." Journal of Applied Probability 25, no. 1 (March 1988): 34–42. http://dx.doi.org/10.2307/3214231.

Abstract:
Sufficient conditions for strong ergodicity of discrete-time non-homogeneous Markov chains have been given in several papers. Conditions have been given using the left eigenvectors ψn of Pn (ψnPn = ψn) and also using the limiting behavior of Pn. In this paper we consider the analogous results in the case of continuous-time Markov chains where one uses the intensity matrices Q(t) instead of P(s, t). A bound on the rate of convergence of certain strongly ergodic chains is also given.
25

Johnson, Jean, and Dean Isaacson. "Conditions for strong ergodicity using intensity matrices." Journal of Applied Probability 25, no. 01 (March 1988): 34–42. http://dx.doi.org/10.1017/s0021900200040614.

Abstract:
Sufficient conditions for strong ergodicity of discrete-time non-homogeneous Markov chains have been given in several papers. Conditions have been given using the left eigenvectors ψn of Pn (ψnPn = ψn) and also using the limiting behavior of Pn. In this paper we consider the analogous results in the case of continuous-time Markov chains where one uses the intensity matrices Q(t) instead of P(s, t). A bound on the rate of convergence of certain strongly ergodic chains is also given.
26

Bansaye, Vincent, and Chunmao Huang. "Weak law of large numbers for some Markov chains along non homogeneous genealogies." ESAIM: Probability and Statistics 19 (2015): 307–26. http://dx.doi.org/10.1051/ps/2014027.

27

Yang, Jie, Weiguo Yang, Zhiyan Shi, Yiqing Li, Bei Wang, and Yue Zhang. "Strong law of large numbers for generalized sample relative entropy of non homogeneous Markov chains." Communications in Statistics - Theory and Methods 47, no. 7 (September 27, 2017): 1571–79. http://dx.doi.org/10.1080/03610926.2017.1321770.

28

Kijima, Masaaki. "On passage and conditional passage times for Markov chains in continuous time." Journal of Applied Probability 25, no. 2 (June 1988): 279–90. http://dx.doi.org/10.2307/3214436.

Abstract:
Let X(t) be a temporally homogeneous irreducible Markov chain in continuous time defined on . For k < i < j, let H = {k + 1, ···, j − 1} and let kTij (jTik) be the upward (downward) conditional first-passage time of X(t) from i to j(k) given no visit to . These conditional passage times are studied through first-passage times of a modified chain HX(t) constructed by making the set of states absorbing. It will be shown that the densities of kTij and jTik for any birth-death process are unimodal and the modes kmij (jmik) of the unimodal densities are non-increasing (non-decreasing) with respect to i. Some distribution properties of kTij and jTik for a time-reversible Markov chain are presented. Symmetry among kTij, jTik, and is also discussed, where , and are conditional passage times of the reversed process of X(t).
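The device in this abstract, studying passage times through a modified chain in which a set of states is made absorbing, has a familiar computational counterpart. The sketch below uses a discrete-time birth-death walk (a toy example, not Kijima's continuous-time setting): make the target state absorbing and solve the linear system (I − P′)h = 1 for expected hitting times.

```python
# Expected first-passage times to state N for a discrete-time birth-death
# walk on {0, ..., N}: reflecting at 0, up/down probability 1/2 inside,
# N absorbing. h(i) = 1 + sum_j P[i][j] h(j) for i < N, h(N) = 0.

def solve(A, b):
    # Tiny Gaussian elimination with partial pivoting.
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            fac = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= fac * M[c][k]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][k] * h[k] for k in range(r + 1, n))) / M[r][r]
    return h

N = 3
# Transition matrix restricted to the transient states {0, ..., N-1};
# moves into the absorbing state N simply fall out of the restriction.
P = [[0.0] * N for _ in range(N)]
P[0][1] = 1.0                      # reflecting barrier at 0
for i in range(1, N):
    P[i][i - 1] = 0.5
    if i + 1 < N:
        P[i][i + 1] = 0.5
A = [[(1.0 if i == j else 0.0) - P[i][j] for j in range(N)] for i in range(N)]
h = solve(A, [1.0] * N)
```

For this walk the classical answer is h = (9, 8, 5): from the reflecting barrier the expected time to reach N is N².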
29

Kijima, Masaaki. "On passage and conditional passage times for Markov chains in continuous time." Journal of Applied Probability 25, no. 02 (June 1988): 279–90. http://dx.doi.org/10.1017/s0021900200040924.

Abstract:
Let X(t) be a temporally homogeneous irreducible Markov chain in continuous time defined on . For k < i < j, let H = {k + 1, ···, j − 1} and let kTij (jTik) be the upward (downward) conditional first-passage time of X(t) from i to j(k) given no visit to . These conditional passage times are studied through first-passage times of a modified chain HX(t) constructed by making the set of states absorbing. It will be shown that the densities of kTij and jTik for any birth-death process are unimodal and the modes kmij (jmik) of the unimodal densities are non-increasing (non-decreasing) with respect to i. Some distribution properties of kTij and jTik for a time-reversible Markov chain are presented. Symmetry among kTij, jTik, and is also discussed, where , and are conditional passage times of the reversed process of X(t).
30

Kim, Seong-Woo, and Bong-Kyoo Yoon. "An Analysis on the Identification Rate of Detection System Using Non-Homogeneous Discrete Absorbing Markov Chains." Journal of the Korean Operations Research and Management Science Society 40, no. 2 (May 31, 2015): 31–42. http://dx.doi.org/10.7737/jkorms.2015.40.2.031.

31

He, Qi, and G. Yin. "Moderate deviations for time-varying dynamic systems driven by non-homogeneous Markov chains with Two-time Scales." Stochastics 86, no. 3 (October 25, 2013): 527–50. http://dx.doi.org/10.1080/17442508.2013.841695.

32

Crippa, F., M. Mazzoleni, and M. Zenga. "Departures from the formal of actual students' university careers: an application of non-homogeneous fuzzy Markov chains." Journal of Applied Statistics 43, no. 1 (October 21, 2015): 16–30. http://dx.doi.org/10.1080/02664763.2015.1091446.

33

Golomozyĭ, V. V. "Estimates of stability of transition probabilities for non-homogeneous Markov chains in the case of the uniform minorization." Theory of Probability and Mathematical Statistics 101 (January 5, 2021): 85–101. http://dx.doi.org/10.1090/tpms/1113.

34

Ponomarev, Denis, Elisabeth Rodier, Martial Sauceau, Clémence Nikitine, Vadim Mizonov, and Jacques Fages. "Modelling non-homogeneous flow and residence time distribution in a single-screw extruder by means of Markov chains." Journal of Mathematical Chemistry 50, no. 8 (April 26, 2012): 2141–54. http://dx.doi.org/10.1007/s10910-012-0022-x.

35

D’Amico, Guglielmo, Raimondo Manca, Filippo Petroni, and Dharmaraja Selvamuthu. "On the Computation of Some Interval Reliability Indicators for Semi-Markov Systems." Mathematics 9, no. 5 (March 8, 2021): 575. http://dx.doi.org/10.3390/math9050575.

Abstract:
In this paper, we computed general interval indicators of availability and reliability for systems modelled by time non-homogeneous semi-Markov chains. First, we considered duration-dependent extensions of the Interval Reliability; then we determined an explicit formula for the availability over a given window containing a given point. To compute this window availability, an explicit formula was derived involving duration-dependent transition probabilities and the interval reliability function. Both interval reliability and availability functions were evaluated by considering the local behavior of the system through the recurrence time processes. The results are illustrated through a numerical example, which shows that the considered indicators can describe the duration effects and the age of the multi-state system and can be useful in real-life problems.
36

Rakitskii, Valerii N., Natalya G. Zavolokina, and Irina V. Bereznyak. "A probabilistic model for risk assessment and predicting the health risk of occupational exposure to pesticides in agriculture." Hygiene and sanitation 100, no. 9 (September 20, 2021): 969–74. http://dx.doi.org/10.47470/0016-9900-2021-100-9-969-974.

Abstract:
Introduction. Agricultural machine operators are exposed to a complex of chemical and physical stressors. The occurrence and interaction of these harmful factors are probabilistic, and Markov processes are a convenient model for describing physical processes with random dynamics. Purpose: to develop a probabilistic model, based on the theory of Markov processes, for assessing the risk to agricultural workers during pesticide application, and to use the model to evaluate the probability of occurrence, the degree of severity, and the prognosis of the influence of adverse factors on the operator. Materials and methods. Mechanized pesticide treatment is represented as a system whose states are ranked by the degree of danger to the operator, from non-dangerous to dangerous. Transitions occur under the influence of negative factors and are characterized by transition probabilities pij. Based on the marked graph of the system states, a stochastic matrix P[ij] of one-step transition probabilities was constructed. Formulas are available for calculating the state of the system after k steps for both homogeneous and non-homogeneous Markov chains. Results. Based on the theory of Markov chains, the behaviour of the system is modelled for single-component preparations based on imidacloprid used in rod spraying of field crops. A vector of probabilities of possible hazardous states for the employee was obtained after each hour of spraying over 10 hours. After 6 hours of work, the probability that the operator remains in a non-dangerous state is about 50%, and the probability of transition to a dangerous state is about 24%. The stationary probability distribution shows that the transition to a hazardous state of the system is inevitable once enough steps have been taken. Conclusion. This model can supplement the operator's health risk assessment system and can be used to analyze, compare, and summarize the results of years of research. The calculated probabilities can be used in the development of new hygiene regulations for pesticide use.
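The core computation in this abstract, the state of the system after k steps of a non-homogeneous chain, is the product p_k = p_0 · P_1 · P_2 ⋯ P_k. A sketch with invented transition matrices (the real ones come from the pesticide-exposure data):

```python
# Sketch: distribution after k steps of a non-homogeneous Markov chain,
# p_k = p_0 P_1 ... P_k. States: 0 = non-dangerous, 1 = intermediate,
# 2 = dangerous (made absorbing here). All numbers are hypothetical.

def P(step):
    # Degradation probabilities grow slightly with each hour worked.
    a = 0.05 + 0.01 * step     # non-dangerous -> intermediate
    b = 0.04 + 0.01 * step     # intermediate  -> dangerous
    return [[1.0 - a, a, 0.0],
            [0.0, 1.0 - b, b],
            [0.0, 0.0, 1.0]]   # dangerous state kept absorbing

def distribution_after(p0, k):
    p = list(p0)
    for step in range(1, k + 1):
        Pk = P(step)
        p = [sum(p[i] * Pk[i][j] for i in range(3)) for j in range(3)]
    return p

p10 = distribution_after([1.0, 0.0, 0.0], 10)   # after 10 hours
```

Because the dangerous state is absorbing in this sketch, its probability only grows with the number of hours, mirroring the inevitability noted in the abstract's stationary analysis.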
37

Dang, Hui. "The strong law of large numbers for non homogeneous M-bifurcating Markov chains indexed by a M-branch Cayley tree." Communications in Statistics - Theory and Methods 47, no. 9 (January 2, 2018): 2110–25. http://dx.doi.org/10.1080/03610926.2017.1335417.

38

Carpinteyro, Martha, Francisco Venegas-Martínez, and Alí Aali-Bujari. "Modeling Precious Metal Returns through Fractional Jump-Diffusion Processes Combined with Markov Regime-Switching Stochastic Volatility." Mathematics 9, no. 4 (February 19, 2021): 407. http://dx.doi.org/10.3390/math9040407.

Abstract:
This paper is aimed at developing a stochastic volatility model that is useful to explain the dynamics of the returns of gold, silver, and platinum during the period 1994–2019. To this end, it is assumed that the precious metal returns are driven by fractional Brownian motions, combined with Poisson processes and modulated by continuous-time homogeneous Markov chains. The calibration is carried out by estimating the Jump Generalized Autoregressive Conditional Heteroscedasticity (Jump-GARCH) and Markov regime-switching models of each precious metal, as well as computing their Hurst exponents. The novelty in this research is the use of non-linear, non-normal, multi-factor, time-varying risk stochastic models, useful for an investors’ decision-making process when they intend to include precious metals in their portfolios as safe-haven assets. The main empirical results are as follows: (1) all metals stay in low volatility most of the time and have long memories, which means that past returns have an effect on current and future returns; (2) silver and platinum have the largest jump sizes; (3) silver’s negative jumps have the highest intensity; and (4) silver reacts more than gold and platinum, and it is also the most volatile, having the highest probability of intensive jumps. Gold is the least volatile, as its percentage of jumps is the lowest and the intensity of its jumps is lower than that of the other two metals. Finally, a set of recommendations is provided for the decision-making process of an average investor looking to buy and sell precious metals.
39

程, 成. "A Limit Property for Non-Homogeneous Markov Chain in Markov Environments." Pure Mathematics 06, no. 01 (2016): 81–87. http://dx.doi.org/10.12677/pm.2016.61012.

40

Gerontidis, Ioannis I. "Periodic strong ergodicity in non-homogeneous Markov systems." Journal of Applied Probability 28, no. 1 (March 1991): 58–73. http://dx.doi.org/10.2307/3214740.

Abstract:
This paper presents a unified treatment of the convergence properties of nonhomogeneous Markov systems under different sets of assumptions. First the periodic case is studied and the limiting evolution of the individual cyclically moving subclasses of the state space of the associated Markov replacement chain is completely determined. A special case of the above result is the aperiodic or strongly ergodic convergence. Two numerical examples from the literature on manpower planning highlight the practical aspect of the theoretical results.
41

Gerontidis, Ioannis I. "Periodic strong ergodicity in non-homogeneous Markov systems." Journal of Applied Probability 28, no. 01 (March 1991): 58–73. http://dx.doi.org/10.1017/s0021900200039425.

Abstract:
This paper presents a unified treatment of the convergence properties of nonhomogeneous Markov systems under different sets of assumptions. First the periodic case is studied and the limiting evolution of the individual cyclically moving subclasses of the state space of the associated Markov replacement chain is completely determined. A special case of the above result is the aperiodic or strongly ergodic convergence. Two numerical examples from the literature on manpower planning highlight the practical aspect of the theoretical results.
42

Wang, HanChao, and LiKai Zhou. "Weak convergence of functionals of non-homogeneous Markov chain." SCIENTIA SINICA Mathematica 47, no. 6 (March 3, 2017): 757–64. http://dx.doi.org/10.1360/scm-2016-0479.

43

Long, Yuannan, Rong Tang, Hui Wang, and Changbo Jiang. "Monthly precipitation modeling using Bayesian Non-homogeneous Hidden Markov Chain." Hydrology Research 50, no. 2 (November 14, 2018): 562–76. http://dx.doi.org/10.2166/nh.2018.077.

Abstract:
Monthly precipitation modeling is important in various applications, e.g. streamflow forecasts and water resources management. This paper develops an operational precipitation forecasting scheme using a Bayesian Non-homogeneous Hidden Markov Chain (NHMM) model and a teleconnection index. Although the Hidden Markov Chain model has been investigated before in similar studies, the NHMM algorithm employed in this study allows modeling of both non-stationary transition probabilities and the emission matrix. A climatic teleconnection index that affects precipitation is used to drive changes in the transition probabilities between states of the Markov model. The proposed framework is illustrated through a multiple-station precipitation analysis in NingXiang County, a southern inland area of China with a high population density. A simulation model is constructed to examine the model's capacity to capture the variability and temporal-spatial characteristics exhibited in monthly precipitation data during 1961–2013. Results indicate that the proposed NHMM model captures the precipitation characteristics at different stations well. The Spearman correlation between the conditional mean of the simulated ensembles and the observed data is 0.87–0.9, with few variations across stations. The proposed framework has general applications and can be applied to simulate and generate stochastic monthly precipitation. Further applications of the method are also described in the paper.
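A minimal sketch of the non-homogeneity in an NHMM: the hidden-state transition probabilities are driven by an external covariate (here a made-up seasonal "teleconnection index") through a logistic link, rather than being constant. This is illustrative only; the paper's model is Bayesian and fitted to multi-station data.

```python
import math, random

# Minimal non-homogeneous hidden Markov sketch: a 2-state (dry/wet) hidden
# chain whose transition probabilities depend on a covariate z_t via a
# logistic link; emissions are monthly rainfall amounts. All numbers and
# parameter values are invented.

def p_wet(state, z):
    # Probability of moving to the wet state, given the current state and
    # the covariate: persistence term plus covariate effect.
    base = 0.8 if state == 1 else -0.5
    return 1.0 / (1.0 + math.exp(-(base + 1.2 * z)))

random.seed(1)
T = 120                                                  # ten years of months
z = [math.sin(2 * math.pi * t / 12) for t in range(T)]   # seasonal index
states, rain = [], []
s = 0
for t in range(T):
    s = 1 if random.random() < p_wet(s, z[t]) else 0     # hidden transition
    states.append(s)
    # Emission: wet months rain more on average (hypothetical scales, mm).
    rain.append(random.expovariate(1 / 120.0 if s else 1 / 30.0))
```

Fitting such a model reverses this generative direction: given rainfall and the index, one infers the transition parameters and hidden states.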
44

Pereira, André G. C., and Viviane S. M. Campos. "Multistage non homogeneous Markov chain modeling of the non homogeneous genetic algorithm and convergence results." Communications in Statistics - Theory and Methods 45, no. 6 (November 7, 2015): 1794–804. http://dx.doi.org/10.1080/03610926.2014.997358.

45

Norberg, Ragnar. "The Markov Chain Market." ASTIN Bulletin 33, no. 02 (November 2003): 265–87. http://dx.doi.org/10.2143/ast.33.2.503693.

Abstract:
We consider a financial market driven by a continuous time homogeneous Markov chain. Conditions for absence of arbitrage and for completeness are spelled out, non-arbitrage pricing of derivatives is discussed, and details are worked out for some cases. Closed form expressions are obtained for interest rate derivatives. Computations typically amount to solving a set of first order partial differential equations. An excursion into risk minimization in the incomplete case illustrates the matrix techniques that are instrumental in the model.
46

Norberg, Ragnar. "The Markov Chain Market." ASTIN Bulletin 33, no. 2 (November 2003): 265–87. http://dx.doi.org/10.1017/s0515036100013465.

Abstract:
We consider a financial market driven by a continuous time homogeneous Markov chain. Conditions for absence of arbitrage and for completeness are spelled out, non-arbitrage pricing of derivatives is discussed, and details are worked out for some cases. Closed form expressions are obtained for interest rate derivatives. Computations typically amount to solving a set of first order partial differential equations. An excursion into risk minimization in the incomplete case illustrates the matrix techniques that are instrumental in the model.
47

Vassiliou, P. C. G. "Non-Homogeneous Semi-Markov and Markov Renewal Processes and Change of Measure in Credit Risk." Mathematics 9, no. 1 (December 29, 2020): 55. http://dx.doi.org/10.3390/math9010055.

Abstract:
For a G-inhomogeneous semi-Markov chain and G-inhomogeneous Markov renewal processes, we study the change from the real probability measure into a forward probability measure. We find the values of risky bonds using the forward probabilities that the bond will not default up to maturity time for both processes. It is established in the form of a theorem that the forward probability measure does not alter the semi-Markov structure. In addition, a G-inhomogeneous Markov renewal process is founded, and a theorem is provided proving that the Markov renewal process is maintained under the forward probability measure. We show that an inhomogeneous semi-Markov chain is characterized by martingales, and that the same is true for Markov renewal processes. We discuss in depth the calibration of the G-inhomogeneous semi-Markov chain model and propose an algorithm for it. We conclude with an application to risky bonds.
48

D'Amico, Guglielmo. "The crossing barrier of a non-homogeneous semi-Markov chain." Stochastics 81, no. 6 (November 30, 2009): 589–600. http://dx.doi.org/10.1080/17442500903278892.

49

Guilbault, Jean-Luc, and Mario Lefebvre. "On a non-homogeneous difference equation from probability theory." Tatra Mountains Mathematical Publications 43, no. 1 (December 1, 2009): 81–90. http://dx.doi.org/10.2478/v10127-009-0027-4.

Abstract:
The so-called gambler's ruin problem in probability theory is considered for a Markov chain having transition probabilities depending on the current state. This problem leads to a non-homogeneous difference equation with non-constant coefficients for the expected duration of the game. This mathematical expectation is computed explicitly.
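The difference equation in this abstract can be sketched numerically. With a state-dependent win probability p_k, the expected duration D satisfies D(k) = 1 + p_k D(k+1) + (1 − p_k) D(k−1) with D(0) = D(N) = 0; the resulting tridiagonal system is solved below with the Thomas algorithm. Any specific p_k used would be a made-up example; in the symmetric case p_k = 1/2 the classical closed form D(k) = k(N − k) is recovered, which serves as a check.

```python
# Expected duration D(k) of the gambler's ruin game with state-dependent
# win probability p(k):
#   D(k) = 1 + p(k) D(k+1) + (1 - p(k)) D(k-1),  D(0) = D(N) = 0.
# Solved via the Thomas algorithm for the tridiagonal system
#   -(1 - p_k) D(k-1) + D(k) - p_k D(k+1) = 1.

def expected_duration(N, p):
    n = N - 1                                 # unknowns D(1), ..., D(N-1)
    a = [-(1 - p(k)) for k in range(1, N)]    # sub-diagonal (a[0] unused)
    b = [1.0] * n                             # main diagonal
    c = [-p(k) for k in range(1, N)]          # super-diagonal
    d = [1.0] * n                             # right-hand side
    # Forward sweep.
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # Back substitution.
    D = [0.0] * n
    D[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        D[i] = (d[i] - c[i] * D[i + 1]) / b[i]
    return [0.0] + D + [0.0]                  # include boundary states

D_sym = expected_duration(10, lambda k: 0.5)  # symmetric check case
```

The same routine accepts any state-dependent p(k), which is exactly the non-homogeneous setting of the paper.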
50

Giroux, Gaston. "Asymptotic results for non-linear processes of the McKean tagged-molecule type." Journal of Applied Probability 23, no. 1 (March 1986): 42–51. http://dx.doi.org/10.2307/3214115.

Abstract:
McKean's tagged-molecule process is a non-linear homogeneous two-state Markov chain in continuous time, constructed with the aid of a binary branching process. For each of a large class of branching processes we construct a similar process. The construction is done carefully and the weak homogeneity is deduced. A simple probability argument permits us to show convergence to the equidistribution (½, ½) and to note that this limit is a strong equilibrium. A non-homogeneous Markov chain result is also used to establish the geometric rate of convergence. A Boltzmann H-theorem is also proved.