Journal articles on the topic "Modelling, Markov chain"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other styles

Consult the top 50 journal articles for your research on the topic "Modelling, Markov chain".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will generate the bibliographic reference to the chosen work automatically, in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a .pdf file and read its abstract online, whenever these details are available in the metadata.

Browse journal articles on a wide variety of disciplines and compile an appropriate bibliography.

1

Boucher, Thomas R., and Daren B. H. Cline. "Piggybacking Threshold Processes with a Finite State Markov Chain." Stochastics and Dynamics 9, no. 2 (June 2009): 187–204. http://dx.doi.org/10.1142/s0219493709002622.

Abstract:
The state-space representations of certain nonlinear autoregressive time series are general state Markov chains. The transitions of a general state Markov chain among regions in its state-space can be modeled with the transitions among states of a finite state Markov chain. Stability of the time series is then informed by the stationary distributions of the finite state Markov chain. This approach generalizes some previous results.
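
The stability argument summarised in entry 1 rests on the stationary distribution of a finite state Markov chain that tracks transitions among regions of the state space. As a minimal generic sketch (not code from the cited paper), the stationary distribution of an illustrative three-state transition matrix can be computed by solving πP = π together with the normalisation Σπ_i = 1:

import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); the states are
# purely illustrative and do not correspond to regions from the cited paper.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

# Solve pi P = pi subject to sum(pi) = 1 by stacking the balance equations
# (P^T - I) pi = 0 with the normalisation row.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("stationary distribution:", np.round(pi, 4))

The resulting vector shows how much long-run weight each region carries, which is the quantity such a stability analysis draws on.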
2

Gerontidis, Ioannis I. "Semi-Markov Replacement Chains." Advances in Applied Probability 26, no. 03 (September 1994): 728–55. http://dx.doi.org/10.1017/s0001867800026525.

Abstract:
We consider an absorbing semi-Markov chain for which each time absorption occurs there is a resetting of the chain according to some initial (replacement) distribution. The new process is a semi-Markov replacement chain and we study its properties in terms of those of the imbedded Markov replacement chain. A time-dependent version of the model is also defined and analysed asymptotically for two types of environmental behaviour, i.e. either convergent or cyclic. The results contribute to the control theory of semi-Markov chains and extend in a natural manner a wide variety of applied probabilit
3

Gerontidis, Ioannis I. "Semi-Markov Replacement Chains." Advances in Applied Probability 26, no. 3 (September 1994): 728–55. http://dx.doi.org/10.2307/1427818.

Abstract:
We consider an absorbing semi-Markov chain for which each time absorption occurs there is a resetting of the chain according to some initial (replacement) distribution. The new process is a semi-Markov replacement chain and we study its properties in terms of those of the imbedded Markov replacement chain. A time-dependent version of the model is also defined and analysed asymptotically for two types of environmental behaviour, i.e. either convergent or cyclic. The results contribute to the control theory of semi-Markov chains and extend in a natural manner a wide variety of applied probabilit
4

Faddy, M. J., and S. I. McClean. "Markov Chain Modelling for Geriatric Patient Care." Methods of Information in Medicine 44, no. 03 (2005): 369–73. http://dx.doi.org/10.1055/s-0038-1633979.

Abstract:
Summary Objectives: To show that Markov chain modelling can be applied to data on geriatric patients and use these models to assess the effects of covariates. Methods: Phase-type distributions were fitted by maximum likelihood to data on times spent by the patients in hospital and in community-based care. Data on the different events that ended the patients’ periods of care were used to estimate the dependence of the probabilities of these events on the phase from which the time in care ended. The age of the patients at admission to care and the year of admission were also included as covariat
5

Singhal, Ekta, and Kunal Mehta. "Marketing Channel Attribution Modelling: Markov Chain Analysis." International Journal of Indian Culture and Business Management 1, no. 1 (2020): 1. http://dx.doi.org/10.1504/ijicbm.2020.10027991.

6

Mehta, Kunal, and Ekta Singhal. "Marketing channel attribution modelling: Markov chain analysis." International Journal of Indian Culture and Business Management 21, no. 1 (2020): 63. http://dx.doi.org/10.1504/ijicbm.2020.109344.

7

Hadjinicola, George, and Larry Goldstein. "Markov chain modelling of bioassay toxicity procedures." Statistics in Medicine 12, no. 7 (April 15, 1993): 661–74. http://dx.doi.org/10.1002/sim.4780120705.

8

Catak, Muammer, Nurşin Baş, Kevin Cronin, Dario Tellez-Medina, Edmond P. Byrne, and John J. Fitzpatrick. "Markov chain modelling of fluidised bed granulation." Chemical Engineering Journal 164, no. 2-3 (November 1, 2010): 403–9. http://dx.doi.org/10.1016/j.cej.2010.02.022.

9

Huang, Vincent, and James Unwin. "Markov chain models of refugee migration data." IMA Journal of Applied Mathematics 85, no. 6 (September 29, 2020): 892–912. http://dx.doi.org/10.1093/imamat/hxaa032.

Abstract:
Abstract The application of Markov chains to modelling refugee crises is explored, focusing on local migration of individuals at the level of cities and days. As an explicit example, we apply the Markov chains migration model developed here to United Nations High Commissioner for Refugees data on the Burundi refugee crisis. We compare our method to a state-of-the-art ‘agent-based’ model of Burundi refugee movements, and highlight that Markov chain approaches presented here can improve the match to data while simultaneously being more algorithmically efficient.
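
Entry 9 treats day-to-day refugee movement between locations as a Markov chain. The sketch below only illustrates that generative idea; the location names, transition probabilities, and head counts are invented for the example and do not come from the paper or from UNHCR data.

import numpy as np

# Illustrative daily transition probabilities between three hypothetical locations.
cities = ["camp_A", "camp_B", "city_C"]
P = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.85, 0.10],
              [0.01, 0.04, 0.95]])

population = np.array([10_000.0, 5_000.0, 1_000.0])  # initial head counts

for day in range(30):                 # propagate expected counts for 30 days
    population = population @ P       # row vector times transition matrix

for name, count in zip(cities, population):
    print(f"{name}: {count:,.0f} expected after 30 days")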
10

Balzter, Heiko. "Markov chain models for vegetation dynamics." Ecological Modelling 126, no. 2-3 (February 2000): 139–54. http://dx.doi.org/10.1016/s0304-3800(00)00262-3.

11

Jin, Yongliang, and Amlan Mukherjee. "Markov chain applications in modelling facility condition deterioration." International Journal of Critical Infrastructures 10, no. 2 (2014): 93. http://dx.doi.org/10.1504/ijcis.2014.062965.

12

Liu, Peidong, and Yan Zheng. "Markov Chain Perturbations of a Class of Partially Expanding Attractors." Stochastics and Dynamics 6, no. 3 (September 2006): 341–54. http://dx.doi.org/10.1142/s0219493706001761.

Abstract:
In this paper Markov chain perturbations of a class of partially expanding attractors of a diffeomorphism are considered. We show that, under some regularity conditions on the transition probabilities, the zero-noise limits of stationary measures of the Markov chains are Sinai–Ruelle–Bowen measures of the diffeomorphism on the attractors.
13

Kuo, Lynn. "Markov Chain Monte Carlo." Technometrics 42, no. 2 (May 2000): 216. http://dx.doi.org/10.1080/00401706.2000.10486017.

14

Ge, Yuan, Yan Zhang, Wengen Gao, Fanyong Cheng, Nuo Yu, and Jincenzi Wu. "Modelling and Prediction of Random Delays in NCSs Using Double-Chain HMMs." Discrete Dynamics in Nature and Society 2020 (October 29, 2020): 1–16. http://dx.doi.org/10.1155/2020/6848420.

Abstract:
This paper is concerned with the modelling and prediction of random delays in networked control systems. The stochastic distribution of the random delay in the current sampling period is assumed to be affected by the network state in the current sampling period as well as the random delay in the previous sampling period. Based on this assumption, the double-chain hidden Markov model (DCHMM) is proposed in this paper to model the delays. There are two Markov chains in this model. One is the hidden Markov chain which consists of the network states and the other is the observable Markov chain whi
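
Entry 14 assumes that the network delay in the current sampling period depends both on the current (hidden) network state and on the previous delay. The snippet below simulates that double-chain structure with made-up two-state matrices; it is only a sketch of the generative assumption, not the DCHMM estimation or prediction procedure from the paper.

import numpy as np

rng = np.random.default_rng(4)

# Hidden chain: network state (0 = idle, 1 = congested); values are illustrative.
A = np.array([[0.9, 0.1],
              [0.3, 0.7]])
# Observable chain: delay class (0 = short, 1 = long); its transition
# probabilities depend on the current hidden network state.
B = {0: np.array([[0.8, 0.2],
                  [0.6, 0.4]]),
     1: np.array([[0.4, 0.6],
                  [0.2, 0.8]])}

state, delay = 0, 0
delays = []
for k in range(50):
    state = rng.choice(2, p=A[state])          # hidden network state evolves
    delay = rng.choice(2, p=B[state][delay])   # delay depends on state and previous delay
    delays.append(delay)

print("fraction of long delays:", np.mean(delays))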
15

Melnik, Roderick V. Nicholas. "Dynamic system evolution and markov chain approximation." Discrete Dynamics in Nature and Society 2, no. 1 (1998): 7–39. http://dx.doi.org/10.1155/s1026022698000028.

Abstract:
In this paper computational aspects of the mathematical modelling of dynamic system evolution have been considered as a problem in information theory. The construction of mathematical models is treated as a decision making process with limited available information. The solution of the problem is associated with a computational model based on heuristics of a Markov Chain in a discrete space–time of events. A stable approximation of the chain has been derived and the limiting cases are discussed. An intrinsic interconnection of constructive, sequential, and evolutionary approaches in related opt
16

Brumnik, Robert, Podbregar Iztok, and Ferjancic-Podbregar Mojca. "Markov Chains Modelling for Biometric System Reliability Estimations in Supply Chain Management." Sensor Letters 11, no. 2 (February 1, 2013): 377–83. http://dx.doi.org/10.1166/sl.2013.2133.

17

Sahin, Ahmet D., and Zekai Sen. "First-order Markov chain approach to wind speed modelling." Journal of Wind Engineering and Industrial Aerodynamics 89, no. 3-4 (March 2001): 263–69. http://dx.doi.org/10.1016/s0167-6105(00)00081-7.

18

Ip, W. H., Bocheng Chen, Henry C. W. Lau, K. L. Choy, and S. L. Chan. "Modelling a CRM Markov chain process using system dynamics." International Journal of Value Chain Management 2, no. 4 (2008): 420. http://dx.doi.org/10.1504/ijvcm.2008.019849.

19

Mardia, K. V., V. B. Nyirongo, A. N. Walder, C. Xu, P. A. Dowd, R. J. Fowell, and J. T. Kent. "Markov Chain Monte Carlo Implementation of Rock Fracture Modelling." Mathematical Geology 39, no. 4 (August 9, 2007): 355–81. http://dx.doi.org/10.1007/s11004-007-9099-3.

20

Caleyo, F., J. C. Velázquez, A. Valor, and J. M. Hallen. "Markov chain modelling of pitting corrosion in underground pipelines." Corrosion Science 51, no. 9 (September 2009): 2197–207. http://dx.doi.org/10.1016/j.corsci.2009.06.014.

21

Pereira, A. G. C., F. A. S. Sousa, B. B. Andrade, and Viviane Simioli Medeiros Campos. "Higher order Markov Chain Model for Synthetic Generation of Daily Streamflows." TEMA (São Carlos) 19, no. 3 (December 17, 2018): 449. http://dx.doi.org/10.5540/tema.2018.019.03.449.

Abstract:
The aim of this study is to go further into the two-state Markov chain model for the synthetic generation of daily streamflows. The model proposed in Aksoy and Bayazit (2000) and Aksoy (2003) is based on two Markov chains for determining the state of the stream. The ascension curve of the hydrograph is modelled by a two-parameter Gamma probability distribution function, and the recession curve of the hydrograph is assumed to follow an exponential function. In this work, instead of assuming a pre-defined order for the Markov chains involved in the modelling of streamflows, a BIC test is performed
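
Entry 21 describes a two-state Markov chain driving daily streamflow, with Gamma-distributed rises and exponential-type recessions. The following toy generator loosely mirrors that structure; every parameter value is an illustrative assumption rather than a fitted value from Aksoy's models or the cited study.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-state chain: 0 = recession (falling), 1 = ascension (rising).
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

flow, state = 50.0, 0
series = []
for day in range(365):
    state = rng.choice(2, p=P[state])
    if state == 1:
        flow += rng.gamma(shape=2.0, scale=5.0)     # Gamma-distributed rise
    else:
        flow *= np.exp(-rng.exponential(0.05))      # exponential-type decay
    series.append(flow)

print(f"mean simulated flow: {np.mean(series):.1f}")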
22

Quine, M. P., and J. S. Law. "Modelling random linear nucleation and growth by a Markov chain." Journal of Applied Probability 36, no. 01 (March 1999): 273–78. http://dx.doi.org/10.1017/s0021900200017034.

Abstract:
In an attempt to investigate the adequacy of the normal approximation for the number of nuclei in certain growth/coverage models, we consider a Markov chain which has properties in common with related continuous-time Markov processes (as well as being of interest in its own right). We establish that the rate of convergence to normality for the number of ‘drops’ during times 1,2,…n is of the optimal ‘Berry–Esséen’ form, as n → ∞. We also establish a law of the iterated logarithm and a functional central limit theorem.
23

Quine, M. P., and J. S. Law. "Modelling random linear nucleation and growth by a Markov chain." Journal of Applied Probability 36, no. 1 (March 1999): 273–78. http://dx.doi.org/10.1239/jap/1032374248.

Abstract:
In an attempt to investigate the adequacy of the normal approximation for the number of nuclei in certain growth/coverage models, we consider a Markov chain which has properties in common with related continuous-time Markov processes (as well as being of interest in its own right). We establish that the rate of convergence to normality for the number of ‘drops’ during times 1,2,…n is of the optimal ‘Berry–Esséen’ form, as n → ∞. We also establish a law of the iterated logarithm and a functional central limit theorem.
24

Ball, Frank, and Geoffrey F. Yeo. "Lumpability and marginalisability for continuous-time Markov chains." Journal of Applied Probability 30, no. 3 (September 1993): 518–28. http://dx.doi.org/10.2307/3214762.

Abstract:
We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), …, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, …, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuou
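
Entries 24 and 27 concern strong lumpability of continuous-time Markov chains. A common textbook criterion (not necessarily the exact formulation proved by Ball and Yeo) is that, within each block of the candidate partition, every state must have the same aggregated transition rate into every block. A small check of that criterion on a hypothetical generator matrix:

import numpy as np

# Generator of a hypothetical 4-state CTMC (off-diagonal rates, rows sum to 0).
Q = np.array([[-3.0,  1.0,  1.0,  1.0],
              [ 2.0, -4.0,  1.0,  1.0],
              [ 0.5,  0.5, -2.0,  1.0],
              [ 0.5,  0.5,  1.0, -2.0]])

partition = [[0, 1], [2, 3]]   # candidate lumping of states into two blocks

def strongly_lumpable(Q, partition):
    # Within each block, every state must have the same aggregated rate
    # into every block of the partition.
    for block in partition:
        for target in partition:
            rates = [Q[i, target].sum() for i in block]
            if not np.allclose(rates, rates[0]):
                return False
    return True

print(strongly_lumpable(Q, partition))   # True for this example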
25

Acquah, Henry De-Graft. "Bayesian Logistic Regression Modelling via Markov Chain Monte Carlo Algorithm." Journal of Social and Development Sciences 4, no. 4 (April 30, 2013): 193–97. http://dx.doi.org/10.22610/jsds.v4i4.751.

Abstract:
This paper introduces Bayesian analysis and demonstrates its application to parameter estimation of the logistic regression via Markov Chain Monte Carlo (MCMC) algorithm. The Bayesian logistic regression estimation is compared with the classical logistic regression. Both the classical logistic regression and the Bayesian logistic regression suggest that higher per capita income is associated with free trade of countries. The results also show a reduction of standard errors associated with the coefficients obtained from the Bayesian analysis, thus bringing greater stability to the coefficients.
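
Entry 25 fits a Bayesian logistic regression by MCMC. A compact random-walk Metropolis sketch for a one-predictor model is shown below; the synthetic data, the normal prior, and the proposal step size are illustrative assumptions, not details taken from the paper.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: binary outcome depending on one predictor.
n = 200
x = rng.normal(size=n)
true_beta = np.array([-0.5, 1.2])                 # intercept, slope
p = 1.0 / (1.0 + np.exp(-(true_beta[0] + true_beta[1] * x)))
y = rng.binomial(1, p)

def log_post(beta):
    # Logistic log-likelihood plus an N(0, 10^2) prior on each coefficient.
    eta = beta[0] + beta[1] * x
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    logprior = -0.5 * np.sum(beta ** 2) / 100.0
    return loglik + logprior

beta = np.zeros(2)
samples = []
for it in range(5000):                            # random-walk Metropolis
    prop = beta + rng.normal(scale=0.1, size=2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    samples.append(beta.copy())

print("posterior means:", np.mean(samples[1000:], axis=0))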
26

Pelkowitz, L. "The general markov chain disorder problem." Stochastics 21, no. 2 (June 1987): 113–30. http://dx.doi.org/10.1080/17442508708833454.

27

Ball, Frank, and Geoffrey F. Yeo. "Lumpability and marginalisability for continuous-time Markov chains." Journal of Applied Probability 30, no. 03 (September 1993): 518–28. http://dx.doi.org/10.1017/s0021900200044272.

Abstract:
We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), …, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, …, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are con
28

Čech, Martin, and Radim Lenort. "MODELLING OF FINANCIAL RESOURCE ALLOCATION FOR INCREASING THE SUPPLY CHAIN RESILIENCE USING MARKOV CHAINS." Acta logistica 8, no. 2 (June 30, 2021): 141–51. http://dx.doi.org/10.22306/al.v8i2.213.

Abstract:
The concept of supply chain resilience has arisen in response to changing conditions in the global market environment. Although supply chain resilience building is gaining increasing interest among the professional public and business practice, supporting decision-making in supply chain resilience building is still in its infancy. This article aims to present a mathematical model of the supply chain based on Markov chains to assess the impact of funds allocated to strengthening the supply chain’s resilience to its overall performance and thus support decision-making in the field. Mathematical
29

Ball, Frank. "Central limit theorems for multivariate semi-Markov sequences and processes, with applications." Journal of Applied Probability 36, no. 02 (June 1999): 415–32. http://dx.doi.org/10.1017/s0021900200017228.

Abstract:
In this paper, central limit theorems for multivariate semi-Markov sequences and processes are obtained, both as the number of jumps of the associated Markov chain tends to infinity and, if appropriate, as the time for which the process has been running tends to infinity. The theorems are widely applicable since many functions defined on Markov or semi-Markov processes can be analysed by exploiting appropriate embedded multivariate semi-Markov sequences. An application to a problem in ion channel modelling is described in detail. Other applications, including to multivariate stationary reward
30

Ball, Frank. "Central limit theorems for multivariate semi-Markov sequences and processes, with applications." Journal of Applied Probability 36, no. 2 (June 1999): 415–32. http://dx.doi.org/10.1239/jap/1032374462.

Abstract:
In this paper, central limit theorems for multivariate semi-Markov sequences and processes are obtained, both as the number of jumps of the associated Markov chain tends to infinity and, if appropriate, as the time for which the process has been running tends to infinity. The theorems are widely applicable since many functions defined on Markov or semi-Markov processes can be analysed by exploiting appropriate embedded multivariate semi-Markov sequences. An application to a problem in ion channel modelling is described in detail. Other applications, including to multivariate stationary reward
31

Ball, Frank G., Robin K. Milne, and Geoffrey F. Yeo. "Marked Continuous-Time Markov Chain Modelling of Burst Behaviour for Single Ion Channels." Journal of Applied Mathematics and Decision Sciences 2007 (October 29, 2007): 1–14. http://dx.doi.org/10.1155/2007/48138.

Abstract:
Patch clamp recordings from ion channels often show bursting behaviour, that is, periods of repetitive activity, which are noticeably separated from each other by periods of inactivity. A number of authors have obtained results for important properties of theoretical and empirical bursts when channel gating is modelled by a continuous-time Markov chain with a finite-state space. We show how the use of marked continuous-time Markov chains can simplify the derivation of (i) the distributions of several burst properties, including the total open time, the total charge transfer, and the number of
32

Nkemnole, Edesiri Bridget, and Ekene Nwaokoro. "Modelling Customer Relationships as Hidden Markov Chains." Path of Science 6, no. 11 (November 30, 2020): 5011–19. http://dx.doi.org/10.22178/pos.64-9.

Abstract:
Models in behavioural relationship marketing suggest that relations between the customer and the company change over time as a result of the continuous encounter. Some theoretical models have been put forward concerning relationship marketing, both from the standpoints of consumer behaviour and empirical modelling. In addition to these, this study proposes the hidden Markov model (HMM) as a potential tool for assessing customer relationships. Specifically, the HMM is submitted via the framework of a Markov chain model to classify customers relationship dynamics of a telecommunication service c
33

Kwasniok, Frank. "Data-based stochastic subgrid-scale parametrization: an approach using cluster-weighted modelling." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 370, no. 1962 (March 13, 2012): 1061–86. http://dx.doi.org/10.1098/rsta.2011.0384.

Abstract:
A new approach for data-based stochastic parametrization of unresolved scales and processes in numerical weather and climate prediction models is introduced. The subgrid-scale model is conditional on the state of the resolved scales, consisting of a collection of local models. A clustering algorithm in the space of the resolved variables is combined with statistical modelling of the impact of the unresolved variables. The clusters and the parameters of the associated subgrid models are estimated simultaneously from data. The method is implemented and explored in the framework of the Lorenz '96
34

Lewy, P., and A. Nielsen. "Modelling stochastic fish stock dynamics using Markov Chain Monte Carlo." ICES Journal of Marine Science 60, no. 4 (January 1, 2003): 743–52. http://dx.doi.org/10.1016/s1054-3139(03)00080-8.

Abstract:
Abstract A new age-structured stock dynamics approach including stochastic survival and recruitment processes is developed and implemented. The model is able to analyse detailed sources of information used in standard age-based fish stock assessment such as catch-at-age and effort data from commercial fleets and research surveys. The stock numbers are treated as unobserved variables subject to process errors while the catches are observed variables subject to both sampling and process errors. Results obtained for North Sea plaice using Markov Chain Monte Carlo methods indicate that the process
35

Voskoglou, Michael G. "An application of Markov chain to the process of modelling." International Journal of Mathematical Education in Science and Technology 25, no. 4 (July 1994): 475–80. http://dx.doi.org/10.1080/0020739940250401.

36

Stowasser, Markus. "Modelling rain risk: a multi‐order Markov chain model approach." Journal of Risk Finance 13, no. 1 (December 30, 2011): 45–60. http://dx.doi.org/10.1108/15265941211191930.

37

Anderson, E. J. "Markov chain modelling of the solution surface in local search." Journal of the Operational Research Society 53, no. 6 (June 2002): 630–36. http://dx.doi.org/10.1057/palgrave/jors/2601342.

38

Jasra, Ajay, David A. Stephens, Kerry Gallagher, and Christopher C. Holmes. "Bayesian Mixture Modelling in Geochronology via Markov Chain Monte Carlo." Mathematical Geology 38, no. 3 (April 2006): 269–300. http://dx.doi.org/10.1007/s11004-005-9019-3.

39

Abundo, M., and L. Caramellino. "Some remarks on a Markov chain modelling cooperative biological systems." Open Systems & Information Dynamics 3, no. 3 (October 1995): 325–43. http://dx.doi.org/10.1007/bf02228996.

40

Khan, Samiullah, and Mohammad Abdul Qadir. "Deterministic Time Markov Chain Modelling of Simultaneous Multipath Transmission Schemes." IEEE Access 5 (2017): 8536–44. http://dx.doi.org/10.1109/access.2017.2701769.

41

Kennedy, Rodney A., and Shin-Ho Chung. "Modelling and identification of coupled markov chain model with application." International Journal of Adaptive Control and Signal Processing 10, no. 6 (November 1996): 623–34. http://dx.doi.org/10.1002/(sici)1099-1115(199611)10:6<623::aid-acs402>3.0.co;2-#.

42

Kalashnikov, Vladimir V. "Regeneration and general Markov chains." Journal of Applied Mathematics and Stochastic Analysis 7, no. 3 (January 1, 1994): 357–71. http://dx.doi.org/10.1155/s1048953394000304.

Abstract:
Ergodicity, continuity, finite approximations and rare visits of general Markov chains are investigated. The obtained results permit further quantitative analysis of characteristics, such as, rates of convergence, continuity (measured as a distance between perturbed and non-perturbed characteristics), deviations between Markov chains, accuracy of approximations and bounds on the distribution function of the first visit time to a chosen subset, etc. The underlying techniques use the embedding of the general Markov chain into a wide sense regenerative process with the help of splitting construct
43

Gasbarra, Dario, José Igor Morlanes, and Esko Valkeila. "Initial Enlargement in a Markov Chain Market Model." Stochastics and Dynamics 11, no. 2-3 (September 2011): 389–413. http://dx.doi.org/10.1142/s021949371100336x.

Abstract:
Enlargement of filtrations is a classical topic in the general theory of stochastic processes. This theory has been applied to stochastic finance in order to analyze models with insider information. In this paper we study initial enlargement in a Markov chain market model, introduced by Norberg. In the enlarged filtration, several things can happen: some of the jump times can be accessible or predictable, but in the original filtration all the jump times are totally inaccessible. But even if the jump times change to accessible or predictable, the insider does not necessarily have arbitrage
44

Fawcett, Lee, and David Walshaw. "Markov chain models for extreme wind speeds." Environmetrics 17, no. 8 (2006): 795–809. http://dx.doi.org/10.1002/env.794.

45

Gerencsér, Balázs. "Markov chain mixing time on cycles." Stochastic Processes and their Applications 121, no. 11 (November 2011): 2553–70. http://dx.doi.org/10.1016/j.spa.2011.07.007.

46

Imkeller, Peter, and Peter Kloeden. "On the Computation of Invariant Measures in Random Dynamical Systems." Stochastics and Dynamics 03, no. 02 (June 2003): 247–65. http://dx.doi.org/10.1142/s0219493703000711.

Abstract:
Invariant measures of dynamical systems generated e.g. by difference equations can be computed by discretizing the originally continuum state space, and replacing the action of the generator by the transition mechanism of a Markov chain. In fact they are approximated by stationary vectors of these Markov chains. Here we extend this well-known approximation result and the underlying algorithm to the setting of random dynamical systems, i.e. dynamical systems on the skew product of a probability space carrying the underlying stationary stochasticity and the state space, a particular non-autonomo
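
Entry 46 builds on the standard approximation in which the state space is discretised and the invariant measure is approximated by the stationary vector of a Markov chain on the grid cells. The sketch below applies that idea to a noisy logistic map on [0, 1]; the map, the noise level, and the grid size are illustrative choices, not taken from the paper.

import numpy as np

rng = np.random.default_rng(5)

# Ulam-type discretisation: estimate cell-to-cell transition probabilities of
# x -> 4x(1-x) + small noise by sampling, then take the stationary vector.
n_cells = 100
edges = np.linspace(0.0, 1.0, n_cells + 1)
P = np.zeros((n_cells, n_cells))

samples_per_cell = 500
for i in range(n_cells):
    x = rng.uniform(edges[i], edges[i + 1], samples_per_cell)
    y = np.clip(4.0 * x * (1.0 - x) + rng.normal(0.0, 0.01, samples_per_cell),
                0.0, 1.0 - 1e-12)
    idx = np.digitize(y, edges) - 1
    for j in idx:
        P[i, j] += 1.0 / samples_per_cell

# Stationary vector via power iteration approximates the invariant measure.
pi = np.full(n_cells, 1.0 / n_cells)
for _ in range(2000):
    pi = pi @ P
pi /= pi.sum()
print("cell with largest invariant mass:", np.argmax(pi))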
47

Keilson, J., and O. A. Vasicek. "Monotone measures of ergodicity for Markov chains." Journal of Applied Mathematics and Stochastic Analysis 11, no. 3 (January 1, 1998): 283–88. http://dx.doi.org/10.1155/s1048953398000239.

Abstract:
The following paper, first written in 1974, was never published other than as part of an internal research series. Its lack of publication is unrelated to the merits of the paper and the paper is of current importance by virtue of its relation to the relaxation time. A systematic discussion is provided of the approach of a finite Markov chain to ergodicity by proving the monotonicity of an important set of norms, each a measure of ergodicity, whether or not time reversibility is present. The paper is of particular interest because the discussion of the relaxation time of a finite Markov chain [2
48

Abdelkader, Eslam Mohammed, Tarek Zayed, and Mohamed Marzouk. "Modelling the Deterioration of Bridge Decks Based on Semi-Markov Decision Process." International Journal of Strategic Decision Sciences 10, no. 1 (January 2019): 23–45. http://dx.doi.org/10.4018/ijsds.2019010103.

Abstract:
Deterioration models represent a very important pillar for the effective use of bridge management systems (BMS's). This article presents a probabilistic time-based model that predicts the condition ratings of the concrete bridge decks along their service life. The deterioration process of the concrete bridge decks is modeled using a semi-Markov decision process. The sojourn time of each condition state is fitted to a certain probability distribution based on some goodness of fit tests. The parameters of the probability density functions are obtained using maximum likelihood estimation. The cum
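
Entry 48 models bridge-deck deterioration as a semi-Markov process whose sojourn time in each condition state follows a fitted distribution. The following simulation of a single deterioration trajectory uses Weibull sojourn times with invented parameters; the five condition states and all numbers are placeholders rather than the fitted values from the article.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical condition states 5 (best) down to 1 (worst); deterioration only.
states = [5, 4, 3, 2, 1]
# Illustrative Weibull sojourn-time parameters (shape, scale in years) per state.
sojourn = {5: (1.5, 8.0), 4: (1.5, 7.0), 3: (1.3, 6.0), 2: (1.2, 5.0)}

def simulate_deck(horizon=60.0):
    t, history = 0.0, []
    for s in states[:-1]:                      # state 1 is absorbing
        stay = sojourn[s][1] * rng.weibull(sojourn[s][0])
        history.append((s, t, min(t + stay, horizon)))
        t += stay
        if t >= horizon:
            return history
    history.append((1, t, horizon))
    return history

for state, start, end in simulate_deck():
    print(f"condition {state}: year {start:5.1f} to {end:5.1f}")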
49

Faghih-Roohi, Shahrzad, Min Xie, and Kien Ming Ng. "Accident risk assessment in marine transportation via Markov modelling and Markov Chain Monte Carlo simulation." Ocean Engineering 91 (November 2014): 363–70. http://dx.doi.org/10.1016/j.oceaneng.2014.09.029.

50

Todorovic, P. "Remarks on a monotone Markov chain." Journal of Applied Mathematics and Simulation 1, no. 2 (January 1, 1987): 137–54. http://dx.doi.org/10.1155/s1048953388000103.

Abstract:
In applications, considerations on stochastic models often involve a Markov chain {ζ_n}, n ≥ 0, with state space in R+ and a transition probability Q. For each x ∈ R+ the support of Q(x, ·) is [0, x]. This implies that ζ_0 ≥ ζ_1 ≥ …. Under certain regularity assumptions on Q we show that Q^n(x, B_u) → 1 as n → ∞ for all u > 0, and that 1 − Q^n(x, B_u) ≤ [1 − Q(x, B_u)]^n, where B_u = [0, u). Set τ_0 = max{k; ζ_k = ζ_0}, τ_n = max{k; ζ_k = ζ_{τ_{n−1}+1}} and write X_n = ζ_{τ_{n−1}+1}, T_n = τ_n − τ_{n−1}. We investigate some properties of the imbedded Markov chain {X_n} and of {T_n}. We determine all the marginal distributions of {T_n} and show that it is asymptoticall
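
Entry 50 studies a monotone Markov chain whose kernel Q(x, ·) is supported on [0, x], together with the embedded chain of distinct values X_n and the associated holding times T_n. The toy simulation below assumes a specific kernel (hold at x with probability 0.3, otherwise jump to a uniform point of [0, x)) purely to illustrate the monotone sample paths; the bookkeeping for X_n and T_n is simplified relative to the abstract's definitions.

import numpy as np

rng = np.random.default_rng(3)

def step(x, stay_prob=0.3):
    # Assumed illustrative kernel with support [0, x]: hold at x or drop uniformly.
    return x if rng.uniform() < stay_prob else rng.uniform(0.0, x)

zeta = [1.0]
for _ in range(20):
    zeta.append(step(zeta[-1]))

# Distinct values (stand-in for X_n) and run lengths (stand-in for T_n).
X, T = [zeta[0]], []
count = 1
for value in zeta[1:]:
    if value == X[-1]:
        count += 1
    else:
        T.append(count)
        X.append(value)
        count = 1
T.append(count)

print("monotone path:", [round(z, 3) for z in zeta])
print("distinct values X_n:", [round(v, 3) for v in X])
print("holding times T_n:", T)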