Academic literature on the topic 'Deviance information criterion'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Deviance information criterion.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Deviance information criterion"
Spiegelhalter, David J., Nicola G. Best, Bradley P. Carlin, and Angelika van der Linde. "The deviance information criterion: 12 years on." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 76, no. 3 (April 8, 2014): 485–93. http://dx.doi.org/10.1111/rssb.12062.
Berg, Andreas, Renate Meyer, and Jun Yu. "Deviance Information Criterion for Comparing Stochastic Volatility Models." Journal of Business & Economic Statistics 22, no. 1 (January 2004): 107–20. http://dx.doi.org/10.1198/073500103288619430.
Kadhem, Safaa K., Paul Hewson, and Irene Kaimi. "Recursive Deviance Information Criterion for the Hidden Markov Model." International Journal of Statistics and Probability 5, no. 1 (December 22, 2015): 61. http://dx.doi.org/10.5539/ijsp.v5n1p61.
Quintero, Adrian, and Emmanuel Lesaffre. "Comparing hierarchical models via the marginalized deviance information criterion." Statistics in Medicine 37, no. 16 (March 26, 2018): 2440–54. http://dx.doi.org/10.1002/sim.7649.
Shriner, Daniel, and Nengjun Yi. "Deviance information criterion (DIC) in Bayesian multiple QTL mapping." Computational Statistics & Data Analysis 53, no. 5 (March 2009): 1850–60. http://dx.doi.org/10.1016/j.csda.2008.01.016.
Chan, Joshua C. C., and Angelia L. Grant. "On the Observed-Data Deviance Information Criterion for Volatility Modeling." Journal of Financial Econometrics 14, no. 4 (April 6, 2016): 772–802. http://dx.doi.org/10.1093/jjfinec/nbw002.
Fung, Thomas, Joanna J. J. Wang, and Eugene Seneta. "The Deviance Information Criterion in Comparison of Normal Mixing Models." International Statistical Review 82, no. 3 (August 22, 2014): 411–21. http://dx.doi.org/10.1111/insr.12063.
Li, Yong, Jun Yu, and Tao Zeng. "Deviance information criterion for latent variable models and misspecified models." Journal of Econometrics 216, no. 2 (June 2020): 450–93. http://dx.doi.org/10.1016/j.jeconom.2019.11.002.
Liu, Haiyan, Sarah Depaoli, and Lydia Marvin. "Understanding the Deviance Information Criterion for SEM: Cautions in Prior Specification." Structural Equation Modeling: A Multidisciplinary Journal 29, no. 2 (November 17, 2021): 278–94. http://dx.doi.org/10.1080/10705511.2021.1994407.
Pooley, C. M., and G. Marion. "Bayesian model evidence as a practical alternative to deviance information criterion." Royal Society Open Science 5, no. 3 (March 2018): 171519. http://dx.doi.org/10.1098/rsos.171519.
Full textDissertations / Theses on the topic "Deviance information criterion"
Tran, Thu Trung. "Bayesian model estimation and comparison for longitudinal categorical data." Thesis, Queensland University of Technology, 2008. https://eprints.qut.edu.au/19240/1/Thu_Tran_Thesis.pdf.
Full textSarini, Sarini. "Statistical methods for modelling falls and symptoms progression in patients with early stages of Parkinson's disease." Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/116208/1/_Sarini_Thesis.pdf.
Full textShahtahmassebi, Golnaz. "Bayesian modelling of ultra high-frequency financial data." Thesis, University of Plymouth, 2011. http://hdl.handle.net/10026.1/894.
Full textGrundler, Giulia. "Analisi ed estensione con criteri di preferenza di un algoritmo per process discovery di modelli dichiarativi." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.
Chen, Carla Chia-Ming. "Bayesian methodology for genetics of complex diseases." Thesis, Queensland University of Technology, 2010. https://eprints.qut.edu.au/43357/1/Carla_Chen_Thesis.pdf.
Wang, Xiaokun. "Capturing patterns of spatial and temporal autocorrelation in ordered response data: a case study of land use and air quality changes in Austin, Texas." Thesis, 2007. http://hdl.handle.net/2152/29686.
Mitsakakis, Nikolaos. "Bayesian Methods in Gaussian Graphical Models." Thesis, 2010. http://hdl.handle.net/1807/24831.
Full textChekouo, Tekougang Thierry. "Modélisation des bi-grappes et sélection des variables pour des données de grande dimension : application aux données d’expression génétique." Thèse, 2012. http://hdl.handle.net/1866/8946.
Full textClustering is a classical method to analyse gene expression data. When applied to the rows (e.g. genes), each column belongs to all clusters. However, it is often observed that the genes of a subset of genes are co-regulated and co-expressed in a subset of conditions, but behave almost independently under other conditions. For these reasons, biclustering techniques have been proposed to look for sub-matrices of a data matrix. Biclustering is a simultaneous clustering of rows and columns of a data matrix. Most of the biclustering algorithms proposed in the literature have no statistical foundation. It is interesting to pay attention to the underlying models of these algorithms and develop statistical models to obtain significant biclusters. In this thesis, we review some biclustering algorithms that seem to be most popular. We group these algorithms in accordance to the type of homogeneity in the bicluster and the type of overlapping that may be encountered. We shed light on statistical models that can justify these algorithms. It turns out that some techniques can be justified in a Bayesian framework. We develop an extension of the biclustering plaid model in a Bayesian framework and we propose a measure of complexity for biclustering. The deviance information criterion (DIC) is used to select the number of biclusters. Studies on gene expression data and simulated data give satisfactory results. To our knowledge, the biclustering algorithms assume that genes and experimental conditions are independent entities. These algorithms do not incorporate prior biological information that could be available on genes and conditions. We introduce a new Bayesian plaid model for gene expression data which integrates biological knowledge and takes into account the pairwise interactions between genes and between conditions via a Gibbs field. Dependence between these entities is made from relational graphs, one for genes and another for conditions. The graph of the genes and conditions is constructed by the k-nearest neighbors and allows to define a priori distribution of labels as auto-logistic models. The similarities of genes are calculated using gene ontology (GO). To estimate the parameters, we adopt a hybrid procedure that mixes MCMC with a variant of the Wang-Landau algorithm. Experiments on simulated and real data show the performance of our approach. It should be noted that there may be several variables of noise in microarray data. These variables may mask the true structure of the clustering. Inspired by the plaid model, we propose a model that simultaneously finds the true clustering structure and identifies discriminating variables. We propose a new model to solve the problem. It assumes that an observation can be explained by more than one cluster. This problem is addressed by using a binary latent vector, so the estimation is obtained via the Monte Carlo EM algorithm. Importance Sampling is used to reduce the computational cost of the Monte Carlo sampling at each step of the EM algorithm. Numerical examples demonstrate the usefulness of these methods in terms of variable selection and clustering.
The simulations were implemented in Java.
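For readers comparing the works listed here, the quantity they all build on is the classical DIC of Spiegelhalter et al. (2002): DIC = Dbar + pD, with pD = Dbar - D(theta_bar), where D(theta) = -2 log p(y | theta), Dbar is the posterior mean deviance, and theta_bar is the posterior mean of the parameters. The following is a minimal, generic sketch of that computation from MCMC draws; it is not code from any of the cited works, and the names dic_from_draws and log_lik are illustrative placeholders for whatever sampler output and likelihood a given model provides.

```python
import numpy as np

def dic_from_draws(draws, y, log_lik):
    """Classical DIC from posterior samples.

    draws   : (n_samples, n_params) array of posterior MCMC draws
    log_lik : callable(theta, y) returning the log-likelihood of y at theta
    """
    deviances = np.array([-2.0 * log_lik(theta, y) for theta in draws])
    d_bar = deviances.mean()                           # posterior mean deviance
    d_at_mean = -2.0 * log_lik(draws.mean(axis=0), y)  # deviance at the posterior mean
    p_d = d_bar - d_at_mean                            # effective number of parameters
    return d_bar + p_d, p_d

# Toy check: normal data with known unit variance and a flat prior on the mean,
# so the posterior of the mean is N(ybar, 1/n).
rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=50)
draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=(2000, 1))
log_lik = lambda theta, data: -0.5 * np.sum((data - theta[0]) ** 2 + np.log(2.0 * np.pi))
dic, p_d = dic_from_draws(draws, y, log_lik)
print(f"DIC = {dic:.2f}, pD = {p_d:.2f}")
```

In this toy example pD should come out close to 1, the single free parameter, which is the usual sanity check for this estimator of model complexity; several of the works above (e.g. Li, Yu, and Zeng 2020; Chan and Grant 2016) study how this plug-in construction behaves, or fails, for latent-variable and volatility models.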
Book chapters on the topic "Deviance information criterion"
Wüthrich, Mario V., and Michael Merz. "Predictive Modeling and Forecast Evaluation." In Springer Actuarial, 75–110. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-12409-9_4.
Zeng, Tao, Yong Li, and Jun Yu. "Deviance Information Criterion for Comparing VAR Models." In Essays in Honor of Peter C. B. Phillips, 615–37. Emerald Group Publishing Limited, 2014. http://dx.doi.org/10.1108/s0731-905320140000033017.
Donovan, Therese M., and Ruth M. Mickey. "The Survivor Problem Continued: Introduction to Bayesian Model Selection." In Bayesian Statistics for Beginners, 308–24. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198841296.003.0018.
Inchausti, Pablo. "Model Selection." In Statistical Modeling With R, 169–88. Oxford: Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780192859013.003.0007.
Anderson, Raymond A. "Stats & Maths & Unicorns." In Credit Intelligence & Modelling, 405–34. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780192844194.003.0011.
Full textConference papers on the topic "Deviance information criterion"
Doong, Shing H., and Tean Q. Lee. "Causal driver detection with deviance information criterion." In 2010 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2010. http://dx.doi.org/10.1109/icmlc.2010.5580778.