Academic literature on the topic 'Bayesian paradigms'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Bayesian paradigms.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Bayesian paradigms"

1. Liu, Zhi-Qiang. "Bayesian Paradigms in Image Processing." International Journal of Pattern Recognition and Artificial Intelligence 11, no. 01 (1997): 3–33. http://dx.doi.org/10.1142/s0218001497000020.

Abstract:
A large number of image and spatial information processing problems involve the estimation of the intrinsic image information from observed images, for instance, image restoration, image registration, image partition, depth estimation, shape reconstruction and motion estimation. These are inverse problems and generally ill-posed. Such estimation problems can be readily formulated by Bayesian models which infer the desired image information from the measured data. Bayesian paradigms have played a very important role in spatial data analysis for over three decades and have found many successful applications. In this paper, we discuss several aspects of Bayesian paradigms: uncertainty present in the observed image, prior distribution modeling, Bayesian-based estimation techniques in image processing, particularly the maximum a posteriori estimator and Kalman filtering theory, robustness, and Markov random fields and applications.
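The maximum a posteriori (MAP) estimator mentioned in this abstract has a closed form in the simplest conjugate case. As a hedged illustration (a one-pixel Gaussian model, not the paper's Markov-random-field formulation): observing intensity y with noise variance sigma2, under a Gaussian prior N(mu0, tau2) on the true intensity, the MAP estimate is the precision-weighted average of the observation and the prior mean.

```python
def map_gaussian(y, sigma2, mu0, tau2):
    """MAP estimate of x given y ~ N(x, sigma2) and prior x ~ N(mu0, tau2).

    With a Gaussian likelihood and a Gaussian prior, the posterior mode is
    the precision-weighted average of the observation and the prior mean.
    """
    return (y / sigma2 + mu0 / tau2) / (1.0 / sigma2 + 1.0 / tau2)

# Equal precisions: the estimate falls halfway between data and prior.
print(map_gaussian(10.0, 1.0, 0.0, 1.0))  # 5.0
```

As the prior variance tau2 grows, the estimate approaches the raw observation, which is the sense in which the prior "regularizes" an ill-posed inverse problem.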
2. Oaksford, Mike, and Nick Chater. "New Paradigms in the Psychology of Reasoning." Annual Review of Psychology 71, no. 1 (2020): 305–30. http://dx.doi.org/10.1146/annurev-psych-010419-051132.

Abstract:
The psychology of verbal reasoning initially compared performance with classical logic. In the last 25 years, a new paradigm has arisen, which focuses on knowledge-rich reasoning for communication and persuasion and is typically modeled using Bayesian probability theory rather than logic. This paradigm provides a new perspective on argumentation, explaining the rational persuasiveness of arguments that are logical fallacies. It also helps explain how and why people stray from logic when given deductive reasoning tasks. What appear to be erroneous responses, when compared against logic, often turn out to be rationally justified when seen in the richer rational framework of the new paradigm. Moreover, the same approach extends naturally to inductive reasoning tasks, in which people extrapolate beyond the data they are given and logic does not readily apply. We outline links between social and individual reasoning and set recent developments in the psychology of reasoning in the wider context of Bayesian cognitive science.
3. Neupert, Shevaun D., Claire M. Growney, Xianghe Zhu, Julia K. Sorensen, Emily L. Smith, and Jan Hannig. "BFF: Bayesian, Fiducial, and Frequentist Analysis of Cognitive Engagement among Cognitively Impaired Older Adults." Entropy 23, no. 4 (2021): 428. http://dx.doi.org/10.3390/e23040428.

Abstract:
Engagement in cognitively demanding activities is beneficial to preserving cognitive health. Our goal was to demonstrate the utility of frequentist, Bayesian, and fiducial statistical methods for evaluating the robustness of effects in identifying factors that contribute to cognitive engagement for older adults experiencing cognitive decline. We collected a total of 504 observations across two longitudinal waves of data from 28 cognitively impaired older adults. Participants’ systolic blood pressure responsivity, an index of cognitive engagement, was continuously sampled during cognitive testing. Participants reported on physical and mental health challenges and provided hair samples to assess chronic stress at each wave. Using the three statistical paradigms, we compared results from six model testing levels and longitudinal changes in health and stress predicting changes in cognitive engagement. Findings were mostly consistent across the three paradigms, providing additional confidence in determining effects. We extend selective engagement theory to cognitive impairment, noting that health challenges and stress appear to be important moderators. Further, we emphasize the utility of the Bayesian and fiducial paradigms for use with relatively small sample sizes because they are not based on asymptotic distributions. In particular, the fiducial paradigm is a useful tool because it provides more information than p values without the need to specify prior distributions, which may unduly influence the results based on a small sample. We provide the R code used to develop and implement all models.
4. Ly, Alexander, Akash Raj, Alexander Etz, Maarten Marsman, Quentin F. Gronau, and Eric-Jan Wagenmakers. "Bayesian Reanalyses From Summary Statistics: A Guide for Academic Consumers." Advances in Methods and Practices in Psychological Science 1, no. 3 (2018): 367–74. http://dx.doi.org/10.1177/2515245918779348.

Abstract:
Across the social sciences, researchers have overwhelmingly used the classical statistical paradigm to draw conclusions from data, often focusing heavily on a single number: p. Recent years, however, have witnessed a surge of interest in an alternative statistical paradigm: Bayesian inference, in which probabilities are attached to parameters and models. We feel it is informative to provide statistical conclusions that go beyond a single number, and—regardless of one’s statistical preference—it can be prudent to report the results from both the classical and the Bayesian paradigms. In order to promote a more inclusive and insightful approach to statistical inference, we show how the Summary Stats module in the open-source software program JASP (https://jasp-stats.org) can provide comprehensive Bayesian reanalyses from just a few commonly reported summary statistics, such as t and N. These Bayesian reanalyses allow researchers—and also editors, reviewers, readers, and reporters—to (a) quantify evidence on a continuous scale using Bayes factors, (b) assess the robustness of that evidence to changes in the prior distribution, and (c) gauge which posterior parameter ranges are more credible than others by examining the posterior distribution of the effect size. The procedure is illustrated using Festinger and Carlsmith’s (1959) seminal study on cognitive dissonance.
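The reanalyses described here use JASP's default priors. As a rough, hedged stand-in, a Bayes factor for a one-sample t-test can be computed from just t and N via the BIC-based unit-information approximation (Wagenmakers, 2007), which is simpler than, and distinct from, JASP's default JZS Bayes factor:

```python
import math

def bf01_from_t(t, n):
    """BIC-approximation Bayes factor in favour of the null (one-sample t-test).

    BF01 ~= sqrt(n) * (1 + t^2 / df)^(-n / 2), with df = n - 1.
    This is the unit-information approximation, not JASP's default JZS prior.
    """
    df = n - 1
    return math.sqrt(n) * (1.0 + t * t / df) ** (-n / 2.0)

# t = 0 carries no evidence against the null, so BF01 = sqrt(n).
print(bf01_from_t(0.0, 25))  # 5.0
```

Values above 1 favour the null and values below 1 favour the alternative; a large t quickly drives BF01 toward zero.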
5. Bojinov, Iavor I., Natesh S. Pillai, and Donald B. Rubin. "Diagnosing missing always at random in multivariate data." Biometrika 107, no. 1 (2019): 246–53. http://dx.doi.org/10.1093/biomet/asz061.

Abstract:
Models for analysing multivariate datasets with missing values require strong, often unassessable, assumptions. The most common of these is that the mechanism that created the missing data is ignorable, which is a two-fold assumption dependent on the mode of inference. The first part, which is the focus here, under the Bayesian and direct-likelihood paradigms requires that the missing data be missing at random; in contrast, the frequentist-likelihood paradigm demands that the missing data mechanism always produce missing at random data, a condition known as missing always at random. Under certain regularity conditions, assuming missing always at random leads to a condition that can be tested using the observed data alone, namely that the missing data indicators depend only on fully observed variables. In this note we propose three different diagnostic tests that not only indicate when this assumption is incorrect but also suggest which variables are the most likely culprits. Although missing always at random is not a necessary condition to ensure validity under the Bayesian and direct-likelihood paradigms, it is sufficient, and evidence of its violation should encourage the careful statistician to conduct targeted sensitivity analyses.
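The paper's three diagnostics are more involved, but a basic ingredient of any such check is measuring whether a missingness indicator is associated with the values of another variable. A toy sketch (hypothetical data and a crude standardized mean difference, not the authors' actual tests):

```python
import statistics

# Hypothetical covariate x and a missingness indicator for some variable y:
# here y happens to be missing exactly where x is large.
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y_missing = [xi > 6 for xi in x]

obs = [xi for xi, m in zip(x, y_missing) if not m]  # x where y is observed
mis = [xi for xi, m in zip(x, y_missing) if m]      # x where y is missing

def std_mean_diff(a, b):
    # Difference in group means, scaled by the overall spread.
    pooled = statistics.pstdev(a + b)
    return abs(statistics.mean(a) - statistics.mean(b)) / pooled

d = std_mean_diff(obs, mis)
print(round(d, 2))  # 1.74 -- a large value flags that missingness tracks x
```

A formal diagnostic would attach a reference distribution to such a statistic; the point here is only that association between missingness indicators and data values is directly computable from the observed data.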
6. Alotaibi, Refah, Lamya A. Baharith, Ehab M. Almetwally, Mervat Khalifa, Indranil Ghosh, and Hoda Rezk. "Statistical Inference on a Finite Mixture of Exponentiated Kumaraswamy-G Distributions with Progressive Type II Censoring Using Bladder Cancer Data." Mathematics 10, no. 15 (2022): 2800. http://dx.doi.org/10.3390/math10152800.

Abstract:
A new family of distributions called the mixture of the exponentiated Kumaraswamy-G (henceforth, in short, ExpKum-G) class is developed. We consider Weibull distribution as the baseline (G) distribution to propose and study this special sub-model, which we call the exponentiated Kumaraswamy Weibull distribution. Several useful statistical properties of the proposed ExpKum-G distribution are derived. Under the classical paradigm, we consider the maximum likelihood estimation under progressive type II censoring to estimate the model parameters. Under the Bayesian paradigm, independent gamma priors are proposed to estimate the model parameters under progressive type II censored samples, assuming several loss functions. A simulation study is carried out to illustrate the efficiency of the proposed estimation strategies under both classical and Bayesian paradigms, based on progressive type II censoring models. For illustrative purposes, a real data set is considered that exhibits that the proposed model in the new class provides a better fit than other types of finite mixtures of exponentiated Kumaraswamy-type models.
7. Neupert, Shevaun D., and Jan Hannig. "BFF: Bayesian, Fiducial, Frequentist Analysis of Age Effects in Daily Diary Data." Journals of Gerontology: Series B 75, no. 1 (2019): 67–79. http://dx.doi.org/10.1093/geronb/gbz100.

Abstract:
Objectives: We apply new statistical models to daily diary data to advance both methodological and conceptual goals. We examine age effects in within-person slopes in daily diary data and introduce Generalized Fiducial Inference (GFI), which provides a compromise between frequentist and Bayesian inference. We use daily stressor exposure data across six domains to generate within-person emotional reactivity slopes with daily negative affect. We test for systematic age differences and similarities in these reactivity slopes, which are inconsistent in previous research.
Method: One hundred and eleven older (aged 60–90) and 108 younger (aged 18–36) adults responded to daily stressor and negative affect questions each day for eight consecutive days, resulting in 1,438 total days. Daily stressor domains included arguments, avoided arguments, work/volunteer stressors, home stressors, network stressors, and health-related stressors.
Results: Using Bayesian, GFI, and frequentist paradigms, we compared results for the six stressor domains with a focus on interpreting age effects in within-person reactivity. Multilevel models suggested null age effects in emotional reactivity across each of the paradigms within the domains of avoided arguments, work/volunteer stressors, home stressors, and health-related stressors. However, the models diverged with respect to null age effects in emotional reactivity to arguments and network stressors.
Discussion: The three paradigms converged on null age effects in reactivity for four of the six stressor domains. GFI is a useful tool that provides additional information when making determinations regarding null age effects in within-person slopes. We provide the code for readers to apply these models to their own data.
8. Kabanda, Gabriel. "Bayesian Network Model for a Zimbabwean Cybersecurity System." Oriental Journal of Computer Science and Technology 12, no. 4 (2020): 147–67. http://dx.doi.org/10.13005/ojcst12.04.02.

Abstract:
The purpose of this research was to develop a structure for a network intrusion detection and prevention system based on the Bayesian Network for use in Cybersecurity. The phenomenal growth in the use of internet-based technologies has resulted in complexities in cybersecurity subjecting organizations to cyberattacks. What is required is a network intrusion detection and prevention system based on the Bayesian Network structure for use in Cybersecurity. Bayesian Networks (BNs) are defined as graphical probabilistic models for multivariate analysis and are directed acyclic graphs that have an associated probability distribution function. The research determined the cybersecurity framework appropriate for a developing nation; evaluated network detection and prevention systems that use Artificial Intelligence paradigms such as finite automata, neural networks, genetic algorithms, fuzzy logic, support-vector machines or diverse data-mining-based approaches; analysed Bayesian Networks that can be represented as graphical models and are directional to represent cause-effect relationships; and developed a Bayesian Network model that can handle complexity in cybersecurity. The theoretical framework on Bayesian Networks was largely informed by the NIST Cybersecurity Framework, general deterrence theory, game theory, complexity theory and data mining techniques. The pragmatism paradigm used in this research is, as a philosophy, intricately related to mixed methods research (MMR). A mixed method approach was used in this research, which is largely quantitative with the research design being a survey and an experiment, but supported by qualitative approaches where focus group discussions were held. The performance of Support Vector Machines, Artificial Neural Network, K-Nearest Neighbour, Naive-Bayes and Decision Tree algorithms was discussed. Alternative improved solutions discussed include the use of machine learning algorithms, specifically Artificial Neural Networks (ANN), Decision Tree C4.5, Random Forests and Support Vector Machines (SVM).
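The cause-effect structure described above can be illustrated on the smallest possible network. A hedged toy example (hypothetical probabilities and a single Attack -> Alert edge, not the paper's full model): Bayes' rule gives the posterior probability of an attack once an alert fires.

```python
def posterior_attack(p_attack=0.01, p_alert_attack=0.95, p_alert_benign=0.05):
    """P(Attack | Alert) for a two-node network Attack -> Alert."""
    num = p_alert_attack * p_attack                  # P(Alert, Attack)
    den = num + p_alert_benign * (1.0 - p_attack)    # P(Alert)
    return num / den

# Even a fairly reliable detector yields a modest posterior when the
# base rate of attacks is low (the base-rate effect).
print(round(posterior_attack(), 3))  # 0.161
```

Larger networks chain such conditional-probability tables along the directed acyclic graph; the inference principle is the same.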
9. Laurens, Jean, Dominik Straumann, and Bernhard J. M. Hess. "Processing of Angular Motion and Gravity Information Through an Internal Model." Journal of Neurophysiology 104, no. 3 (2010): 1370–81. http://dx.doi.org/10.1152/jn.00143.2010.

Abstract:
The vestibular organs in the base of the skull provide important information about head orientation and motion in space. Previous studies have suggested that both angular velocity information from the semicircular canals and information about head orientation and translation from the otolith organs are centrally processed in an internal model of head motion, using the principles of optimal estimation. This concept has been successfully applied to model behavioral responses to classical vestibular motion paradigms. This study measured the dynamics of the vestibuloocular reflex (VOR) during postrotatory tilt, tilt during the optokinetic afternystagmus, and off-vertical axis rotation. The influence of the otolith signal on the VOR was systematically varied by using a series of tilt angles. We found that the time constants of responses varied almost identically as a function of gravity in these paradigms. We show that Bayesian modeling could predict the experimental results in an accurate and consistent manner. In contrast to other approaches, the Bayesian model also provides a plausible explanation of why these vestibulo-oculomotor responses occur as a consequence of an internal process of optimal motion estimation.
10. Guo, Jeff, Bojana Ranković, and Philippe Schwaller. "Bayesian Optimization for Chemical Reactions." CHIMIA 77, no. 1/2 (2023): 31. http://dx.doi.org/10.2533/chimia.2023.31.

Abstract:
Reaction optimization is challenging and traditionally delegated to domain experts who iteratively propose increasingly optimal experiments. Problematically, the reaction landscape is complex and often requires hundreds of experiments to reach convergence, representing an enormous resource sink. Bayesian optimization (BO) is an optimization algorithm that recommends the next experiment based on previous observations and has recently gained considerable interest in the general chemistry community. The application of BO for chemical reactions has been demonstrated to increase efficiency in optimization campaigns and can recommend favorable reaction conditions amidst many possibilities. Moreover, its ability to jointly optimize desired objectives such as yield and stereoselectivity makes it an attractive alternative or at least complementary to domain expert-guided optimization. With the democratization of BO software, the barrier of entry to applying BO for chemical reactions has drastically lowered. The intersection between the paradigms will see advancements at an ever-rapid pace. In this review, we discuss how chemical reactions can be transformed into machine-readable formats which can be learned by machine learning (ML) models. We present a foundation for BO and how it has already been applied to optimize chemical reaction outcomes. The important message we convey is that realizing the full potential of ML-augmented reaction optimization will require close collaboration between experimentalists and computational scientists.
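The propose-measure-update loop this review describes can be sketched compactly. A hedged, self-contained illustration (a plain-Python Gaussian-process surrogate with an upper-confidence-bound acquisition on a hypothetical one-dimensional "yield" surface, not any of the reviewed BO packages):

```python
import math

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel.
    return math.exp(-((a - b) ** 2) / (2 * ls * ls))

def solve(A, b):
    # Gaussian elimination with partial pivoting (A is small and square).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, xq):
    # Posterior mean and standard deviation of a near-noise-free GP at xq.
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (1e-8 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    ks = [rbf(x, xq) for x in xs]
    mu = sum(k * a for k, a in zip(ks, alpha))
    v = solve(K, ks)
    var = max(rbf(xq, xq) - sum(k * w for k, w in zip(ks, v)), 0.0)
    return mu, math.sqrt(var)

def bayes_opt(f, iters=6):
    # Maximize f on [0, 1]: pick the grid point with the highest
    # upper confidence bound, measure it, update the surrogate, repeat.
    xs, ys = [0.0, 1.0], [f(0.0), f(1.0)]
    grid = [i / 100 for i in range(101)]
    for _ in range(iters):
        best_x, best_ucb = None, -float("inf")
        for g in grid:
            if any(abs(g - x) < 1e-9 for x in xs):
                continue  # skip conditions already measured
            mu, sd = gp_posterior(xs, ys, g)
            if mu + 2.0 * sd > best_ucb:
                best_x, best_ucb = g, mu + 2.0 * sd
        xs.append(best_x)
        ys.append(f(best_x))
    i = max(range(len(ys)), key=lambda j: ys[j])
    return xs[i], ys[i]

# Hypothetical smooth "reaction yield" surface peaking at x = 0.7.
best_x, best_y = bayes_opt(lambda x: -(x - 0.7) ** 2)
```

A real campaign would use a maintained BO library with multi-dimensional condition encodings and possibly multi-objective acquisitions; the point here is only the loop itself, in which each new experiment is chosen by balancing the surrogate's predicted mean against its remaining uncertainty.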