
Journal articles on the topic 'Bayesian intelligence'


Consult the top 50 journal articles for your research on the topic 'Bayesian intelligence.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Zelterman, Daniel. "Bayesian Artificial Intelligence." Technometrics 47, no. 1 (2005): 101–2. http://dx.doi.org/10.1198/tech.2005.s836.

2

Ramoni, Marco F. "Bayesian Artificial Intelligence." Journal of the American Statistical Association 100, no. 471 (2005): 1096–97. http://dx.doi.org/10.1198/jasa.2005.s39.

3

Jensen, Finn V. "Bayesian Artificial Intelligence." Pattern Analysis and Applications 7, no. 2 (2004): 221–23. http://dx.doi.org/10.1007/s10044-004-0214-5.

4

Vreeswijk, Gerard A. W. "Book Review: Bayesian Artificial Intelligence." Artificial Intelligence and Law 11, no. 4 (2003): 289–98. http://dx.doi.org/10.1023/b:arti.0000045970.25670.25.

5

Pascual-Garcia, Erica, and Guillermo De la Torre-Gea. "Bayesian Analysis to the experiences of corruption through Artificial Intelligence." International Journal of Trend in Scientific Research and Development 2, no. 2 (2018): 103–7. http://dx.doi.org/10.31142/ijtsrd2443.

6

Geetha, V., C. K. Gomathy, S. Aravind, and V. Venkata Surya. "UNDERSTANDING BAYES RULE: BAYESIAN NETWORKS IN ARTIFICIAL INTELLIGENCE." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 07, no. 11 (2023): 1–11. http://dx.doi.org/10.55041/ijsrem27093.

Abstract:
Bayes' Rule and Bayesian Networks are foundational elements of AI, enabling probabilistic reasoning and informed decision-making in uncertain domains. This article introduces the core concepts and practical applications of these tools. We explore the historical origins, step-by-step construction of Bayesian Networks, and real-world AI applications. By understanding Bayes' Rule and Bayesian Networks, readers can unlock their potential to tackle complex AI challenges and uncertainties. And this article underscores the undeniable importance of Bayes' Rule and Bayesian Networks in AI. We hope to i
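
As a pointer to what the entry above surveys, here is a minimal sketch of a single Bayes' rule update in Python; the machine-fault/alarm scenario and every number in it are hypothetical, chosen only to illustrate the prior-times-likelihood-over-evidence computation.

```python
# Minimal Bayes' rule update: P(H | E) = P(E | H) * P(H) / P(E).
# The machine-fault/alarm scenario and all numbers are hypothetical.

def bayes_update(prior: float, likelihood: float, false_alarm_rate: float) -> float:
    """Posterior probability of the hypothesis after observing the evidence."""
    evidence = likelihood * prior + false_alarm_rate * (1.0 - prior)  # P(E) by total probability
    return likelihood * prior / evidence

prior = 0.01             # P(fault): assumed base rate of a machine fault
likelihood = 0.95        # P(alarm | fault)
false_alarm_rate = 0.10  # P(alarm | no fault)
posterior = bayes_update(prior, likelihood, false_alarm_rate)
print(f"P(fault | alarm) = {posterior:.3f}")  # ~0.088: still small despite the alarm
```

A Bayesian network organises many such conditional probabilities over a graph so that updates like this can be chained across variables.
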
7

Muhsina, Elvanisa Ayu, and Nurochman Nurochman. "SISTEM PAKAR REKOMENDASI PROFESI BERDASARKAN MULTIPLE INTELLIGENCES MENGGUNAKAN TEOREMA BAYESIAN" [Expert System for Profession Recommendation Based on Multiple Intelligences Using the Bayesian Theorem]. JISKA (Jurnal Informatika Sunan Kalijaga) 2, no. 1 (2017): 16. http://dx.doi.org/10.14421/jiska.2017.21-03.

Abstract:
Intelligence is perhaps one of the most logical ways to determine how smart a person is. That fact has always been a problem at work, because there are many jobs that attract people but require a high GPA. An employee with a high GPA does not always fit their skills and work role, and may be unable to understand and maintain their performance. This expert system is needed to recommend a profession using intelligence. This research uses a Bayesian theorem calculation to find probability values and a job recommendation: the user's MI (Multiple Intelligences) values, the MI probability for a job…
8

TERZIYAN, VAGAN. "A BAYESIAN METANETWORK." International Journal on Artificial Intelligence Tools 14, no. 03 (2005): 371–84. http://dx.doi.org/10.1142/s0218213005002156.

Abstract:
Bayesian network (BN) is known to be one of the most solid probabilistic modeling tools. The theory of BN provides already several useful modifications of a classical network. Among those there are context-enabled networks such as multilevel networks or recursive multinets, which can provide separate BN modelling for different combinations of contextual features' values. The main challenge of this paper is the multilevel probabilistic meta-model (Bayesian Metanetwork), which is an extension of traditional BN and modification of recursive multinets. It assumes that interoperability between comp
9

Paté-Cornell, Elisabeth. "Fusion of Intelligence Information: A Bayesian Approach." Risk Analysis 22, no. 3 (2002): 445–54. http://dx.doi.org/10.1111/0272-4332.00056.

10

Angelopoulos, Nicos, and James Cussens. "Bayesian learning of Bayesian networks with informative priors." Annals of Mathematics and Artificial Intelligence 54, no. 1-3 (2008): 53–98. http://dx.doi.org/10.1007/s10472-009-9133-x.

11

Sanghai, S., P. Domingos, and D. Weld. "Relational Dynamic Bayesian Networks." Journal of Artificial Intelligence Research 24 (December 2, 2005): 759–97. http://dx.doi.org/10.1613/jair.1625.

Abstract:
Stochastic processes that involve the creation of objects and relations over time are widespread, but relatively poorly studied. For example, accurate fault diagnosis in factory assembly processes requires inferring the probabilities of erroneous assembly operations, but doing this efficiently and accurately is difficult. Modeled as dynamic Bayesian networks, these processes have discrete variables with very large domains and extremely high dimensionality. In this paper, we introduce relational dynamic Bayesian networks (RDBNs), which are an extension of dynamic Bayesian networks (DBNs) to fir
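
For readers new to the dynamic Bayesian networks that RDBNs extend, the following sketch runs the standard predict-then-update (forward filtering) recursion on a deliberately tiny two-slice model with one binary hidden state; the transition and sensor probabilities are invented for illustration and have nothing to do with the paper's assembly-process domain.

```python
# Forward filtering in a minimal dynamic Bayesian network: one binary hidden
# state X_t ("ok"/"failed") and one binary observation ("alarm"). All numbers
# are invented for illustration.

# P(X_t | X_{t-1}): outer key is the previous state, inner key the next state.
TRANSITION = {"ok": {"ok": 0.9, "failed": 0.1},
              "failed": {"ok": 0.0, "failed": 1.0}}
# P(alarm | X_t)
SENSOR = {"ok": 0.05, "failed": 0.8}

def forward_step(belief, alarm):
    """One predict-then-update step of the filtering recursion."""
    # Predict: P(X_t | evidence so far) = sum over previous states of transition * old belief.
    predicted = {s: sum(TRANSITION[prev][s] * belief[prev] for prev in belief)
                 for s in ("ok", "failed")}
    # Update: weight by the sensor model, then renormalise.
    weighted = {s: (SENSOR[s] if alarm else 1.0 - SENSOR[s]) * predicted[s]
                for s in predicted}
    z = sum(weighted.values())
    return {s: w / z for s, w in weighted.items()}

belief = {"ok": 0.99, "failed": 0.01}          # hypothetical initial belief
for t, alarm in enumerate([False, True, True], start=1):
    belief = forward_step(belief, alarm)
    print(f"t={t}  P(failed | evidence so far) = {belief['failed']:.3f}")
```
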
12

Tang, Xiao-liang, and Min Han. "Semi-supervised Bayesian ARTMAP." Applied Intelligence 33, no. 3 (2009): 302–17. http://dx.doi.org/10.1007/s10489-009-0167-x.

13

Klami, Arto. "Bayesian object matching." Machine Learning 92, no. 2-3 (2013): 225–50. http://dx.doi.org/10.1007/s10994-013-5357-4.

14

Rosman, Benjamin, Majd Hawasly, and Subramanian Ramamoorthy. "Bayesian policy reuse." Machine Learning 104, no. 1 (2016): 99–127. http://dx.doi.org/10.1007/s10994-016-5547-y.

15

Delcroix, Véronique, Mohamed-Amine Maalej, and Sylvain Piechowiak. "BAYESIAN NETWORKS VERSUS OTHER PROBABILISTIC MODELS FOR THE MULTIPLE DIAGNOSIS OF LARGE DEVICES." International Journal on Artificial Intelligence Tools 16, no. 03 (2007): 417–33. http://dx.doi.org/10.1142/s0218213007003345.

Abstract:
Multiple diagnosis methods using Bayesian networks are rooted in numerous research projects about model-based diagnosis. Some of this research exploits probabilities to make a diagnosis. Many Bayesian network applications are used for medical diagnosis or for the diagnosis of technical problems in small or moderately large devices. This paper explains in detail the advantages of using Bayesian networks as graphic probabilistic models for diagnosing complex devices, and then compares such models with other probabilistic models that may or may not use Bayesian networks.
16

LIU, WEI-YI, and KUN YUE. "BAYESIAN NETWORK WITH INTERVAL PROBABILITY PARAMETERS." International Journal on Artificial Intelligence Tools 20, no. 05 (2011): 911–39. http://dx.doi.org/10.1142/s0218213011000449.

Abstract:
Interval data are widely used in real applications to represent the values of quantities in uncertain situations. However, the implied probabilistic causal relationships among interval-valued variables with interval data cannot be represented and inferred by general Bayesian networks with point-based probability parameters. Thus, it is desired to extend the general Bayesian network with effective mechanisms of representation, learning and inference of probabilistic causal relationships implied in interval data. In this paper, we define the interval probabilities, the bound-limited weak conditi
17

Francis, George, and Emil O. W. Kirkegaard. "National Intelligence and Economic Growth: A Bayesian Update." Mankind Quarterly 63, no. 1 (2022): 9–78. http://dx.doi.org/10.46469/mq.2022.63.1.2.

18

Jun, Sunghae. "Frequentist and Bayesian Learning Approaches to Artificial Intelligence." International Journal of Fuzzy Logic and Intelligent Systems 16, no. 2 (2016): 111–18. http://dx.doi.org/10.5391/ijfis.2016.16.2.111.

19

Garbolino, Paolo. "Bayesian theory and artificial intelligence: The quarrelsome marriage." International Journal of Man-Machine Studies 27, no. 5-6 (1987): 729–42. http://dx.doi.org/10.1016/s0020-7373(87)80027-5.

20

Sun, Xingping, Chang Chen, Lu Wang, Hongwei Kang, Yong Shen, and Qingyi Chen. "Hybrid Optimization Algorithm for Bayesian Network Structure Learning." Information 10, no. 10 (2019): 294. http://dx.doi.org/10.3390/info10100294.

Abstract:
Since the beginning of the 21st century, research on artificial intelligence has made great progress. Bayesian networks have gradually become one of the hotspots and important achievements in artificial intelligence research. Establishing an effective Bayesian network structure is the foundation and core of the learning and application of Bayesian networks. In Bayesian network structure learning, the traditional method of utilizing expert knowledge to construct the network structure is gradually replaced by the data learning structure method. However, as a result of the large amount of possibl
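
The score-based structure learning mentioned in this abstract can be illustrated in miniature: the sketch below scores two candidate structures over a pair of binary variables with a BIC-style score on synthetic data. A real learner, such as the hybrid algorithm in the paper, would combine such scoring with a search procedure over many candidate graphs.

```python
# Score-based Bayesian network structure learning in miniature: compare two
# candidate structures over binary variables A and B using a BIC-style score.
# The data set is synthetic; no search heuristics are shown here.
import math
from collections import Counter

def bic_score(data, child, parents):
    """Log-likelihood of `child` given `parents`, minus a BIC complexity penalty."""
    n = len(data)
    counts = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    loglik = sum(c * math.log(c / parent_counts[pa]) for (pa, _), c in counts.items())
    n_params = (2 - 1) * (2 ** len(parents))  # free parameters for a binary child
    return loglik - 0.5 * n_params * math.log(n)

# Synthetic observations of (A, B) with a clear A -> B dependence.
data = [{"A": a, "B": b} for a, b in
        [(0, 0)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10 + [(1, 1)] * 40]

score_independent = bic_score(data, "A", ()) + bic_score(data, "B", ())
score_a_to_b = bic_score(data, "A", ()) + bic_score(data, "B", ("A",))
print(f"BIC with A, B independent: {score_independent:.1f}")
print(f"BIC with A -> B:           {score_a_to_b:.1f}")  # higher, so A -> B is preferred
```
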
21

Mesa, Juan Felipe Correa, and Juan Carlos Correa Morales. "From Artificial Intelligence and Bayesian Statistics to Neuroanatomy: Connections, Analogies, and Applications." Migration Letters 21, S1 (2023): 162–82. http://dx.doi.org/10.59670/ml.v21is1.6005.

Abstract:
This study examines the interaction between artificial intelligence (AI), Bayesian statistics, and various key brain structures, such as the hippocampus, amygdala, and thalamic nuclei. The goal is to explore how Bayesian inference can contribute to the development of AI systems that simulate and optimize essential aspects of human cognition, such as decision-making, attention, learning, and memory. By analyzing the functions of these structures within the framework of Bayesian statistics, possible avenues are identified for improving the adaptability and efficiency of AI systems in problem-sol
22

Tang, Kewei, Zhixun Su, Jie Zhang, et al. "Bayesian rank penalization." Neural Networks 116 (August 2019): 246–56. http://dx.doi.org/10.1016/j.neunet.2019.04.018.

23

Monderer, D., and M. Tennenholtz. "Dynamic Non-Bayesian Decision Making." Journal of Artificial Intelligence Research 7 (November 1, 1997): 231–48. http://dx.doi.org/10.1613/jair.447.

Abstract:
The model of a non-Bayesian agent who faces a repeated game with incomplete information against Nature is an appropriate tool for modeling general agent-environment interactions. In such a model the environment state (controlled by Nature) may change arbitrarily, and the feedback/reward function is initially unknown. The agent is not Bayesian, that is he does not form a prior probability neither on the state selection strategy of Nature, nor on his reward function. A policy for the agent is a function which assigns an action to every history of observations and actions. Two basic feedback stru
24

Hubin, Aliaksandr, Geir Storvik, and Florian Frommlet. "Flexible Bayesian Nonlinear Model Configuration." Journal of Artificial Intelligence Research 72 (November 22, 2021): 901–42. http://dx.doi.org/10.1613/jair.1.13047.

Abstract:
Regression models are used in a wide range of applications providing a powerful scientific tool for researchers from different fields. Linear, or simple parametric, models are often not sufficient to describe complex relationships between input variables and a response. Such relationships can be better described through flexible approaches such as neural networks, but this results in less interpretable models and potential overfitting. Alternatively, specific parametric nonlinear functions can be used, but the specification of such functions is in general complicated. In this paper, we introdu
25

Yap, Ghim-Eng, Ah-Hwee Tan, and Hwee-Hwa Pang. "Explaining inferences in Bayesian networks." Applied Intelligence 29, no. 3 (2007): 263–78. http://dx.doi.org/10.1007/s10489-007-0093-8.

26

Kyburg, Henry E. "Bayesian and non-bayesian evidential updating." Artificial Intelligence 31, no. 3 (1987): 271–93. http://dx.doi.org/10.1016/0004-3702(87)90068-3.

27

Ordyniak, S., and S. Szeider. "Parameterized Complexity Results for Exact Bayesian Network Structure Learning." Journal of Artificial Intelligence Research 46 (March 5, 2013): 263–302. http://dx.doi.org/10.1613/jair.3744.

Abstract:
Bayesian network structure learning is the notoriously difficult problem of discovering a Bayesian network that optimally represents a given set of training data. In this paper we study the computational worst-case complexity of exact Bayesian network structure learning under graph theoretic restrictions on the (directed) super-structure. The super-structure is an undirected graph that contains as subgraphs the skeletons of solution networks. We introduce the directed super-structure as a natural generalization of its undirected counterpart. Our results apply to several variants of score-based
28

Frühwirth-Schnatter, Sylvia. "On fuzzy Bayesian inference." Fuzzy Sets and Systems 60, no. 1 (1993): 41–58. http://dx.doi.org/10.1016/0165-0114(93)90288-s.

29

Jamil, Waqas, and Abdelhamid Bouchachia. "Online Bayesian shrinkage regression." Neural Computing and Applications 32, no. 23 (2020): 17759–67. http://dx.doi.org/10.1007/s00521-020-04947-y.

30

Jing, Yushi, Vladimir Pavlović, and James M. Rehg. "Boosted Bayesian network classifiers." Machine Learning 73, no. 2 (2008): 155–84. http://dx.doi.org/10.1007/s10994-008-5065-7.

31

Khan, Suleiman A., Eemeli Leppäaho, and Samuel Kaski. "Bayesian multi-tensor factorization." Machine Learning 105, no. 2 (2016): 233–53. http://dx.doi.org/10.1007/s10994-016-5563-y.

32

De Sousa Ribeiro, Fabio, Francesco Calivá, Mark Swainson, Kjartan Gudmundsson, Georgios Leontidis, and Stefanos Kollias. "Deep Bayesian Self-Training." Neural Computing and Applications 32, no. 9 (2019): 4275–91. http://dx.doi.org/10.1007/s00521-019-04332-4.

33

Codetta-Raiteri, Daniele. "Editorial for the Special Issue on “Bayesian Networks: Inference Algorithms, Applications, and Software Tools”." Algorithms 14, no. 5 (2021): 138. http://dx.doi.org/10.3390/a14050138.

34

Fujimaki, Ryohei, Takehisa Yairi, and Kazuo Machida. "Sparse Bayesian Learning for Nonstationary Data Sources." Transactions of the Japanese Society for Artificial Intelligence 23 (2008): 50–57. http://dx.doi.org/10.1527/tjsai.23.50.

35

Motzek, Alexander, and Ralf Möller. "Indirect Causes in Dynamic Bayesian Networks Revisited." Journal of Artificial Intelligence Research 59 (May 27, 2017): 1–58. http://dx.doi.org/10.1613/jair.5361.

Abstract:
Modeling causal dependencies often demands cycles at a coarse-grained temporal scale. If Bayesian networks are to be used for modeling uncertainties, cycles are eliminated with dynamic Bayesian networks, spreading indirect dependencies over time and enforcing an infinitesimal resolution of time. Without a ``causal design,'' i.e., without anticipating indirect influences appropriately in time, we argue that such networks return spurious results. By identifying activator random variables, we propose activator dynamic Bayesian networks (ADBNs) which are able to rapidly adapt to contexts under a c
36

Patil, Manaswi. "Bayesian Neural Networks for Enhanced Predictive Performance in Regression Problems." International Journal for Research in Applied Science and Engineering Technology 11, no. 11 (2023): 532–35. http://dx.doi.org/10.22214/ijraset.2023.56547.

Abstract:
Bayesian statistics has a lot of influence on neural networks and deep learning for artificial intelligence (AI). The inference and learning of Bayesian statistics is based on prior, likelihood and posterior. The prior is the current belief of data field and the posterior is the updated belief after learning from observed data. By repeated learning using prior and posterior distributions, Bayesian statistics provides advanced data learning for AI. In this paper, we compare the previous Bayesian inference and learning methods for AI and propose a model based on Bayesian inference and
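
The prior-to-posterior loop this abstract describes is easiest to see in a conjugate model. The sketch below performs repeated Bayesian updating of a Beta belief about a success probability from batches of synthetic observations; it illustrates the general idea only and is not the model proposed in the paper.

```python
# Repeated Bayesian updating with a conjugate Beta-Binomial model: the
# posterior after one batch of data becomes the prior for the next batch.
# All observation counts below are synthetic.

def update(alpha: float, beta: float, successes: int, failures: int):
    """Beta(alpha, beta) prior plus binomial data gives a Beta posterior."""
    return alpha + successes, beta + failures

alpha, beta = 1.0, 1.0                 # uniform prior belief about the success rate
batches = [(7, 3), (4, 6), (9, 1)]     # (successes, failures) per synthetic batch

for i, (s, f) in enumerate(batches, start=1):
    alpha, beta = update(alpha, beta, s, f)
    mean = alpha / (alpha + beta)      # posterior mean of the success rate
    print(f"after batch {i}: Beta({alpha:.0f}, {beta:.0f}), posterior mean = {mean:.3f}")
```
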
37

KHREISAT, LAILA. "REAL TIME INFERENCE IN BAYESIAN NETWORKS: AN ANYTIME APPROACH." International Journal on Artificial Intelligence Tools 14, no. 03 (2005): 477–89. http://dx.doi.org/10.1142/s0218213005002211.

Abstract:
One of the major challenges facing real time world applications that employ Bayesian networks, is the design and development of efficient inference algorithms. In this paper we present an approximate real time inference algorithm for Bayesian Networks. The algorithm is an anytime reasoning method based on probabilistic inequalities, capable of handling fully and partially quantified Bayesian networks. In our method the accuracy of the results improve gradually as computation time increases, providing a trade-off between resource consumption and output quality. The method is tractable in provid
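
The anytime behaviour described here, with answers that improve as more computation time is granted, can be mimicked by any sampling-based approximation. The sketch below is not the paper's inequality-based method; it simply runs rejection sampling on a hypothetical two-node fault/alarm network at growing sample budgets, so the estimate can be read off at any point.

```python
# Anytime flavour of approximate inference: rejection sampling on a
# hypothetical two-node fault/alarm network. The estimate of P(fault | alarm)
# improves as the sample budget grows, and can be read off at any point.
import random

random.seed(0)
P_FAULT = 0.01           # prior P(fault)
P_ALARM = {True: 0.95,   # P(alarm | fault)
           False: 0.10}  # P(alarm | no fault)

def estimate(n_samples):
    accepted = faulty = 0
    for _ in range(n_samples):
        fault = random.random() < P_FAULT
        alarm = random.random() < P_ALARM[fault]
        if alarm:                     # keep only samples consistent with the evidence
            accepted += 1
            faulty += fault
    return faulty / accepted if accepted else float("nan")

for budget in (100, 1_000, 10_000, 100_000):
    print(f"{budget:>7} samples: P(fault | alarm) is roughly {estimate(budget):.3f}")
# The exact posterior for these numbers is about 0.088.
```
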
38

Duenas Santana, J. A. "Using Bayesian Networks for Quantifying Domino Effect Probability in a Hydrocarbon Processing Area." Petroleum & Petrochemical Engineering Journal 5, no. 3 (2021): 1–10. http://dx.doi.org/10.23880/ppej-16000274.

Abstract:
Accidents in process industries include fires, explosions, or toxic releases depending on the spilled material properties and ignition sources. One of the worst phenomena that may occur is the called domino effect. This triggers serious consequences on the people, the environment, and the economy. That is why the European Commission defined the domino effect prediction as a mandatory challenge for the years ahead. The quantification of the domino effect probability is a complex task due to the multiple and synergic effects among all accidents that should be included in the analysis. However, t
39

Erica, Pascual-Garcia, and De la Torre-Gea Guillermo. "Bayesian Analysis to the experiences of corruption through Artificial Intelligence." International Journal of Trend in Scientific Research and Development 2, no. 2 (2018): 103–7. https://doi.org/10.31142/ijtsrd2443.

Abstract:
In a democracy, the link between government and society is fundamental to prevent corruption and to ensure the functioning of mechanisms for transparency, accountability of the guaranteeing bodies and access to information. This is why the State implements public policies on transparency. However, corruption has been a phenomenon that is present in the Public Administration and has not been able to diminish with these policies. The objective of this study is to analyze the variables on the perception of corruption in society using the Bayesian method. For this, the data were obtained from the
40

Aussem, Alex. "Bayesian networks." Neurocomputing 73, no. 4-6 (2010): 561–62. http://dx.doi.org/10.1016/j.neucom.2009.11.001.

41

Halpern, J. Y. "Conditional Plausibility Measures and Bayesian Networks." Journal of Artificial Intelligence Research 14 (June 1, 2001): 359–89. http://dx.doi.org/10.1613/jair.817.

Abstract:
A general notion of algebraic conditional plausibility measures is defined. Probability measures, ranking functions, possibility measures, and (under the appropriate definitions) sets of probability measures can all be viewed as defining algebraic conditional plausibility measures. It is shown that algebraic conditional plausibility measures can be represented using Bayesian networks.
42

Ahn, Jae Joon, Hyun Woo Byun, Kyong Joo Oh, and Tae Yoon Kim. "Bayesian forecaster using class-based optimization." Applied Intelligence 36, no. 3 (2011): 553–63. http://dx.doi.org/10.1007/s10489-011-0275-2.

43

Salmani, Bahare, and Joost-Pieter Katoen. "Automatically Finding the Right Probabilities in Bayesian Networks." Journal of Artificial Intelligence Research 77 (August 27, 2023): 1637–96. http://dx.doi.org/10.1613/jair.1.14044.

Abstract:
This paper presents alternative techniques for inference on classical Bayesian networks in which all probabilities are fixed, and for synthesis problems when conditional probability tables (CPTs) in such networks contain symbolic parameters rather than concrete probabilities. The key idea is to exploit probabilistic model checking as well as its recent extension to parameter synthesis techniques for parametric Markov chains. To enable this, the Bayesian networks are transformed into Markov chains and their objectives are mapped onto probabilistic temporal logic formulas. For exact inference, w
44

Yan, Dingqi, Qi Zhou, Jianzhou Wang, and Na Zhang. "Bayesian regularisation neural network based on artificial intelligence optimisation." International Journal of Production Research 55, no. 8 (2016): 2266–87. http://dx.doi.org/10.1080/00207543.2016.1237785.

45

Nadiri, Ata Allah, Nima Chitsazan, Frank T. C. Tsai, and Asghar Asghari Moghaddam. "Bayesian Artificial Intelligence Model Averaging for Hydraulic Conductivity Estimation." Journal of Hydrologic Engineering 19, no. 3 (2014): 520–32. http://dx.doi.org/10.1061/(asce)he.1943-5584.0000824.

46

Schocken, S., and P. R. Kleindorfer. "Artificial intelligence dialects of the Bayesian belief revision language." IEEE Transactions on Systems, Man, and Cybernetics 19, no. 5 (1989): 1106–21. http://dx.doi.org/10.1109/21.44027.

47

Durante, Daniele, Sally Paganin, Bruno Scarpa, and David B. Dunson. "Bayesian modelling of networks in complex business intelligence problems." Journal of the Royal Statistical Society: Series C (Applied Statistics) 66, no. 3 (2016): 555–80. http://dx.doi.org/10.1111/rssc.12168.

48

Hanif, Ayub, and Robert Elliott Smith. "State Space Modeling & Bayesian Inference with Computational Intelligence." New Mathematics and Natural Computation 11, no. 01 (2015): 71–101. http://dx.doi.org/10.1142/s1793005715500040.

Abstract:
Recursive Bayesian estimation using sequential Monte Carlos methods is a powerful numerical technique to understand latent dynamics of nonlinear non-Gaussian dynamical systems. It enables us to reason under uncertainty and addresses shortcomings underlying deterministic systems and control theories which do not provide sufficient means of performing analysis and design. In addition, parametric techniques such as the Kalman filter and its extensions, though they are computationally efficient, do not reliably compute states and cannot be used to learn stochastic problems. We review recursive Bay
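
The recursive Bayesian estimation with sequential Monte Carlo reviewed in this entry reduces, in its simplest bootstrap form, to a predict-weight-resample loop. Below is a minimal sketch for a one-dimensional random-walk state observed in Gaussian noise; the model, its parameters, and the data are all synthetic.

```python
# Minimal bootstrap particle filter: a 1-D random-walk state x_t observed as
# y_t = x_t + Gaussian noise. Model parameters and data are synthetic.
import math
import random

random.seed(1)
PROCESS_STD, OBS_STD, N_PARTICLES = 0.5, 1.0, 500

def step(particles, y):
    """One predict / weight / resample cycle of the bootstrap filter."""
    # Predict: propagate each particle through the random-walk transition model.
    particles = [x + random.gauss(0.0, PROCESS_STD) for x in particles]
    # Weight each particle by the Gaussian likelihood of the observation y.
    weights = [math.exp(-0.5 * ((y - x) / OBS_STD) ** 2) for x in particles]
    # Resample particles in proportion to their weights (multinomial resampling).
    return random.choices(particles, weights=weights, k=len(particles))

true_x = 0.0
particles = [random.gauss(0.0, 2.0) for _ in range(N_PARTICLES)]
for t in range(1, 6):
    true_x += random.gauss(0.0, PROCESS_STD)          # simulate the hidden state
    y = true_x + random.gauss(0.0, OBS_STD)           # simulate a noisy observation
    particles = step(particles, y)
    estimate = sum(particles) / len(particles)        # posterior-mean estimate
    print(f"t={t}: true x = {true_x:+.2f}, filtered estimate = {estimate:+.2f}")
```
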
49

Imazawa, Kei, and Yoshiteru Katsumura. "Field Failure Prediction using Bayesian Network." Transactions of the Japanese Society for Artificial Intelligence 31, no. 2 (2016): L-D43_1–9. http://dx.doi.org/10.1527/tjsai.l-d43.

50

Ribeiro, Vitor P., Luiz Desuó Neto, Patricia A. A. Marques, et al. "A Stochastic Bayesian Artificial Intelligence Framework to Assess Climatological Water Balance under Missing Variables for Evapotranspiration Estimates." Agronomy 13, no. 12 (2023): 2970. http://dx.doi.org/10.3390/agronomy13122970.

Abstract:
The sustainable use of water resources is of utmost importance given climatological changes and water scarcity, alongside the many socioeconomic factors that rely on clean water availability, such as food security. In this context, developing tools to minimize water waste in irrigation is paramount for sustainable food production. The evapotranspiration estimate is a tool to evaluate the water volume required to achieve optimal crop yield with the least amount of water waste. The Penman-Monteith equation is the gold standard for this task, despite it becoming inapplicable if any of its require
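
For context on this last entry: the Penman–Monteith "gold standard" it refers to is, in the standard FAO-56 formulation (reproduced here from common usage, not from the paper itself),

```latex
ET_0 = \frac{0.408\,\Delta\,(R_n - G) + \gamma\,\dfrac{900}{T + 273}\,u_2\,(e_s - e_a)}
            {\Delta + \gamma\,(1 + 0.34\,u_2)}
```

where ET_0 is the reference evapotranspiration (mm/day), Δ the slope of the saturation vapour pressure curve, R_n the net radiation, G the soil heat flux, γ the psychrometric constant, T the mean air temperature, u_2 the wind speed at 2 m, and e_s − e_a the vapour pressure deficit. The framework in the paper addresses the case where some of these inputs are missing.
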