Journal articles on the topic 'Logic, paraconsistency, inconsistent information'

To see the other types of publications on this topic, follow the link: Logic, paraconsistency, inconsistent information.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Logic, paraconsistency, inconsistent information.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Gao, Tiantian, Paul Fodor, and Michael Kifer. "Paraconsistency and Word Puzzles." Theory and Practice of Logic Programming 16, no. 5-6 (September 2016): 703–20. http://dx.doi.org/10.1017/s1471068416000326.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Word puzzles and the problem of their representations in logic languages have received considerable attention in the last decade (Ponnuru et al. 2004; Shapiro 2011; Baral and Dzifcak 2012; Schwitter 2013). Of special interest is the problem of generating such representations directly from natural language (NL) or controlled natural language (CNL). An interesting variation of this problem, and, to the best of our knowledge, a scarcely explored one in this context, is when the input information is inconsistent. In such situations, the existing encodings of word puzzles produce inconsistent representations and break down. In this paper, we bring the well-known type of paraconsistent logics, called Annotated Predicate Calculus (APC) (Kifer and Lozinskii 1992), to bear on the problem. We introduce a new kind of non-monotonic semantics for APC, called consistency preferred stable models, and argue that it makes APC into a suitable platform for dealing with inconsistency in word puzzles and, more generally, in NL sentences. We also devise a number of general principles to help the user choose among the different representations of NL sentences, which might seem equivalent but, in fact, behave differently when inconsistent information is taken into account. These principles can be incorporated into existing CNL translators, such as Attempto Controlled English (ACE) (Fuchs et al. 2008) and PENG Light (White and Schwitter 2009). Finally, we show that APC with the consistency preferred stable model semantics can be equivalently embedded in ASP with preferences over stable models, and we use this embedding to implement this version of APC in Clingo (Gebser et al. 2011) and its Asprin add-on (Brewka et al. 2015).
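
The model-selection idea behind consistency preferred stable models can be illustrated with a small, self-contained sketch. The Python below is an assumed toy illustration, not the paper's APC encoding or its Clingo/Asprin implementation: from candidate models given as sets of annotated literals, it keeps those whose set of inconsistently annotated atoms is subset-minimal.

```python
# Toy sketch: prefer candidate models whose set of inconsistent atoms
# (atoms annotated both 't' and 'f') is subset-minimal.
# Illustrative only; this is not the APC/Clingo encoding from the paper.

def inconsistent_atoms(model):
    """Atoms that occur with both a 't' and an 'f' annotation in the model."""
    return {a for (a, v) in model if v == 't'} & {a for (a, v) in model if v == 'f'}

def consistency_preferred(models):
    """Keep the models whose inconsistent-atom set is subset-minimal."""
    incs = {frozenset(m): inconsistent_atoms(m) for m in models}
    return [set(m) for m in incs
            if not any(incs[n] < incs[m] for n in incs if n != m)]

# The second candidate resolves the conflict on 'rain' and is preferred.
m1 = {('rain', 't'), ('rain', 'f'), ('wet', 't')}
m2 = {('rain', 't'), ('wet', 't')}
print(consistency_preferred([m1, m2]))
```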
2

Kamide, Norihiro. "Inconsistency-Tolerant Multi-Agent Calculus." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 22, no. 06 (December 2014): 815–29. http://dx.doi.org/10.1142/s0218488514500433.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Verifying and specifying multi-agent systems in an appropriate inconsistency-tolerant logic are of growing importance in Computer Science since computer systems are generally used by or composed of inconsistency-tolerant multi-agents. In this paper, an inconsistency-tolerant logic for representing multi-agents is introduced as a Gentzen-type sequent calculus. This logic (or calculus) has multiple negation connectives that correspond to each agent, and these negation connectives have the property of paraconsistency that guarantees inconsistency-tolerance. The logic proposed is regarded as a modified generalization of trilattice logics, which are known to be useful for expressing fine-grained truth-values in computer networks. The completeness, cut-elimination and decidability theorems for the proposed logic (or sequent calculus) are proved as the main results of this paper.
3

Cocos, Cristian, Fahim Imam, and Wendy MacCaull. "Ontology Merging and Reasoning Using Paraconsistent Logics." International Journal of Knowledge-Based Organizations 2, no. 4 (October 2012): 35–51. http://dx.doi.org/10.4018/ijkbo.2012100103.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Dealing with the inconsistencies that might arise during the ontology merging process constitutes a major challenge. The explosive nature of classical logic requires any logic-based merging effort to dissolve possible contradictions, and thus maintain consistency. In many cases, however, inconsistent information may be useful for intelligent reasoning activities. In healthcare systems, for example, inconsistent information may be required to provide a full clinical perspective, and thus any information loss is undesirable. The authors present a 4-valued logic-based merging system that exhibits inconsistency-tolerant behavior to avoid information loss.
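
Inconsistency-tolerant merging of this kind is commonly modelled with Belnap's four truth values (True, False, Both, Neither). As a hedged illustration of that general idea only (the authors' 4-valued merging system is not reproduced here), the sketch below merges two sources by taking the knowledge-order join of their evidence, so conflicting reports yield Both instead of an explosion.

```python
from functools import reduce

# Minimal Belnap-style merge (assumed encoding, not the authors' system).
# Evidence about an atom is a pair of flags: (told_true, told_false).
NEITHER, TRUE, FALSE, BOTH = (0, 0), (1, 0), (0, 1), (1, 1)

def k_join(u, v):
    """Knowledge-order join: accumulate the evidence from both sources."""
    return (max(u[0], v[0]), max(u[1], v[1]))

def merge(*sources):
    atoms = set().union(*sources)
    return {a: reduce(k_join, (s.get(a, NEITHER) for s in sources)) for a in atoms}

src1 = {'allergy(penicillin)': TRUE}
src2 = {'allergy(penicillin)': FALSE, 'diabetic': TRUE}
print(merge(src1, src2))   # the conflicting atom ends up as BOTH, i.e. (1, 1)
```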
4

Grooters, Diana, and Henry Prakken. "Two Aspects of Relevance in Structured Argumentation: Minimality and Paraconsistency." Journal of Artificial Intelligence Research 56 (June 15, 2016): 197–245. http://dx.doi.org/10.1613/jair.5058.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This paper studies two issues concerning relevance in structured argumentation in the context of the ASPIC+ framework, arising from the combined use of strict and defeasible inference rules. One issue arises if the strict inference rules correspond to classical logic. A longstanding problem is how the trivialising effect of the classical Ex Falso principle can be avoided while satisfying consistency and closure postulates. In this paper, this problem is solved by disallowing chaining of strict rules, resulting in a variant of the ASPIC+ framework called ASPIC*, and then disallowing the application of strict rules to inconsistent sets of formulas. Thus, in effect, Rescher & Manor's paraconsistent notion of weak consequence is embedded in ASPIC*. Another issue is minimality of arguments. If arguments can apply defeasible inference rules, then they cannot be required to have subset-minimal premises, since defeasible rules based on more information may well make an argument stronger. In this paper, minimality is instead required of applications of strict rules throughout an argument. It is shown that under some plausible assumptions this does not affect the set of conclusions. In addition, circular arguments are excluded in the new ASPIC* framework in a way that satisfies closure and consistency postulates and that generates finitary argumentation frameworks if the knowledge base and set of defeasible rules are finite. For the latter result the exclusion of chaining of strict rules is essential. Finally, the combined results of this paper are shown to be a proper extension of classical-logic argumentation with preferences and defeasible rules.
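
For reference, Rescher and Manor's weak consequence relation mentioned above admits a one-line statement (standard textbook formulation, restated here rather than quoted from the paper):

```latex
% Weak consequence over a possibly inconsistent base \Gamma:
\Gamma \mathrel{|\!\!\sim_{w}} \varphi
  \quad\Longleftrightarrow\quad
  \exists\,\Delta \subseteq \Gamma :\ \Delta \text{ is a maximal consistent subset of } \Gamma
  \ \text{ and }\ \Delta \vdash \varphi .
```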
5

Schwind, Nicolas, Sébastien Konieczny, and Ramón Pino Pérez. "On Paraconsistent Belief Revision in LP." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 5 (June 28, 2022): 5879–87. http://dx.doi.org/10.1609/aaai.v36i5.20532.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Belief revision aims at incorporating, in a rational way, a new piece of information into the beliefs of an agent. Most works in belief revision suppose a classical logic setting, where the beliefs of the agent are consistent. Moreover, the consistency postulate states that the result of the revision should be consistent if the new piece of information is consistent. But in real applications it may easily happen that (some parts of) the beliefs of the agent are not consistent. In this case then it seems reasonable to use paraconsistent logics to derive sensible conclusions from these inconsistent beliefs. However, in this context, the standard belief revision postulates trivialize the revision process. In this work we discuss how to adapt these postulates when the underlying logic is Priest's LP logic, in order to model a rational change, while being a conservative extension of AGM/KM belief revision. This implies, in particular, to adequately adapt the notion of expansion. We provide a representation theorem and some examples of belief revision operators in this setting.
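
Priest's LP is a three-valued logic over {t, b, f} in which t and b are designated, and its paraconsistency is what forces the adaptation of the revision postulates. A brief sketch (standard LP truth tables, assumed here for illustration; not code from the paper) shows why: a glut p ∧ ¬p can be designated without an arbitrary q following.

```python
# Standard LP (Logic of Paradox) connectives over {t, b, f}; t and b are designated.
ORDER = {'f': 0, 'b': 1, 't': 2}
NEG = {'t': 'f', 'b': 'b', 'f': 't'}

def conj(x, y):
    """Conjunction is the minimum in the truth order f < b < t."""
    return min(x, y, key=ORDER.get)

def designated(x):
    return x in ('t', 'b')

v = {'p': 'b', 'q': 'f'}                 # a glut on p, q plainly false
premise = conj(v['p'], NEG[v['p']])      # p and not-p
print(premise, designated(premise))      # 'b' True  -> the contradiction holds
print(v['q'], designated(v['q']))        # 'f' False -> q still does not follow
```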
6

Bagai, Rajiv, and Rajshekhar Sunderraman. "Computing the Well-Founded Model of Deductive Databases." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 04, no. 02 (April 1996): 157–75. http://dx.doi.org/10.1142/s021848859600010x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The well-founded model is one of the most popular models of general logic programs, i.e. logic programs with negation in the bodies of clauses. We present a method for constructing this model for general deductive databases, which are logic programs without any function symbols. The method adopts paraconsistent relations as the semantic objects associated with the predicate symbols of the database. Paraconsistent relations are a generalization of ordinary relations in that they allow manipulation of incomplete as well as inconsistent information. The first step in the model construction method is to transform the database clauses into paraconsistent relation definitions involving algebraic operators on these relations. The second step is to build the well-founded model iteratively. Algorithms for both steps are presented and their termination and correctness are also established.
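
The data structure at the heart of this construction is easy to picture: a paraconsistent relation records positive and negative tuples separately, so a tuple can be true, false, both, or unknown. The sketch below is a minimal illustration of that idea (the class name and the union operator are assumptions for exposition, not the paper's algebra).

```python
# Minimal sketch of a paraconsistent relation: a pair of tuple sets.
# A tuple in both sets is inconsistent; a tuple in neither is unknown.
class ParaRelation:
    def __init__(self, pos=(), neg=()):
        self.pos, self.neg = set(pos), set(neg)

    def union(self, other):
        """Pool positive and negative evidence from both relations."""
        return ParaRelation(self.pos | other.pos, self.neg | other.neg)

    def status(self, t):
        if t in self.pos and t in self.neg:
            return 'inconsistent'
        return 'true' if t in self.pos else 'false' if t in self.neg else 'unknown'

r1 = ParaRelation(pos={('ann', 'cs101')}, neg={('bob', 'cs101')})
r2 = ParaRelation(pos={('bob', 'cs101')})
print(r1.union(r2).status(('bob', 'cs101')))   # 'inconsistent'
print(r1.union(r2).status(('eve', 'cs101')))   # 'unknown'
```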
7

Hunter, Anthony. "Reasoning with inconsistency in structured text." Knowledge Engineering Review 15, no. 4 (December 2000): 317–37. http://dx.doi.org/10.1017/s0269888900002046.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Reasoning with inconsistency involves some compromise on classical logic. There is a range of proposals for logics (called paraconsistent logics) for reasoning with inconsistency each with pros and cons. Selecting an appropriate paraconsistent logic for an application depends upon the requirements of the application. Here we review paraconsistent logics for the potentially significant application area of technology for structured text. Structured text is a general concept that is implicit in a variety of approaches to handling information. Syntactically, an item of structured text is a number of grammatically simple phrases together with a semantic label for each phrase. Items of structured text may be nested within larger items of structured text. The semantic labels in a structured text are meant to parameterize a stereotypical situation, and so a particular item of structured text is an instance of that stereotypical situation. Much information is potentially available as structured text, including tagged text in XML, text in relational and object-oriented databases, and the output from information extraction systems in the form of instantiated templates. In this review paper, we formalize the concept of structured text, and then focus on how we can identify inconsistency in items of structured text, and reason with these inconsistencies. Then we review key approaches to paraconsistent reasoning, and discuss the application of them to reasoning with inconsistency in structured text.
8

Avron, Arnon, and Anna Zamansky. "Paraconsistency, self-extensionality, modality." Logic Journal of the IGPL 28, no. 5 (November 27, 2018): 851–80. http://dx.doi.org/10.1093/jigpal/jzy064.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Paraconsistent logics are logics that, in contrast to classical and intuitionistic logic, do not trivialize inconsistent theories. In this paper we take a paraconsistent view on two famous modal logics: B and S5. We use for this a well-known general method for turning modal logics to paraconsistent logics by defining a new (paraconsistent) negation as $\neg \varphi =_{Def} \sim \Box \varphi$ (where $\sim$ is the classical negation). We show that while that makes both B and S5 members of the well-studied family of paraconsistent C-systems, they differ from most other C-systems in having the important replacement property (which means that equivalence of formulas implies their congruence). We further show that B is a very robust C-system in the sense that almost any axiom which has been considered in the context of C-systems is either already a theorem of B or its addition to B leads to a logic that is no longer paraconsistent. There is exactly one notable exception, and the result of adding this exception to B leads to the other logic studied here, S5.
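
The behaviour of the defined negation can be checked on a tiny Kripke model. The following sketch is an assumed illustration (an S5-style model with a universal accessibility relation), not the authors' apparatus: at a world where p holds but p fails elsewhere, both p and its defined negation hold, yet an unrelated q is not forced, so inconsistent theories do not trivialize.

```python
# Tiny S5-style Kripke model: accessibility is the universal relation on worlds.
worlds = ['w1', 'w2']
val = {'w1': {'p'}, 'w2': set()}          # p holds only at w1; q holds nowhere

def atom(name):
    return lambda w: name in val[w]

def box(pred):
    """'Necessarily pred': pred holds at every (universally accessible) world."""
    return lambda w: all(pred(u) for u in worlds)

def paraneg(pred):
    """The defined paraconsistent negation: not-necessarily-pred."""
    return lambda w: not box(pred)(w)

p = atom('p')
print(p('w1') and paraneg(p)('w1'))   # True: p and its negation cohold at w1
print(atom('q')('w1'))                # False: the glut does not force q
```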
9

Dubois, Didier, and Henri Prade. "Inconsistency Management from the Standpoint of Possibilistic Logic." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 23, Suppl. 1 (December 2015): 15–30. http://dx.doi.org/10.1142/s0218488515400024.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Uncertainty and inconsistency pervade human knowledge. Possibilistic logic, where propositional logic formulas are associated with lower bounds of a necessity measure, handles uncertainty in the setting of possibility theory. Moreover, central in standard possibilistic logic is the notion of inconsistency level of a possibilistic logic base, closely related to the notion of consistency degree of two fuzzy sets introduced by L. A. Zadeh. Formulas whose weight is strictly above this inconsistency level constitute a sub-base free of any inconsistency. However, several extensions, allowing for a paraconsistent form of reasoning, or associating possibilistic logic formulas with information sources or subsets of agents, or extensions involving other possibility theory measures, provide other forms of inconsistency, while enlarging the representation capabilities of possibilistic logic. The paper offers a structured overview of the various forms of inconsistency that can be accommodated in possibilistic logic. This overview echoes the rich representation power of the possibility theory framework.
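
The inconsistency level referred to here has a compact standard definition, restated below for orientation (this is the textbook formulation of possibilistic logic, not a quotation from the paper). Formulas whose weight lies strictly above this level form a consistent sub-base.

```latex
% Inconsistency level of a possibilistic base \Sigma = \{(\varphi_i, \alpha_i)\}:
\mathrm{Inc}(\Sigma) \;=\; \max\bigl\{\alpha \;\bigm|\; \Sigma_{\geq \alpha} \vdash \bot \bigr\},
\qquad
\Sigma_{\geq \alpha} \;=\; \{\varphi_i \mid (\varphi_i, \alpha_i) \in \Sigma,\ \alpha_i \geq \alpha\},
% with \mathrm{Inc}(\Sigma) = 0 when no such \alpha exists (the base is consistent).
```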
10

Nakayama, Yotaro, Seiki Akama, and Tetsuya Murai. "Bilattice Logic for Rough Sets." Journal of Advanced Computational Intelligence and Intelligent Informatics 24, no. 6 (November 20, 2020): 774–84. http://dx.doi.org/10.20965/jaciii.2020.p0774.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Rough set theory is studied to manage uncertain and inconsistent information. Because Pawlak’s decision logic for rough sets is based on the classical two-valued logic, it is inconvenient for handling inconsistent information. We propose a bilattice logic as the deduction basis for the decision logic of rough sets to address inconsistent and ambiguous information. To enhance the decision logic to bilattice semantics, we introduce Variable Precision Rough Set (VPRS). As a deductive basis for bilattice decision logic, we define a consequence relation for Belnap’s four-valued semantics and provide a bilattice semantic tableau TB4 for a deduction system. We demonstrate the soundness and completeness of TB4 and enhance it with weak negation.
11

Nieves, Juan Carlos, Mauricio Osorio, and Ulises Cortés. "Semantics for Possibilistic Disjunctive Programs." Theory and Practice of Logic Programming 13, no. 1 (July 28, 2011): 33–70. http://dx.doi.org/10.1017/s1471068411000408.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In this paper, a possibilistic disjunctive logic programming approach for modeling uncertain, incomplete, and inconsistent information is defined. This approach introduces the use of possibilistic disjunctive clauses, which are able to capture incomplete information and states of a knowledge base at the same time. By considering a possibilistic logic program as a possibilistic logic theory, a construction of a possibilistic logic programming semantics based on answer sets and the proof theory of possibilistic logic is defined. It is shown that this possibilistic semantics for disjunctive logic programs can be characterized by a fixed-point operator. It is also shown that the suggested possibilistic semantics can be computed by a resolution algorithm and the consideration of optimal refutations from a possibilistic logic theory. In order to manage inconsistent possibilistic logic programs, a preference criterion between inconsistent possibilistic models is defined. In addition, the approach of cuts for restoring consistency of an inconsistent possibilistic knowledge base is adopted. The approach is illustrated in a medical scenario.
12

Bien, Zeungnam, and Wonseek Yu. "Extracting core information from inconsistent fuzzy control rules." Fuzzy Sets and Systems 71, no. 1 (April 1995): 95–111. http://dx.doi.org/10.1016/0165-0114(94)00191-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Шапиро, Ольга Александровна. "IS A “QUANTUM” THINKER INCONSISTENT?" Логико-философские штудии, no. 2 (September 24, 2022): 238–49. http://dx.doi.org/10.52119/lphs.2022.59.96.016.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Who is a "quantum" thinker? This is a person formed under the dominance of clip culture. His way of thinking differs markedly from that of half a century ago and represents the thinking style of the modern era. Among the features of this style are the constant switching of attention between different blocks of information and a specific kind of paraconsistency. But is the quantum thinker really inclined to contravene the logical law of contradiction? The analysis of his reasoning and beliefs undertaken in the article shows that this is not entirely true. Drawing on the ideas of Aristotle, as well as their interpretation by J. Łukasiewicz and Ya. A. Slinin, the author concludes that a quantum thinker, like any other, strives to avoid internal inconsistency among his ideas; however, he does this not by rejecting some beliefs in favor of others but by distinguishing the contexts of these beliefs in such a way that it becomes impossible to say that they are held at the same time and in the same respect.
14

Battigalli, Pierpaolo, and Giacomo Bonanno. "The Logic of Belief Persistence." Economics and Philosophy 13, no. 1 (April 1997): 39–59. http://dx.doi.org/10.1017/s0266267100004296.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The principle of belief persistence, or conservativity principle, states that ‘When changing beliefs in response to new evidence, you should continue to believe as many of the old beliefs as possible’ (Harman, 1986, p. 46). In particular, this means that if an individual gets new information, she has to accommodate it in her new belief set (the set of propositions she believes), and, if the new information is not inconsistent with the old belief set, then (1) the individual has to maintain all the beliefs she previously had and (2) the change should be minimal in the sense that every proposition in the new belief set must be deducible from the union of the old belief set and the new information (see, e.g., Gärdenfors, 1988; Stalnaker, 1984). We focus on this minimal notion of belief persistence and characterize it both semantically and syntactically.
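
In AGM-style notation the two conditions described above say that revision collapses to expansion whenever the new information is consistent with the old beliefs (a standard restatement, not the authors' exact axioms):

```latex
% If K \cup \{\phi\} is consistent, belief persistence requires:
% (1) old beliefs are kept:   K \subseteq K \ast \phi
% (2) the change is minimal:  K \ast \phi \subseteq \mathrm{Cn}(K \cup \{\phi\})
% hence, together:
K \cup \{\phi\} \not\vdash \bot
  \;\Longrightarrow\;
  K \ast \phi \;=\; \mathrm{Cn}\bigl(K \cup \{\phi\}\bigr).
```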
15

Subrahmanian, V. S. "Y-Logic: A Framework for Reasoning About Chameleonic Programs with Inconsistent Completions." Fundamenta Informaticae 13, no. 4 (October 1, 1990): 465–83. http://dx.doi.org/10.3233/fi-1990-13405.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Large logic programs are normally designed by teams of individuals, each of whom designs a subprogram. While each of these subprograms may have consistent completions, the logic program obtained by taking the union of these subprograms may not. However, the resulting program still serves a useful purpose, for a (possibly) very large subset of it still has a consistent completion. We argue that “small” inconsistencies may cause a logic program to have no models (in the traditional sense), even though it still serves some useful purpose. A semantics is developed in this paper for general logic programs which ascribes a very reasonable meaning to general logic programs irrespective of whether they have consistent (in the classical logic sense) completions.
16

Arieli, O., M. Denecker, B. Van Nuffelen, and M. Bruynooghe. "Coherent Integration of Databases by Abductive Logic Programming." Journal of Artificial Intelligence Research 21 (March 1, 2004): 245–86. http://dx.doi.org/10.1613/jair.1322.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Abstract: We introduce an abductive method for a coherent integration of independent data-sources. The idea is to compute a list of data-facts that should be inserted into the amalgamated database or retracted from it in order to restore its consistency. This method is implemented by an abductive solver, called Asystem, that applies SLDNFA-resolution on a meta-theory that relates different, possibly contradicting, input databases. We also give a pure model-theoretic analysis of the possible ways to `recover' consistent data from an inconsistent database in terms of those models of the database that contain as little inconsistent information as reasonably possible. This allows us to characterize the `recovered databases' in terms of the `preferred' (i.e., most consistent) models of the theory. The outcome is an abductive-based application that is sound and complete with respect to a corresponding model-based, preferential semantics, and -- to the best of our knowledge -- is more expressive (thus more general) than any other implementation of coherent integration of databases.
17

Tahara, Ikuo, and Shiho Nobesawa. "Three-valued logic for reasoning from an inconsistent knowledge base." Systems and Computers in Japan 37, no. 14 (2006): 44–51. http://dx.doi.org/10.1002/scj.20539.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Brenner, Joseph E. "Information in Reality: Logic and Metaphysics." tripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society 9, no. 2 (October 30, 2011): 332–41. http://dx.doi.org/10.31269/triplec.v9i2.282.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The recent history of information theory and science shows a trend in emphasis from quantitative measures to qualitative characterizations. In parallel, aspects of information are being developed, for example by Pedro Marijuan, Wolfgang Hofkirchner and others that are extending the notion of qualitative, non-computational information in the biological and cognitive domain to include meaning and function. However, there is as yet no consensus on whether a single acceptable definition or theory of the concept of information is possible, leading to many attempts to view it as a complex, a notion with varied meanings or a group of different entities. In my opinion, the difficulties in developing a Unified Theory of Information (UTI) that would include its qualitative and quantitative aspects and their relation to meaning are a consequence of implicit or explicit reliance on the principles of standard, truth-functional bivalent or multivalent logics. In reality, information processes, like those of time, change and human consciousness, are contradictory: they are regular and irregular; consistent and inconsistent; continuous and discontinuous. Since the indicated logics cannot accept real contradictions, they have been incapable of describing the multiple but interrelated characteristics of information. The framework for the discussion of information in this paper will be the new extension of logic to real complex processes that I have made, Logic in Reality (LIR), which is grounded in the dualities and self-dualities of quantum physics and cosmology. LIR provides, among other things, new interpretations of the most fundamental metaphysical questions present in discussions of information at physical, biological and cognitive levels of reality including, especially, those of time, continuity vs. discontinuity, and change, both physical and epistemological. I show that LIR can constitute a novel and general approach to the non-binary properties of information, including meaning and value. These properties subsume the notion of semantic information as well-formed, meaningful and truthful data as proposed most recently by Luciano Floridi. LIR supports the concept of ‘biotic’ information of Stuart Kauffman, Robert Logan and their colleagues and that of meaningful information developed by Christophe Menant. Logic in Reality does not pretend to the level of rigor of an experimental or mathematical theory. It is proposed as a methodology to assist in achieving a minimum scientific legitimacy for a qualitative theory of information. My hope is that by seeing information, meaning and knowledge as dynamic processes, evolving according to logical rules in my extended sense of logic, some of the on-going issues on the nature and function of information may be clarified.
19

Brenner, Joseph E. "Information in Reality: Logic and Metaphysics." tripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society 9, no. 2 (October 30, 2011): 332–41. http://dx.doi.org/10.31269/vol9iss2pp332-341.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The recent history of information theory and science shows a trend in emphasis from quantitative measures to qualitative characterizations. In parallel, aspects of information are being developed, for example by Pedro Marijuan, Wolfgang Hofkirchner and others that are extending the notion of qualitative, non-computational information in the biological and cognitive domain to include meaning and function. However, there is as yet no consensus on whether a single acceptable definition or theory of the concept of information is possible, leading to many attempts to view it as a complex, a notion with varied meanings or a group of different entities. In my opinion, the difficulties in developing a Unified Theory of Information (UTI) that would include its qualitative and quantitative aspects and their relation to meaning are a consequence of implicit or explicit reliance on the principles of standard, truth-functional bivalent or multivalent logics. In reality, information processes, like those of time, change and human consciousness, are contradictory: they are regular and irregular; consistent and inconsistent; continuous and discontinuous. Since the indicated logics cannot accept real contradictions, they have been incapable of describing the multiple but interrelated characteristics of information. The framework for the discussion of information in this paper will be the new extension of logic to real complex processes that I have made, Logic in Reality (LIR), which is grounded in the dualities and self-dualities of quantum physics and cosmology. LIR provides, among other things, new interpretations of the most fundamental metaphysical questions present in discussions of information at physical, biological and cognitive levels of reality including, especially, those of time, continuity vs. discontinuity, and change, both physical and epistemological. I show that LIR can constitute a novel and general approach to the non-binary properties of information, including meaning and value. These properties subsume the notion of semantic information as well-formed, meaningful and truthful data as proposed most recently by Luciano Floridi. LIR supports the concept of ‘biotic’ information of Stuart Kauffman, Robert Logan and their colleagues and that of meaningful information developed by Christophe Menant. Logic in Reality does not pretend to the level of rigor of an experimental or mathematical theory. It is proposed as a methodology to assist in achieving a minimum scientific legitimacy for a qualitative theory of information. My hope is that by seeing information, meaning and knowledge as dynamic processes, evolving according to logical rules in my extended sense of logic, some of the on-going issues on the nature and function of information may be clarified.
20

Eiter, Thomas, Michael Fink, and Daria Stepanova. "Computing Repairs of Inconsistent DL-Programs over EL Ontologies." Journal of Artificial Intelligence Research 56 (July 27, 2016): 463–515. http://dx.doi.org/10.1613/jair.5047.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Description Logic (DL) ontologies and non-monotonic rules are two prominent Knowledge Representation (KR) formalisms with complementary features that are essential for various applications. Nonmonotonic Description Logic (DL) programs combine these formalisms, thus providing support for rule-based reasoning on top of DL ontologies using a well-defined query interface represented by so-called DL-atoms. Unfortunately, interaction of the rules and the ontology may incur inconsistencies such that a DL-program lacks answer sets (i.e., models), and thus yields no information. This issue is addressed by recently defined repair answer sets, for computing which an effective practical algorithm was proposed for DL-LiteA ontologies that reduces a repair computation to constraint matching based on so-called support sets. However, the algorithm exploits particular features of DL-LiteA and cannot be readily applied to repairing DL-programs over other prominent DLs like EL: compared to DL-LiteA, in EL support sets may be neither small nor few in number, and completeness of the algorithm may need to be given up when the support information is bounded. We thus provide an approach for computing repairs for DL-programs over EL ontologies based on partial (incomplete) support families. The latter are constructed using datalog query rewriting techniques as well as ontology approximation based on the logical difference between EL-terminologies. We show how the maximal size and number of support sets for a given DL-atom can be estimated by analyzing the properties of a support hypergraph, which characterizes a relevant set of TBox axioms needed for query derivation. We present a declarative implementation of the repair approach and experimentally evaluate it on a set of benchmark problems; the promising results witness the practical feasibility of our repair approach.
21

Martínez Monterrubio, Sergio Mauricio, Juan Frausto Solis, and Raúl Monroy Borja. "EMRlog Method for Computer Security for Electronic Medical Records with Logic and Data Mining." BioMed Research International 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/542016.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The proper functioning of a hospital computer system is an arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of the entire or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied by using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed for achieving a safer computer system.
22

Lempp, Frieder. "A logic-based model for resolving conflicts." International Journal of Conflict Management 27, no. 1 (February 8, 2016): 116–39. http://dx.doi.org/10.1108/ijcma-11-2014-0081.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Purpose – The purpose of this paper is to explore the extent to which formal logic can be applied to conflict analysis and resolution. It is motivated by the idea that conflicts can be understood as inconsistent sets of interests. Design/methodology/approach – A simple conflict model based on propositional logic, which can be used to analyze conflicts, is introduced, and four algorithms are presented to generate possible solutions to a conflict. The model is illustrated by applying it to the conflict between the Obama administration and the Syrian Government in September 2013 over the destruction of Syria’s chemical weapons programme. Findings – The author shows how different solutions, such as compromises, minimally invasive solutions or solutions compatible with certain pre-defined norms, can be generated by the model. It is shown how the model can operate in situations where the game-theoretic model fails due to a lack of information about the parties’ utility values. Research limitations/implications – The model can be used as a theoretical framework for future experimental research and/or to trace the course of particular conflict scenarios. Practical implications – The model can be used as the basis for building software applications for conflict resolution practitioners, such as negotiators or mediators. Originality/value – While the idea of using logic to analyse the structure of conflicts and generate possible solutions is not new to the field of conflict studies, the model presented in this paper provides a novel way of understanding conflicts for both researchers and practitioners.
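
The core idea, that a conflict is an inconsistent set of interests and candidate resolutions are its consistent subsets, can be sketched at the propositional level. The code below is an illustrative assumption, not the paper's model or algorithms: interests are literals, and the candidate compromises are the maximal subsets containing no literal together with its negation.

```python
from itertools import combinations

def consistent(lits):
    """No literal occurs together with its negation (negation marked by '-')."""
    return not any(('-' + l if not l.startswith('-') else l[1:]) in lits for l in lits)

def maximal_compromises(interests):
    """Maximal consistent subsets of a set of literal-level interests."""
    best = []
    for k in range(len(interests), 0, -1):
        for combo in combinations(sorted(interests), k):
            s = set(combo)
            if consistent(s) and not any(s < b for b in best):
                best.append(s)
    return best

# Party A wants inspections and sanctions lifted; party B rejects inspections.
interests = {'inspections', 'lift_sanctions', '-inspections'}
print(maximal_compromises(interests))
# two maximal compromises: keep lift_sanctions with either stance on inspections
```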
23

Brenner, Joseph. "The Naturalization of Natural Philosophy." Philosophies 3, no. 4 (November 24, 2018): 41. http://dx.doi.org/10.3390/philosophies3040041.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
A new demarcation is proposed between Natural Philosophy and non-Natural Philosophy—philosophy tout court—based on whether or not they follow a non-standard logic of real processes. This non-propositional logic, Logic in Reality (LIR), is based on the original work of the Franco-Romanian thinker Stéphane Lupasco (Bucharest, 1900–Paris, 1988). Many Natural Philosophies remain bounded by dependence on binary linguistic concepts of logic. I claim that LIR can naturalize—bring into science—part of such philosophies. Against the potential objection that my approach blurs the distinction between science and philosophy, I reply that there is no problem in differentiating experimental physical science and philosophy; any complete distinction between philosophy, including the philosophy of science(s) and the other sciences is invidious. It was historically unnecessary and is unnecessary today. The convergence of science and philosophy, proposed by Wu Kun based on implications of the philosophy of information, supports this position. LIR provides a rigorous basis for giving equivalent ontological value to diversity and identity, what is contradictory, inconsistent, absent, missing or past, unconscious, incomplete, and fuzzy as to their positive counterparts. The naturalized Natural Philosophy resulting from the application of these principles is a candidate for the ‘new synthesis’ called for by the editors.
24

Caroprese, Luciano, and Ester Zumpano. "A Logic Framework for P2P Deductive Databases." Theory and Practice of Logic Programming 20, no. 1 (June 19, 2019): 1–43. http://dx.doi.org/10.1017/s1471068419000073.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This paper presents a logic framework for modeling the interaction among deductive databases in a peer-to-peer (P2P) environment. Each peer joining a P2P system provides or imports data from its neighbors by using a set of mapping rules, that is, a set of semantic correspondences to a set of peers belonging to the same environment. By using mapping rules, as soon as it enters the system, a peer can participate and access all data available in its neighborhood, and through its neighborhood it becomes accessible to all the other peers in the system. A query can be posed to any peer in the system and the answer is computed by using locally stored data and all the information that can be consistently imported from the neighborhood. Two different types of mapping rules are defined: mapping rules allowing to import a maximal set of atoms not leading to inconsistency (called maximal mapping rules) and mapping rules allowing to import a minimal set of atoms needed to restore consistency (called minimal mapping rules). Implicitly, the use of maximal mapping rules states that it is preferable to import as long as no inconsistencies arise, whereas the use of minimal mapping rules states that it is preferable not to import unless an inconsistency exists. The paper presents three different declarative semantics of a P2P system: (i) the Max Weak Model Semantics, in which mapping rules are used to import as much knowledge as possible from a peer’s neighborhood without violating local integrity constraints; (ii) the Min Weak Model Semantics, in which the P2P system can be locally inconsistent and the information provided by the neighbors is used to restore consistency, that is, to only integrate the missing portion of a correct, but incomplete database; (iii) the Max-Min Weak Model Semantics that unifies the previous two different perspectives captured by the Max Weak Model Semantics and Min Weak Model Semantics. This last semantics allows to characterize each peer in the neighborhood as a resource used either to enrich (integrate) or to fix (repair) the knowledge, so as to define a kind of integrate–repair strategy for each peer. For each semantics, the paper also introduces an equivalent and alternative characterization, obtained by rewriting each mapping rule into prioritized rules so as to model a P2P system as a prioritized logic program. Finally, results about the computational complexity of P2P logic queries are investigated by considering brave and cautious reasoning.
25

Kandasamy, Ilanthenral. "Double-Valued Neutrosophic Sets, their Minimum Spanning Trees, and Clustering Algorithm." Journal of Intelligent Systems 27, no. 2 (March 28, 2018): 163–82. http://dx.doi.org/10.1515/jisys-2016-0088.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Neutrosophy (neutrosophic logic) is used to represent uncertain, indeterminate, and inconsistent information available in the real world. This article proposes a method to provide more sensitivity and precision to indeterminacy, by classifying the indeterminate concept/value into two based on membership: one as indeterminacy leaning towards truth membership and the other as indeterminacy leaning towards false membership. This paper introduces a modified form of a neutrosophic set, called Double-Valued Neutrosophic Set (DVNS), which has these two distinct indeterminate values. Its related properties and axioms are defined and illustrated in this paper. An important role is played by clustering in several fields of research in the form of data mining, pattern recognition, and machine learning. DVNS is better equipped at dealing with indeterminate and inconsistent information, with more accuracy, than the Single-Valued Neutrosophic Set, which fuzzy sets and intuitionistic fuzzy sets are incapable of. A generalised distance measure between DVNSs and the related distance matrix is defined, based on which a clustering algorithm is constructed. This article proposes a Double-Valued Neutrosophic Minimum Spanning Tree (DVN-MST) clustering algorithm, to cluster the data represented by double-valued neutrosophic information. Illustrative examples are given to demonstrate the applications and effectiveness of this clustering algorithm. A comparative study of the DVN-MST clustering algorithm with other clustering algorithms like Single-Valued Neutrosophic Minimum Spanning Tree, Intuitionistic Fuzzy Minimum Spanning Tree, and Fuzzy Minimum Spanning Tree is carried out.
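
The overall pipeline, a pairwise distance matrix over neutrosophic elements followed by a minimum spanning tree, can be sketched generically. In the sketch below the elements are 4-component double-valued neutrosophic tuples (truth, indeterminacy leaning to truth, indeterminacy leaning to falsity, falsity) and the distance is a simple normalized absolute difference; both are stand-in assumptions, since the paper defines its own generalized distance measure.

```python
# Illustrative sketch: a stand-in distance over 4-component DVNS tuples
# (t, i_t, i_f, f), followed by a Prim-style minimum spanning tree.
def dvns_distance(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def mst_edges(points):
    """Prim's algorithm over the complete graph weighted by dvns_distance."""
    in_tree, edges = {0}, []
    while len(in_tree) < len(points):
        i, j = min(((i, j) for i in in_tree
                    for j in range(len(points)) if j not in in_tree),
                   key=lambda e: dvns_distance(points[e[0]], points[e[1]]))
        in_tree.add(j)
        edges.append((i, j, round(dvns_distance(points[i], points[j]), 3)))
    return edges

data = [(0.9, 0.1, 0.1, 0.1), (0.8, 0.2, 0.1, 0.2), (0.2, 0.7, 0.6, 0.9)]
print(mst_edges(data))   # the two similar elements are joined by the cheapest edge
```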
26

Leiva, Mario A., Alejandro J. García, Paulo Shakarian, and Gerardo I. Simari. "Argumentation-Based Query Answering under Uncertainty with Application to Cybersecurity." Big Data and Cognitive Computing 6, no. 3 (August 26, 2022): 91. http://dx.doi.org/10.3390/bdcc6030091.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Decision support tools are key components of intelligent sociotechnical systems, and their successful implementation faces a variety of challenges, including the multiplicity of information sources, heterogeneous format, and constant changes. Handling such challenges requires the ability to analyze and process inconsistent and incomplete information with varying degrees of associated uncertainty. Moreover, some domains require the system’s outputs to be explainable and interpretable; an example of this is cyberthreat analysis (CTA) in cybersecurity domains. In this paper, we first present the P-DAQAP system, an extension of a recently developed query-answering platform based on defeasible logic programming (DeLP) that incorporates a probabilistic model and focuses on delivering these capabilities. After discussing the details of its design and implementation, and describing how it can be applied in a CTA use case, we report on the results of an empirical evaluation designed to explore the effectiveness and efficiency of a possible world sampling-based approximate query answering approach that addresses the intractability of exact computations.
27

Asenjo, F. G. "G. Priest and R. Routley. First historical introduction. A preliminary history of paraconsistent and dialethic approaches. Paraconsistent logic, Essays on the inconsistent, edited by Graham Priest, Richard Routley, and Jean Norman, Analytica, Philosophia Verlag, Munich, Hamden, and Vienna, 1989, pp. 3–75. - Ayda I. Arruda. Aspects of the historical development of paraconsistent logic. Paraconsistent logic, Essays on the inconsistent, edited by Graham Priest, Richard Routley, and Jean Norman, Analytica, Philosophia Verlag, Munich, Hamden, and Vienna, 1989, pp. 99–130. - G. Priest and R. Routley. Systems of paraconsistent logic. Paraconsistent logic, Essays on the inconsistent, edited by Graham Priest, Richard Routley, and Jean Norman, Analytica, Philosophia Verlag, Munich, Hamden, and Vienna, 1989, pp. 151–186. - G. Priest and R. Routley. Applications of paraconsistent logic. Paraconsistent logic, Essays on the inconsistent, edited by Graham Priest, Richard Routley, and Jean Norman, Analytica, Philosophia Verlag, Munich, Hamden, and Vienna, 1989, pp. 367–393. - G. Priest and R. Routley. The philosophical significance and inevitability of paraconsistency, Paraconsistent logic, Essays on the inconsistent, edited by Graham Priest, Richard Routley, and Jean Norman, Analytica, Philosophia Verlag, Munich, Hamden, and Vienna, 1989, pp. 483–539." Journal of Symbolic Logic 56, no. 4 (December 1991): 1503–4. http://dx.doi.org/10.2307/2275501.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Herrmann, Richard K. "How Attachments to the Nation Shape Beliefs About the World: A Theory of Motivated Reasoning." International Organization 71, S1 (April 2017): S61—S84. http://dx.doi.org/10.1017/s0020818316000382.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
If competing beliefs about political events in the world stem largely from information asymmetries, then more information and knowledge should reduce the gap in competing perceptions. Empirical studies of decision making, however, often find just the reverse: as knowledge and the stakes in play go up, the beliefs about what is happening polarize rather than converge. The theory proposed here attributes this to motivated reasoning. Emotions inside the observer shape beliefs along with information coming from the outside world. A series of experiments embedded in a national survey of Americans finds that a primary driver of the beliefs someone forms about globalization, other countries, and the politics in the Middle East is how strongly they attach their social identity to the United States. Attachment produces more intense positive and negative emotions that in turn shape the interpretation of unfolding events and lead norms to be applied in an inconsistent fashion. People, in effect, rewrite reality around their favored course of action, marrying the logic of appropriateness to their own preferences. Beliefs, consequently, are not independent of preferences but related to them. Motivated reasoning, while not consistent with rational models, is predictable and can lead to expensive mistakes and double standards that undermine liberal internationalism.
29

Sun, Bo, Zhaojun Yang, Narayanaswamy Balakrishnan, Chuanhai Chen, Hailong Tian, and Wei Luo. "An Adaptive Bayesian Melding Method for Reliability Evaluation Via Limited Failure Data: An Application to the Servo Turret." Applied Sciences 10, no. 21 (October 28, 2020): 7591. http://dx.doi.org/10.3390/app10217591.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In the early stage of product development, reliability evaluation is an indispensable step before launching a product onto the market. It is not realistic to evaluate the reliability of a new product by a host of reliability tests due to the limiting factors of time and test costs. Evaluating the reliability of products in a short time is a challenging problem. In this paper, an approach is proposed that combines a group of experts’ judgments and limited failure data. Novel features of this approach are that it can reflect various kinds of information without considering individual weights and that it reduces aggregation error in the uncertainty quantification of multiple inconsistent pieces of information. First, an expert system is established by the Bayesian best–worst method and fuzzy logic inference, which collects and aggregates a group of expert opinions to estimate the reliability improvement factor. Then, an adaptive Bayesian melding method is investigated to generate a posterior from inaccurate prior knowledge and limited test data; this method is made more computationally efficient by implementing an improved sampling importance resampling algorithm. Finally, an application for the reliability evaluation of a subsystem of a CNC lathe is discussed to illustrate the framework, which is shown to validate the reasonability and robustness of our proposal.
30

Gómez, Sergio Alejandro, Carlos Iván Chesñevar, and Guillermo Ricardo Simari. "Defeasible Reasoning in Web-Based Forms through Argumentation." International Journal of Information Technology & Decision Making 07, no. 01 (March 2008): 71–101. http://dx.doi.org/10.1142/s021962200800282x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The notion of forms as a way of organizing and presenting data has been used since the beginning of the World Wide Web. Web-based forms have evolved together with the development of new markup languages, in which it is possible to provide validation scripts as part of the form code to test whether the intended meaning of the form is correct. However, for the form designer, part of this intended meaning frequently involves other features which are not constraints by themselves, but rather attributes emerging from the form, which provide plausible conclusions in the context of incomplete and potentially inconsistent information. As the value of such attributes may change in presence of new knowledge, we call them defeasible attributes. In this paper, we propose extending traditional web-based forms to incorporate defeasible attributes as part of the knowledge that can be encoded by the form designer. The proposed extension allows the specification of scripts for reasoning about form fields using a defeasible knowledge base, expressed in terms of a Defeasible Logic Program.
31

Hu, Yingjie, Shouzhen Zeng, Llopis-Albert Carlos, Kifayat Ullah, and Yuqi Yang. "Social Network Group Decision-Making Method Based on Q-Rung Orthopair Fuzzy Set and Its Application in the Evaluation of Online Teaching Quality." Axioms 10, no. 3 (July 28, 2021): 168. http://dx.doi.org/10.3390/axioms10030168.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
As q-rung orthopair fuzzy set (q-ROFS) theory can effectively express complex fuzzy information, this study explores its application to social network environments and proposes a social network group decision-making (SNGDM) method based on the q-ROFS. Firstly, the q-rung orthopair fuzzy value is used to represent the trust relationships between experts in the social network, and a trust q-rung orthopair fuzzy value is defined. Secondly, considering the decreasing and multipath of trust in the process of trust propagation, this study designs a trust propagation mechanism by using its multiplication operation in the q-ROFS environment and proposes a trust q-ROFS aggregation approach. Moreover, based on the trust scores and confidence levels of experts, a new integration operator called q-rung orthopair fuzzy-induced ordered weighted average operator is proposed to fuse experts’ evaluation information. Additionally, considering the impact of consensus interaction on decision-making results, a consensus interaction model based on the q-ROF distance measure and trust relationship is proposed, including consistency measurement, identification of inconsistent expert decision-making opinions and a personalized adjustment mechanism. Finally, the SNGDM method is applied to solve the problem of evaluating online teaching quality.
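
For orientation, a q-rung orthopair fuzzy value is a pair (μ, ν) of membership and non-membership degrees constrained by μ^q + ν^q ≤ 1, and trust along a path A→B→C is propagated here by the q-ROF multiplication. The sketch below checks the constraint and applies the commonly used product operator; the operator's exact form is an assumption about the standard q-ROF algebra, not a quotation of the paper's definition.

```python
# q-rung orthopair fuzzy values: pairs (mu, nu) with mu**q + nu**q <= 1.
def is_qrofn(mu, nu, q):
    return 0 <= mu <= 1 and 0 <= nu <= 1 and mu**q + nu**q <= 1 + 1e-12

def qrof_multiply(a, b, q):
    """Commonly used q-ROF product; models trust propagation A->B->C."""
    (m1, n1), (m2, n2) = a, b
    mu = m1 * m2
    nu = (n1**q + n2**q - (n1**q) * (n2**q)) ** (1.0 / q)
    return mu, nu

q = 3
t_ab, t_bc = (0.9, 0.3), (0.8, 0.4)            # illustrative direct trust values
print(is_qrofn(*t_ab, q), is_qrofn(*t_bc, q))  # both satisfy the q-rung constraint
print(qrof_multiply(t_ab, t_bc, q))            # propagated (weaker) trust A->C
```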
32

Barragán, María, Raúl Montenegro, and María C. "Application of Neutrosophic Techniques for the Selection of the in-Hospital Triage System." International Journal of Neutrosophic Science 18, no. 4 (2022): 116–24. http://dx.doi.org/10.54216/ijns.180410.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Multicriteria decision problems are present in all branches of life and present a high degree of complexity in determining a feasible solution. In the public health sector, decisions are even more delicate because they work not only with the direct influence of human needs but also with limited financial resources. Saturation in hospital emergency services occurs when the need identified exceeds the resources available for patient care in the emergency unit. One of the elements with a primary regulatory effect on saturation levels in emergency services is undoubtedly an adequate triage system. The present study presents the application of multicriteria evaluation techniques as a method for the best selection of different types of triage according to certain pre-established parameters of interest. To this end, we rely on a method that combines the TODIM and PROMETHEE methods to obtain the results. In addition, single-valued neutrosophic sets based on neutrosophic logic are used so that the indeterminate and inconsistent information typical of the real world can be adequately handled. The application of the method demonstrates the efficiency of this kind of method in solving complex problems in real life and in different fields of society, particularly
33

Al-Quran, Ashraf, Nasruddin Hassan, and Emad Marei. "A Novel Approach to Neutrosophic Soft Rough Set under Uncertainty." Symmetry 11, no. 3 (March 15, 2019): 384. http://dx.doi.org/10.3390/sym11030384.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
To handle indeterminate and incomplete data, neutrosophic logic/set/probability were established. The neutrosophic truth, falsehood and indeterminacy components exhibit symmetry, as the truth and the falsehood look the same and behave in a symmetrical way with respect to the indeterminacy component, which serves as a line of symmetry. Soft set is a generic mathematical tool for dealing with uncertainty. Rough set is a new mathematical tool for dealing with vague, imprecise, inconsistent and uncertain knowledge in information systems. This paper introduces a new rough set model based on neutrosophic soft sets to exploit simultaneously the advantages of rough sets and neutrosophic soft sets in order to handle all types of uncertainty in data. The idea of a neutrosophic right neighborhood is utilised to define the concepts of neutrosophic soft rough (NSR) lower and upper approximations. Properties of the suggested approximations are proposed and subsequently proven. Some of the NSR set concepts, such as NSR-definability, NSR-relations and NSR-membership functions, are suggested and illustrated with examples. Further, we demonstrate the feasibility of the newly proposed rough set model with decision-making problems involving neutrosophic soft sets. Finally, a discussion on the features and limitations of the proposed model is provided.
34

Lee, Chuo-Hsuan, Edward J. Lusk, Karen Naaman, and Osamuyimen Omorogbe-Akpata. "Alignment Vetting of Bloomberg’s ISS: QualityScore [GQS]: Frequency of Provision of ESG & Related Disclosure Scores." International Journal of Economics and Finance 14, no. 12 (November 5, 2022): 40. http://dx.doi.org/10.5539/ijef.v14n12p40.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Context: The Environment, Social, and Governance [ESG]-platform offered by Bloomberg Professional Services [https://www.bloomberg.com/professional/] is a leading source of relevant, reliable, and timely information on the context within which market trading firms operate. The ESG-platform of the Bloomberg Terminals [BBT] includes more than 2,000 data fields that provide intel to aid in better understanding the “Stakeholder-impact” of the firm’s activities. One of the sub-platforms therein is the Institutional Shareholder Services [ISS], which offers Governance QualityScores (GQS). The BBT[ISS[GQS]]-platform is a data-driven approach to scoring & screening designed to help investors monitor a company’s control of governance risk. Previous studies have provided vetting information of the BBT[ISS[GQS]]-platform. As an enhancement to these vetting studies, we offer the following. Study Design: In the ESG-Platform, there are Disclosure Scores for the General [ESG], Environment, Social & Governance categories. The vetting question of interest is: Does the ISS score those firms that provide more Disclosure information as ISS[1] and those firms that provide less as ISS[10]? If so, this would cast doubt on the relevance and reliability of the ISS-assignment taxonomy. Results: We discuss the critical role of vetting. Then, the Dul Necessity & Sufficiency Screen is offered as the organizing logic of the inferential vetting platform. Finally, using the Gold Standard test (Linear Discriminant Analysis) for the vetting inference, it is clear that the ISS-assignment is not aligned with the degree of provision of disclosure information for any of the four ESG-Disclosure Score variables. Thus, these vetting results are not inconsistent with a functioning taxonomic-allocation platform.
35

Huang, Sun-Weng, James J. H. Liou, Shih-Hsiung Cheng, William Tang, Jessica C. Y. Ma, and Gwo-Hshiung Tzeng. "The Key Success Factors for Attracting Foreign Investment in the Post-Epidemic Era." Axioms 10, no. 3 (June 30, 2021): 140. http://dx.doi.org/10.3390/axioms10030140.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The global economy has been hit by the unexpected COVID-19 outbreak, and foreign investment has been seen as one of the most important tools to boost the economy. However, in the highly uncertain post-epidemic era, determining how to attract foreign investment is the key to revitalizing the economy. What are the important factors for governments to attract investment, and how to improve them? This will be an important decision in the post-epidemic era. Therefore, this study develops a novel decision-making model to explore the key factors in attracting foreign investment. The model first uses fuzzy Delphi to explore the key factors of attracting foreign investment in the post-epidemic era, and then uses DEMATEL to construct the causal relationships among these factors. To overcome the uncertainty of various information sources and inconsistent messages from decision-makers, this study combined neutrosophic set theory to conduct quantitative analysis. The results of the study show that the model is suitable for analyzing the key factors of investment attraction in the post-epidemic period. Based on the results of the study, we also propose strategies that will help the relevant policy-making departments to understand the root causes of the problem and to formulate appropriate investment strategies in advance. In addition, the model is also used for comparative analysis, which reveals that this novel approach can integrate more incomplete information and present expert opinions in a more objective way.
36

Dorr, David A., Christopher D'Autremont, Christie Pizzimenti, Nicole Weiskopf, Robert Rope, Steven Kassakian, Joshua E. Richardson, Rob McClure, and Floyd Eisenberg. "Assessing Data Adequacy for High Blood Pressure Clinical Decision Support: A Quantitative Analysis." Applied Clinical Informatics 12, no. 04 (August 2021): 710–20. http://dx.doi.org/10.1055/s-0041-1732401.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Objective This study examines guideline-based high blood pressure (HBP) and hypertension recommendations and evaluates the suitability and adequacy of the data and logic required for a Fast Healthcare Interoperable Resources (FHIR)-based, patient-facing clinical decision support (CDS) HBP application. HBP is a major predictor of adverse health events, including stroke, myocardial infarction, and kidney disease. Multiple guidelines recommend interventions to lower blood pressure, but implementation requires patient-centered approaches, including patient-facing CDS tools. Methods We defined concept sets needed to measure adherence to 71 recommendations drawn from eight HBP guidelines. We measured data quality for these concepts for two cohorts (HBP screening and HBP diagnosed) from electronic health record (EHR) data, including four use cases (screening, nonpharmacologic interventions, pharmacologic interventions, and adverse events) for CDS. Results We identified 102,443 people with diagnosed and 58,990 with undiagnosed HBP. We found that 21/35 (60%) of required concept sets were unused or inaccurate, with only 259 (25.3%) of 1,101 codes used. Use cases showed high inclusion (0.9–11.2%), low exclusion (0–0.1%), and missing patient-specific context (up to 65.6%), leading to data in 2/4 use cases being insufficient for accurate alerting. Discussion Data quality from the EHR required to implement recommendations for HBP is highly inconsistent, reflecting a fragmented health care system and incomplete implementation of standard terminologies and workflows. Although imperfect, data were deemed adequate for two test use cases. Conclusion Current data quality allows for further development of patient-facing FHIR HBP tools, but extensive validation and testing is required to assure precision and avoid unintended consequences.
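As a rough illustration of the kind of coverage check the abstract reports (what share of a value set's codes ever appears in the EHR data), here is a small hypothetical sketch; the concept sets, codes, and records are invented, not the study's value sets.

```python
# Hypothetical concept-set coverage check: for each value set, count how
# many of its codes appear at least once in the extracted EHR codes.
from collections import defaultdict

concept_sets = {                                  # value-set name -> member codes
    "bp_measurement": {"8480-6", "8462-4"},       # illustrative LOINC-style codes
    "antihypertensive_rx": {"rx123", "rx456", "rx789"},
}
ehr_codes = ["8480-6", "8462-4", "8480-6", "rx123"]   # codes observed in records

used = defaultdict(set)
for code in ehr_codes:
    for name, members in concept_sets.items():
        if code in members:
            used[name].add(code)

for name, members in concept_sets.items():
    share = len(used[name]) / len(members)
    print(f"{name}: {len(used[name])}/{len(members)} codes used ({share:.0%})")
```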
37

Mohamad, Siti Nurul Fitriah, Roslan Hasni, Florentin Smarandache, and Binyamin Yusoff. "Novel Concept of Energy in Bipolar Single-Valued Neutrosophic Graphs with Applications." Axioms 10, no. 3 (July 29, 2021): 172. http://dx.doi.org/10.3390/axioms10030172.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The energy of a graph is defined as the sum of the absolute values of its eigenvalues. Recently, there has been a lot of interest in graph energy research. Previous literature has suggested integrating energy, Laplacian energy, and signless Laplacian energy with single-valued neutrosophic graphs (SVNGs). This integration is used to solve problems that are characterized by indeterminate and inconsistent information. However, when the information is endowed with both positive and negative uncertainty, then bipolar single-valued neutrosophic sets (BSVNs) constitute an appropriate knowledge representation of this framework. A BSVN is a generalized bipolar fuzzy structure that deals with positive and negative uncertainty in real-life problems with a larger domain. In contrast to the previous study, which directly used truth, indeterminate, and false membership, this paper proposes integrating energy, Laplacian energy, and signless Laplacian energy with BSVNs into a graph structure, considering the positive and negative membership degrees, to greatly improve decisions in certain problems. Moreover, this paper intends to elaborate on characteristics of eigenvalues, upper and lower bounds of energy, Laplacian energy, and signless Laplacian energy. We introduce the concept of a bipolar single-valued neutrosophic graph (BSVNG) for an energy graph and discuss its relevant ideas with the help of examples. Furthermore, the significance of using bipolar concepts over non-bipolar concepts is compared numerically. Finally, the application of energy, Laplacian energy, and signless Laplacian energy in BSVNGs is demonstrated in selecting renewable energy sources, while optimal selection is suggested to illustrate the proposed method. This indicates the usefulness and practicality of this proposed approach in real life.
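For orientation, the crisp quantities that the paper generalizes can be computed directly; the sketch below handles only an ordinary adjacency matrix of a small example graph, not the bipolar single-valued neutrosophic case.

```python
# Energy and Laplacian energy of a small crisp graph (the BSVNG extension
# in the paper replaces the 0/1 entries with bipolar neutrosophic memberships).
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

energy = np.abs(np.linalg.eigvalsh(A)).sum()        # E(G) = sum |lambda_i|

deg = A.sum(axis=1)
L = np.diag(deg) - A                                # graph Laplacian
m, n = A.sum() / 2, len(A)                          # number of edges and vertices
laplacian_energy = np.abs(np.linalg.eigvalsh(L) - 2 * m / n).sum()  # LE(G)
print(energy, laplacian_energy)
```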
38

Danshyna, Svitlana, and Valeriy Cheranovskiy. "Formalizing the land inventory process for information support of land projects management." RADIOELECTRONIC AND COMPUTER SYSTEMS, no. 3 (October 4, 2022): 7–19. http://dx.doi.org/10.32620/reks.2022.3.01.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The subject of study in this work is the land inventory process. The work increases the efficiency of the land inventory process by identifying ways to reduce the amount of topographic surveying work through algorithmizing and systematizing the information flows with a combination of qualitatively different data. Objectives: to analyze the land inventory process within land management projects to identify possible ways of improving its efficiency; focusing on information flows, to create an information model for the inventory process; to develop a scientific-methodological basis for information support of land management projects to reduce the amount of topographic surveying for implementation into decision support systems for land management. The following results were obtained. The requirements of the current legislation on the inventory of land plots and related tasks are generalized. A set-theoretic model of information flows of the inventory process, which combines the approaches of functional modeling, is proposed. It allows us to combine the qualitatively different data and consider their dynamic nature and the logic of interaction. An IDEF3 model was developed. This model considers remote sensing data as a source of accurate and up-to-date information, algorithmizes the mechanism of their combination with other information about land plots, and explains the dynamic nature of its changes with time. A method for creating a database of the working inventory land plan is proposed. It combines information about land plots from several sources and reduces the amount of topographic surveying by selecting plots that need to be coordinated (determined) with the geospatial data. The developed scientific-methodological basis of information support for land projects forms the structure of information technology for land inventory. Its usage will reduce the number of resources required for the implementation of land projects, at the same time forming reliable conclusions about the state of the land by using geographic information systems (GIS) and combining dissimilar data. Conclusions. The results of the bibliographic search confirmed the following. Ensuring the effectiveness of the land inventory process, which ensures compliance with legislation in the field of land management, is very difficult because of the need to analyze a large amount of different information, which may contain errors and inconsistent and contradictory data. This requires the development of specialized models and methods focused on the use of GIS for their implementation into decision support systems for land management. A scientific-methodological basis of information support for land projects during the inventory of land plots has been developed, practical usage of which confirmed that the time costs for obtaining the land plot research data decreased by almost 33% and the accuracy of measuring geometric dimensions of the land plot increased by about 1%.
39

Sahu, Santosh Kumar, Saurav Datta, and Siba Sankar Mahapatra. "Appraisement and benchmarking of supply chain performance extent." Grey Systems: Theory and Application 5, no. 1 (February 2, 2015): 2–30. http://dx.doi.org/10.1108/gs-10-2014-0036.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Purpose – Supply chain performance (SCP) extent can be attributed as a function of multiple criteria/attributes. Most of the criteria/attributes being intangible in nature, SCP appraisement relies on the subjective judgment of the decision makers. Moreover, quantitative appraisement of SCP appears to be very difficult due to involvement of ill-defined (vague) performance measures as well as metrics. The purpose of this paper is to develop an efficient decision support system (DSS) to facilitate SCP appraisement, benchmarking and related decision making. Design/methodology/approach – This study explores the concept of fuzzy logic in order to tackle incomplete and inconsistent subjective judgment of the decision makers whilst evaluating a supply chain’s overall performance. Grey relational analysis has been adopted in the later stage to derive an appropriate ranking of alternative companies/enterprises (in the same industry) in view of ongoing SCP extent. Findings – In this work, a performance appraisement index system has been postulated to gather evaluation information (weights and ratings) in relation to SCP measures and metrics. Combining the concepts of fuzzy set theory, entropy, ideal and grey relation analysis, a fuzzy grey relation method for the SCP benchmarking problem has been presented. First, triangular fuzzy numbers and linguistic evaluation information characterized by triangular fuzzy numbers have been used to evaluate the importance weights of all criteria and the superiority of all alternatives vs various criteria above the alternative level. Then, the concept of entropy has been utilized to solve the adjusted integration weight of all objective criteria above the alternative level. Moreover, using the concept of the grey relation grades, various alternatives have been ranked accordingly. Originality/value – Finally, an empirical example of selecting the most appropriate company has been used to demonstrate the ease of applicability of the aforesaid approach. The study results showed that this method appears to be an effective means for tackling multi-criteria decision-making problems in uncertain environments. Empirical data have been analysed, and the results obtained thereof have been reported to exhibit the application potential of the said fuzzy grey relation based DSS in appropriate situations.
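As a pointer to how the grey relational ranking step works, here is a minimal sketch on an invented, already-normalized score matrix; the fuzzy weighting and entropy adjustment described in the abstract are left out.

```python
# Grey relational ranking sketch (weights and scores are illustrative).
import numpy as np

X = np.array([[0.8, 0.6, 0.9],     # rows: companies, columns: SCP criteria
              [0.5, 0.9, 0.7],
              [0.7, 0.7, 0.6]])
w = np.array([0.4, 0.35, 0.25])    # assumed criteria weights

ref = X.max(axis=0)                # ideal (reference) series
delta = np.abs(X - ref)            # deviation of each alternative from the ideal
rho = 0.5                          # distinguishing coefficient
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grades = (xi * w).sum(axis=1)      # grey relational grade per company
print(grades, np.argsort(-grades)) # higher grade = better SCP ranking
```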
40

Salway, Sarah, Elizabeth Such, Louise Preston, Andrew Booth, Maria Zubair, Christina Victor, and Raghu Raghavan. "Reducing loneliness among migrant and ethnic minority people: a participatory evidence synthesis." Public Health Research 8, no. 10 (July 2020): 1–246. http://dx.doi.org/10.3310/phr08100.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Background To date, there has been little research into the causes of, and solutions to, loneliness among migrant and ethnic minority people. Objectives The objectives were to synthesise available evidence and produce new insights relating to initiatives that aim to address loneliness among these populations, plus the logic, functioning and effects of such initiatives. Data sources Electronic database searches (MEDLINE, Applied Social Sciences Index and Abstracts and Social Science Citation Index via Web of Science – no date restrictions were applied), grey literature searches, and citation and reference searching were conducted. Data were generated via nine workshops with three consultation panels involving 34 public contributors, and one practitioner workshop involving 50 participants. Review methods Guided by ‘systems thinking’, a theory-driven synthesis was combined with an effectiveness review to integrate evidence on the nature and causes of loneliness, interventional types and programme theory, and intervention implementation and effectiveness. Results The theory review indicated that common conceptualisations of ‘loneliness’ can be usefully extended to recognise four proximate determinants when focusing on migrant and ethnic minority populations: positive social ties and interactions, negative social ties and interactions, self-worth, and appraisal of existing ties. A total of 170 interventions were included. A typology of eight interventions was developed. Detailed logic models were developed for three common types of intervention: befriending, shared-identity social support groups and intercultural encounters. The models for the first two types were generally well supported by empirical data; the third was more tentative. Evaluation of intervention processes and outcomes was limited by study content and quality. Evidence from 19 qualitative and six quantitative studies suggested that social support groups have a positive impact on dimensions of loneliness for participants. Evidence from nine qualitative and three quantitative studies suggested that befriending can have positive impacts on loneliness. However, inconsistent achievements of the befriending model meant that some initiatives were ineffective. Few studies on intercultural encounters reported relevant outcomes, although four provided some qualitative evidence and three provided quantitative evidence of improvement. Looking across intervention types, evidence suggests that initiatives targeting the proximate determinants – particularly boosting self-worth – are more effective than those that do not. No evidence was available on the long-term effects of any initiatives. UK intervention (n = 41) and non-intervention (n = 65) studies, together with consultation panel workshop data, contributed to a narrative synthesis of system processes. Interlocking factors operating at individual, family, community, organisational and wider societal levels increase risk of loneliness, and undermine access to, and the impact of, interventions. Racism operates in various ways throughout the system to increase risk of loneliness. Limitations There was a lack of high-quality quantitative studies, and there were no studies with longer-term follow-up. UK evidence was very limited. Studies addressing upstream determinants operating at the community and societal levels did not link through to individual outcome measures. Some elements of the search approach may mean that relevant literature was overlooked. 
Conclusions Theory regarding the causes of loneliness, and functioning of interventions, among migrant and ethnic minority populations was usefully developed. Evidence of positive impact on loneliness was strongest for shared-identity social support groups. Quantitative evidence was inadequate. The UK evidence base was extremely limited. Future work UK research in this area is desperately needed. Co-production of interventional approaches with migrant and ethnic minority people and evaluation of existing community-based initiatives are priorities. Study registration This study is registered as PROSPERO CRD42017077378. Funding This project was funded by the National Institute for Health Research Public Health Research programme and will be published in full in Public Health Research; Vol. 8, No. 10. See the NIHR Journals Library website for further project information.
41

Van Wyk, Susanna S., Marriott Nliwasa, James A. Seddon, Graeme Hoddinott, Lario Viljoen, Emmanuel Nepolo, Gunar Günther, et al. "Case-Finding Strategies for Drug-Resistant Tuberculosis: Protocol for a Scoping Review." JMIR Research Protocols 11, no. 12 (December 15, 2022): e40009. http://dx.doi.org/10.2196/40009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Background Transmission of drug-resistant tuberculosis (DR-TB) is ongoing. Finding individuals with DR-TB and initiating treatment as early as possible is important to improve patient clinical outcomes and to break the chain of transmission to control the pandemic. To our knowledge systematic reviews assessing effectiveness, cost-effectiveness, acceptability, and feasibility of different case-finding strategies for DR-TB to inform research, policy, and practice have not been conducted, and it is unknown whether enough research exists to conduct such reviews. It is unknown whether case-finding strategies are similar for DR-TB and drug-susceptible TB and whether we can draw on findings from drug-susceptible reviews to inform decisions on case-finding strategies for DR-TB. Objective This protocol aims to describe the available literature on case-finding for DR-TB and to describe case-finding strategies. Methods We will screen systematic reviews, trials, qualitative studies, diagnostic test accuracy studies, and other primary research that specifically sought to improve DR-TB case detection. We will exclude studies that invited individuals seeking care for TB symptoms, those including individuals already diagnosed with TB, or laboratory-based studies. We will search the academic databases including MEDLINE, Embase, The Cochrane Library, Africa-Wide Information, CINAHL, Epistemonikos, and PROSPERO with no language or date restrictions. We will screen titles, abstracts, and full-text articles in duplicate. Data extraction and analyses will be performed using Excel (Microsoft Corp). Results We will provide a narrative report with supporting figures or tables to summarize the data. A systems-based logic model, developed from a synthesis of case-finding strategies for drug-susceptible TB, will be used as a framework to describe different strategies, resulting pathways, and enhancements of pathways. The search will be conducted at the end of 2021. Title and abstract screening, full text screening, and data extraction will be undertaken from January to June 2022. Thereafter, analysis will be conducted, and results compiled. Conclusions This scoping review will chart existing literature on case-finding for DR-TB—this will help determine whether primary studies on effectiveness, cost-effectiveness, acceptability, and feasibility of different case-finding strategies for DR-TB exist and will help formulate potential questions for a systematic review. We will also describe case-finding strategies for DR-TB and how they fit into a model of case-finding pathways for drug-susceptible TB. This review has some limitations. One limitation is the diverse, inconsistent use of intervention terminology within the literature, which may result in missing relevant studies. Poor reporting of intervention strategies may also cause misunderstanding and misclassification of interventions. Lastly, case-finding strategies for DR-TB may not fit into a model developed from strategies for drug-susceptible TB. Nevertheless, such a situation will provide an opportunity to refine the model for future research. The review will guide further research to inform decisions on case-finding policies and practices for DR-TB. International Registered Report Identifier (IRRID) DERR1-10.2196/40009
42

Korn, Janos. "Crisis in systems thinking." Kybernetes 49, no. 7 (July 19, 2019): 1915–34. http://dx.doi.org/10.1108/k-01-2019-0026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Purpose The current field of systems thinking consists of a variety of views, methods and a number of organisations involved with these views, which suggests a state of confusion and fragmentation of the field, which fundamentally is supposed to be a uniform view of structures or systems. This can be interpreted as a “crisis situation”. A resolution of the crisis in the form of a “new science of systems” is proposed. Assuming this new science becomes part of the field of systems thinking, a debate of the elements of the field is suggested with a view to consider its current state and future developments. “Crisis - resolution - debate” is the central theme of the paper. Design/methodology/approach The field of current systems thinking is described in terms of views, methods and organisations and is seen as the “problematic issue”. A “new science of systems” strongly rooted in natural language as its primary symbolism and consisting of three general principles of systems and linguistic modelling is outlined to be considered as the resolution of the crisis. A set of criteria is discussed for use in judging the quality of models and elements of the field of systems thinking, including the “new science of systems”. To demonstrate a preliminary use of these criteria, the same example is worked out using both the “soft systems methodology” and “linguistic modelling” for comparison. Findings The universal view of parts of the world as structures or systems is inconsistent with the multiple methods basically pursuing the same purpose: modelling aspects of systems which prevail in current systems thinking. To try to resolve this anomaly, an equally universally applicable approach, the “new science of systems”, is proposed, which can also serve as an aid to problem solving, in particular to an integrated systems and product design. This approach is to be part of the suggested debate of the field of systems thinking. In general, there is no alternative to the structural view. Research limitations/implications The “new science of systems”, if found acceptable, can offer research opportunities in new applications of accepted branches of knowledge like logic, linguistics, mathematics of ordered pairs, uncertainties and in the philosophy of science. New teaching schemes can be developed at classroom level combined with engineering as creator of novelties with linguistics as the symbolism to supplement mathematics. Further considerations can be given to current methodologies of systems thinking as part of a debate with a view of future developments in exploring pioneering ideas. New software is needed for working out the dynamics of scenarios. Practical implications The debate, if it takes place, should result in new developments in the field of systems thinking such as concepts accepted as fundamental in the discipline of systems. Applications of the “new science of systems” to larger scale scenarios and organisations guided by the universal scheme in Figure 1 and linguistic modelling with software are needed for the development of problem solving schemes “utilising” or “producing” products. Social implications The “new science of systems” is rooted in accepted branches of knowledge; it is highly teachable at school and university levels and should lead to use by professionals and in everyday life activities once found acceptable. The use of the scheme in Figure 1 should help in clarifying confusing scenarios and to aid problem solving. Originality/value The suggestion of a debate is an original idea.
The “new science of systems” consists of three general principles of systems implemented by linguistic modelling of static and dynamic states. Mathematics of uncertainty and topics from conventional science at the object level supplement the “new science” which together form the “scientific enterprise”. The notions of cognitive value and informative content of models are introduced for evaluating their cognitive worth.
43

Rudas, Imre J. "Intelligent Engineering Systems." Journal of Advanced Computational Intelligence and Intelligent Informatics 4, no. 4 (July 20, 2000): 237–39. http://dx.doi.org/10.20965/jaciii.2000.p0237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The "information revolution" of our time affects our entire generation. While a vision of the "Information Society," with its financial, legal, business, privacy, and other aspects has emerged in the past few years, the "traditional scene" of information technology, that is, industrial automation, maintained its significance as a field of unceasing development. Since the old-fashioned concept of "Hard Automation" applicable only to industrial processes of fixed, repetitive nature and manufacturing large batches of the same product1)was thrust to the background by keen market competition, the key element of this development remained the improvement of "Machine Intelligence". In spite of the fact that L. A. Zadeh already introduced the concept of "Machine Intelligence Quotient" in 1996 to measure machine intelligence2) , this term remained more or less of a mysterious meaning best explicable on the basis of practical needs. The weak point of hard automation is that the system configuration and operations are fixed and cannot be changed without incurring considerable cost and downtime. Mainly it can be used in applications that call for fast and accurate operation in large batch production. Whenever a variety of products must be manufactured in small batches and consequently the work-cells of a production line should be quickly reconfigured to accommodate a change in products, hard automation becomes inefficient and fails due to economic reasons. In these cases, new, more flexible way of automation, so-called "Soft Automation," are expedient and suitable. The most important "ingredient" of soft automation is its adaptive ability for efficiently coping with changing, unexpected or previously unknown conditions, and working with a high degree of uncertainty and imprecision since in practice increasing precision can be very costly. This adaptation must be realized without or within limited human interference: this is one essential component of machine intelligence. Another important factor is that engineering practice often must deal with complex systems of multiple variable and multiple parameter models almost always with strong nonlinear coupling. Conventional analysis-based approaches for describing and predicting the behavior of such systems in many cases are doomed to failure from the outset, even in the phase of the construction of a more or less appropriate mathematical model. These approaches normally are too categorical in the sense that in the name of "modeling accuracy," they try to describe all structural details of the real physical system to be modeled. This significantly increases the intricacy of the model and may result in huge computational burden without considerably improving precision. The best paradigm exemplifying this situation may be the classic perturbation theory: the less significant the achievable correction is, the more work must be invested for obtaining it. Another important component of machine intelligence is a kind of "structural uniformity" giving room and possibility to model arbitrary particular details a priori not specified and unknown. This idea is similar to that of the ready-to-wear industry, whose products can later be slightly modified in contrast to the custom-tailors' made-to-measure creations aiming at maximum accuracy from the beginning. Machines carry out these later corrections automatically. This "learning ability" is another key element of machine intelligence. To realize the above philosophy in a mathematically correct way, L. A. 
Zadeh separated Hard Computing from Soft Computing. This revelation immediately resulted in distinguishing between two essential complementary branches of machine intelligence: Hard Computing based Artificial Intelligence and Soft Computing based Computational Intelligence. In the last decades, it became generally known that fuzzy logic, artificial neural networks, and probabilistic reasoning based Soft Computing is a fruitful orientation in designing intelligent systems. Moreover, it became generally accepted that soft computing rather than hard computing should be viewed as the foundation of real machine intelligence via exploiting the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality. Further research in the past decade confirmed the view that typical components of present soft computing such as fuzzy logic, neurocomputing, evolutionary computation and probabilistic reasoning are complementary and best results can be obtained by their combined application. These complementary branches of Machine Intelligence, Artificial Intelligence and Computational Intelligence, serve as the basis of Intelligent Engineering Systems. The huge number of scientific results published in journals and conference proceedings worldwide substantiates this statement. Three years ago, a new series of conferences in this direction was initiated and launched with the support of several organizations including the IEEE Industrial Electronics Society and IEEE Hungary Section in technical cooperation with IEEE Robotics & Automation Society. The first event of the series, hosted by Bánki Donát Polytechnic, Budapest, Hungary, was called the "1997 IEEE International Conference on Intelligent Engineering Systems" (INES'97). The Technical University of Vienna, Austria hosted the next event of the series in 1998, followed by INES'99 held by the Technical University of Kosice, Slovakia. The present special issue consists of the extended and revised version of the most interesting papers selected out of the presentations of this conference. The papers exemplify recent development trends of intelligent engineering systems. The first paper pertains to the wider class of neural network applications. It is an interesting report of applying a special Adaptive Resonance Theory network for identifying objects in multispectral images. It is called "Extended Gaussian ARTMAP". The authors conclude that this network is especially advantageous for classification of large, low dimensional data sets. The second paper's subject belongs to the realm of fuzzy systems. It reports successful application of fundamental similarity relations in diagnostic systems. As an example, failure detection of rolling-mill transmission is considered. The next paper represents the AI-branch of machine intelligence. The paper is a report on an EU-funded project focusing on the storage of knowledge in a corporate organizational memory used for storing and retrieving knowledge chunks for it. The flexible structure of the system makes it possible to adapt it to different SMEs by using company-specific conceptual terms rather than traditional keywords. The fourth selected paper's contribution is to the field of knowledge discovery. For this purpose, in the first step, cluster analysis is done. The method is found to be helpful whenever little or no information on the characteristics of a given data set is available. 
The next paper approaches scheduling problems by the application of the multiagent system. It is concluded that due to the great number of interactions between components, MAS seems to be well suited for manufacturing scheduling problems. The sixth selected paper's topic is emerging intelligent technologies in computer-aided engineering. It discusses key issues of CAD/CAM technology of our days. The conclusion is that further development of CAD/CAM methods probably will serve companies on the competitive edge. The seventh paper of the selection is a report on seeking a special tradeoff between classical analytical modeling and traditional soft computing. It nonconventionally integrates uniform structures obtained from Lagrangian Classical Mechanics with other simple elements of machine intelligence such as saturated sigmoid transition functions borrowed from neural nets, and fuzzy rules with classical PID/ST, and a simplified version of regression analysis. It is concluded that these different components can successfully cooperate in adaptive robot control. The last paper focuses on the complexity problem of fuzzy and neural network approaches. A fuzzy rule base, be it generated from expert operators or by some learning or identification schemes, may contain redundant, weakly contributing, or outright inconsistent components. Moreover, in pursuit of good approximation, one may be tempted to overly assign the number of antecedent sets, thereby resulting in large fuzzy rule bases and many problems in computation time and storage space. Engineers using neural networks have to face the same complexity problem with the number of neurons and layers. A fuzzy rule base and neural network design, hence, have two important objectives. One is to achieve a good approximation. The other is to reduce the complexity. The main difficulty is that these two objectives are contradictory. A formal approach to extracting the more pertinent elements of a given rule set or neurons is, hence, highly desirable. The last paper is an attempt in this direction. References: 1) C. W. De Silva. Automation Intelligence. Engineering Application of Artificial Intelligence. Vol. 7. No. 5. 471-477 (1994). 2) L. A. Zadeh. Fuzzy Logic, Neural Networks and Soft Computing. NATO Advanced Studies Institute on Soft Computing and Its Application. Antalya, Turkey. (1996). 3) L. A. Zadeh. Berkeley Initiative in Soft Computing. IEEE Industrial Electronics Society Newsletter. 41, (3), 8-10 (1994).
44

Caret, Colin R. "In Pursuit of the Non-Trivial." Episteme, June 20, 2019, 1–16. http://dx.doi.org/10.1017/epi.2019.17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This paper is about the underlying logical principles of scientific theories. In particular, it concerns ex contradictione quodlibet (ECQ), the principle that anything follows from a contradiction. ECQ is valid according to classical logic, but invalid according to paraconsistent logics. Some advocates of paraconsistency claim that there are ‘real’ inconsistent theories that do not erupt with completely indiscriminate, absurd commitments. They take this as evidence in favor of paraconsistency. Michael (2016) calls this the non-triviality strategy (NTS). He argues that this strategy fails in its purpose. I will show that Michael's criticism significantly over-reaches. The fundamental problem is that he places more of a burden on the advocate of paraconsistency than on the advocate of classical logic. The weaknesses in Michael's argument are symptomatic of this preferential treatment of one viewpoint in the debate over another. He does, however, make important observations that allow us to clarify some of the complexities involved in giving a logical reconstruction of a theory. I will argue that there are abductive arguments deserving of further consideration for the claim that paraconsistent logic offers the best explanation of the practice of inconsistent science. In this sense, the debate is still very much open.
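To see concretely what it means for ECQ to fail, the following small sketch evaluates the inference A ∧ ¬A ⊨ B in LP (Priest's Logic of Paradox), used here only as one convenient paraconsistent example; the paper itself argues at a more general, methodological level.

```python
# In LP, a third value ("both") is designated, so A & ~A can be designated
# while an unrelated B is not: explosion (ECQ) is blocked.
from itertools import product

VALUES = (0.0, 0.5, 1.0)          # false, both, true
DESIGNATED = {0.5, 1.0}

def neg(v): return 1.0 - v
def conj(v, w): return min(v, w)

def entails(premise, conclusion):
    """Check designation-preservation over all valuations of A and B."""
    for a, b in product(VALUES, repeat=2):
        val = {"A": a, "B": b}
        if premise(val) in DESIGNATED and conclusion(val) not in DESIGNATED:
            return False
    return True

print(entails(lambda v: conj(v["A"], neg(v["A"])), lambda v: v["B"]))  # False in LP
```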
45

Du, Jianfeng, Kewen Wang, and Yi-Dong Shen. "Towards Tractable and Practical ABox Abduction over Inconsistent Description Logic Ontologies." Proceedings of the AAAI Conference on Artificial Intelligence 29, no. 1 (February 18, 2015). http://dx.doi.org/10.1609/aaai.v29i1.9393.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
ABox abduction plays an important role in reasoning over description logic (DL) ontologies. However, it does not work with inconsistent DL ontologies. To tackle this problem while achieving tractability, we generalize ABox abduction from the classical semantics to an inconsistency-tolerant semantics, namely the Intersection ABox Repair (IAR) semantics, and propose the notion of IAR-explanations in inconsistent DL ontologies. We show that computing all minimal IAR-explanations is tractable in data complexity for first-order rewritable ontologies. However, the computational method may still not be practical due to a possibly large number of minimal IAR-explanations. Hence we propose to use preference information to reduce the number of explanations to be computed.
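A toy illustration of the IAR semantics the abstract builds on: query answers are evaluated over the intersection of all ABox repairs, i.e., the maximal subsets of the ABox that are consistent with the TBox. The ontology and the consistency check below are invented stand-ins, not the paper's algorithm.

```python
# Brute-force IAR sketch on a tiny inconsistent ABox.
from itertools import combinations

abox = {("Bird", "tweety"), ("Penguin", "tweety"), ("Flies", "tweety")}

def consistent(subset):
    # Assumed toy TBox constraint: Penguin and Flies are disjoint for the same individual.
    return not {("Penguin", "tweety"), ("Flies", "tweety")} <= subset

def repairs(assertions):
    """All maximal consistent subsets of the ABox."""
    maximal = []
    for size in range(len(assertions), 0, -1):
        for cand in map(set, combinations(assertions, size)):
            if consistent(cand) and not any(cand < m for m in maximal):
                maximal.append(cand)
    return maximal

iar_abox = set.intersection(*repairs(abox))
print(iar_abox)   # only ("Bird", "tweety") survives in every repair
```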
46

Bozdag, Sena. "A Semantics for Hyperintensional Belief Revision Based on Information Bases." Studia Logica, December 10, 2021. http://dx.doi.org/10.1007/s11225-021-09973-y.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
I propose a novel hyperintensional semantics for belief revision and a corresponding system of dynamic doxastic logic. The main goal of the framework is to reduce some of the idealisations that are common in the belief revision literature and in dynamic epistemic logic. The models of the new framework are primarily based on potentially incomplete or inconsistent collections of information, represented by situations in a situation space. I propose that by shifting the representational focus of doxastic models from belief sets to collections of information, and by defining changes of beliefs as artifacts of changes of information, we can achieve a more realistic account of belief representation and belief change. The proposed dynamic operation suggests a non-classical way of changing beliefs: belief revision occurs in non-explosive environments which allow for a non-monotonic and hyperintensional belief dynamics. A logic that is sound with respect to the semantics is also provided.
47

Akhtar, Salwa Muhammad, Makia Nazir, Kiran Saleem, Rana Zeeshan Ahmad, Abdul Rehman Javed, Shahab S. Band, and Amir Mosavi. "A Multi-Agent Formalism Based on Contextual Defeasible Logic for Healthcare Systems." Frontiers in Public Health 10 (March 3, 2022). http://dx.doi.org/10.3389/fpubh.2022.849185.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In the last decade, smart computing has garnered much attention, particularly in ubiquitous environments, thus increasing the ease of everyday human life. Users can dynamically interact with the systems using different modalities in a smart computing environment. The literature discusses multiple mechanisms to enhance the modalities for communication using different knowledge sources. Among others, the Multi-context System (MCS) has proven quite significant for dynamically interlinking various context domains in a distributed environment. An MCS is a collection of different contexts (independent knowledge sources), and every context contains its own set of defined rules, facts, and inference systems. These contexts are interlinked via bridge rules. However, the interaction among knowledge sources could have consequences such as producing inconsistent results. These issues may lead to situations such as the system being unable to reach a conclusion or communication in different contexts becoming asynchronous. There is a need for a suitable framework to resolve inconsistencies. In this article, we provide a framework based on contextual defeasible reasoning and a multi-agent formalism to handle the issue of inconsistent information in MCS. Additionally, in this work, a prototypal simulation is designed using a simulation tool called NetLogo, and a formalism of a Parkinson's disease patient case study is also developed. Both of these show the validity of the framework.
48

Kamil, Mohammad Zaid, Faisal Khan, Guozheng Song, and Salim Ahmed. "Dynamic Risk Analysis Using Imprecise and Incomplete Information." ASCE-ASME J Risk and Uncert in Engrg Sys Part B Mech Engrg 5, no. 4 (September 25, 2019). http://dx.doi.org/10.1115/1.4044042.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Accident modeling is a vital step, which helps in designing preventive measures to avoid future accidents, and thus, to enhance process safety. Bayesian networks (BN) are widely used in accident modeling due to their capability to represent accident scenarios from their causes to likely consequences. However, assessing the likelihood of an accident using a BN requires exact basic event probabilities, which are often obtained from expert opinions. Such subjective opinions are often inconsistent and sometimes conflicting and/or incomplete. In this work, evidence theory has been coupled with BN to address inconsistency, conflict and incompleteness in the expert opinions. It combines the acquired knowledge from various subjective sources, thereby rendering accuracy in probability estimation. Another source of uncertainty in BN is model uncertainty. To represent multiple interactions of a cause–effect relationship, Noisy-OR and leaky Noisy-AND gates are explored in the study. Conventional logic gates, i.e., OR/AND gates, can only provide a linear interaction of the cause–effect relationship and hence introduce uncertainty in the assessment. The proposed methodology provides an impression of how dynamic risk assessment could be conducted when sufficient information about a process system is unavailable. To illustrate the execution of the proposed methodology, a tank equipped with a basic process control system has been used as an example. A real-life case study has also been used to validate the proposed model and compare its results with those using a deterministic approach.
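For readers unfamiliar with the gates mentioned, here is a minimal sketch of a leaky Noisy-OR conditional probability, one of the canonical-interaction models of this family; the probabilities are illustrative and not taken from the paper, and the evidence-theory fusion step is not shown.

```python
# Leaky Noisy-OR: each active cause independently fails to produce the
# effect with probability (1 - p_i); the leak covers unmodelled causes.

def leaky_noisy_or(parent_probs, parents_active, leak=0.01):
    """P(effect occurs | states of the modelled causes)."""
    p_no_effect = 1.0 - leak
    for p, active in zip(parent_probs, parents_active):
        if active:
            p_no_effect *= 1.0 - p
    return 1.0 - p_no_effect

# Two hypothetical basic events (e.g. sensor failure, operator error) both present:
print(leaky_noisy_or([0.2, 0.05], [True, True]))   # ~0.248
```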
49

Elgeshy, Khaled M., and Abdel Hady A. Abdel Wahab. "Role, Significance and Association of microRNA-10a/b in Physiology of Cancer." MicroRNA 09 (October 26, 2020). http://dx.doi.org/10.2174/2211536609666201026155519.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
MicroRNAs (miRNAs) are small non-coding RNAs that regulate translation of mRNA and protein mainly at the post-transcriptional level. As global expression profiling of miRNAs has demonstrated a wide spectrum of aberrations that are correlated with several diseases, miRNA-10a/b were among the first miRNAs examined for involvement in abnormal activities upon dysregulation, including many types of cancers and progressive diseases. The same miRNAs can be expected to behave in an inconsistent fashion within different types of cancers, or even in the same type under different contexts or phases. This review is an attempt to provide a set of information about our updated understanding of miRNA-10a/b and their clinical significance, molecular targets, current research gaps and possible future applications of such potent regulators, and to help uncover the logic behind such behavior and possible approaches to exploit such unique entities.
50

"Identification of Safe Assembly Points in Emergencies in a Gas Refinery of the South Pars Gas Complex Using Fuzzy Logic Model." Journal of Rescue and Relief, May 27, 2019, 275–86. http://dx.doi.org/10.32592/jorar.2019.11.4.6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
INTRODUCTION: Crisis management is of critical importance in the oil and gas industries due to the increasing occurrence of accidents in these areas. One of the most important issues regarding crisis management in such industries is the identification of safety assembly points where employees should gather in emergencies. This study aimed to identify safe points for emergency assembly in a refinery using a geographic information system (GIS) and fuzzy logic. METHODS: In line with the study aim, the required data were collected, and a focus group meeting was held with experts to determine the criteria influencing the safety point zoning as well as high-risk units using the HAZOP method. After the identification of the criteria and sub-criteria affecting the zoning, the weight of each zoning parameter was calculated, and the safety zones were determined using the fuzzy logic model and its operators in the GIS environment. FINDINGS: According to the results of the risk assessment, the criteria and sub-criteria affecting zoning were divided into three categories of inconsistent (layer weight: 0.740), consistent (layer weight: 0.094), and access to exit routes (layer weight: 0.167). Moreover, the map results based on the fuzzy logic model revealed three safe points, including the vicinity of the fire station, clinic, and wastewater treatment plant in this refinery where the employees should gather in the event of emergencies. CONCLUSION: The results of this study showed that the selection of appropriate criteria in safe point zoning is of great importance in industrial emergencies. Moreover, an initial risk assessment can be effective in determining these criteria and sub-criteria. In addition, the fuzzy logic model has high accuracy and precision in determining the appropriate safe places.
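To indicate roughly how fuzzy operators combine criterion layers in a GIS, here is a small sketch using the fuzzy gamma operator on invented membership rasters; the study's actual criteria, fuzzification curves, layer weights, and GIS workflow are not reproduced.

```python
# Fuzzy gamma overlay of criterion membership rasters (values illustrative).
import numpy as np

def fuzzy_gamma(layers, gamma=0.9):
    """Combine membership rasters: (fuzzy sum)^gamma * (fuzzy product)^(1-gamma)."""
    stack = np.stack(layers)
    alg_product = stack.prod(axis=0)
    alg_sum = 1.0 - (1.0 - stack).prod(axis=0)
    return (alg_sum ** gamma) * (alg_product ** (1.0 - gamma))

# Three 2x2 membership rasters, e.g. distance from high-risk units,
# distance from hazards, and access to exit routes, already scaled to [0, 1].
a = np.array([[0.9, 0.4], [0.7, 0.2]])
b = np.array([[0.8, 0.5], [0.6, 0.3]])
c = np.array([[0.7, 0.6], [0.9, 0.1]])
print(fuzzy_gamma([a, b, c]))   # higher cell values = better assembly-point candidates
```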
