
Journal articles on the topic 'Mathematical Logic and Formal Languages'


Consult the top 50 journal articles for your research on the topic 'Mathematical Logic and Formal Languages.'


1

Gopal, Tadepalli. "Learning Computational Logic through Geometric Reasoning." Innovative STEM Education 5, no. 1 (July 24, 2023): 7–12. http://dx.doi.org/10.55630/stem.2023.0501.

Abstract:
Computers control everyday things ranging from heart pacemakers to the voice-controlled devices that form an integral part of many appliances. Failures related to computers regularly cause disruption, damage and occasionally death. Computational logic establishes facts in a logical formalism. It attempts to understand the nature of mathematical reasoning with a wide variety of formalisms, techniques and technologies. Formal verification uses mathematical and logical formalisms to prove the correctness of designs. Formal methods provide the maturity and agility to assimilate future concepts, languages, techniques and tools for computational methods and models. The quest for simplification of formal verification is never-ending. This summary report advocates the use of geometry so that the human mind can draw quick conclusions that can be formally verified if necessary.
2

Park, Sewon. "Continuous Abstract Data Types for Verified Computation." Bulletin of Symbolic Logic 27, no. 4 (December 2021): 531. http://dx.doi.org/10.1017/bsl.2021.51.

Abstract:
We devise imperative programming languages for verified real number computation where real numbers are provided as abstract data types such that the users of the languages can express real number computation by considering real numbers as abstract mathematical entities. Unlike other common approaches toward real number computation, based on an algebraic model that lacks implementability or transcendental computation, or on finite-precision approximation such as double-precision computation that lacks a formal foundation, our languages are devised based on computable analysis, a foundation of rigorous computation over continuous data. Consequently, the users of the language can easily program real number computation and reason about the behaviours of their programs, relying on their mathematical knowledge of real numbers without worrying about artificial roundoff errors. As the languages are imperative, we adopt precondition–postcondition-style program specification and Hoare-style program verification methodologies. Consequently, the users of the language can easily program a computation over real numbers, specify the expected behaviour of the program, including termination, and prove the correctness of the specification. Furthermore, we suggest extending the languages with other interesting continuous data, such as matrices, continuous real functions, et cetera.

Abstract taken directly from the thesis.
E-mail: sewonpark17@gmail.com
URL: https://sewonpark.com/thesis
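To make the precondition–postcondition style concrete, here is a schematic Hoare triple for an exact real-number operation (our illustration, not an example from the thesis; the sqrt primitive over abstract reals is assumed): the specification speaks of real numbers as exact mathematical entities, with no mention of rounding.

```latex
% Illustrative Hoare triple for verified real computation,
% assuming a sqrt primitive over abstract (exact) reals.
\[
\{\, x \ge 0 \,\} \quad r := \mathrm{sqrt}(x) \quad \{\, r \ge 0 \,\wedge\, r^{2} = x \,\}
\]
```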
3

Moschovakis, Yiannis N. "The formal language of recursion." Journal of Symbolic Logic 54, no. 4 (December 1989): 1216–52. http://dx.doi.org/10.1017/s0022481200041086.

Abstract:
This is the first of a sequence of papers in which we will develop a foundation for the theory of computation based on a precise, mathematical notion of abstract algorithm. To understand the aim of this program, one should keep in mind clearly the distinction between an algorithm and the object (typically a function) computed by that algorithm. The theory of computable functions (on the integers and on abstract structures) is obviously relevant to this work, but we will focus on making rigorous and identifying the mathematical properties of the finer (intensional) notion of algorithm. It is characteristic of this approach that we take recursion to be a fundamental (primitive) process for constructing algorithms, not a derived notion which must be reduced to others—e.g. iteration or application and abstraction, as in the classical λ-calculus. We will model algorithms by recursors, the set-theoretic objects one would naturally choose to represent (syntactically described) recursive definitions. Explicit and iterative algorithms are modelled by (appropriately degenerate) recursors. The main technical tool we will use is the formal language of recursion, FLR, a language of terms with two kinds of semantics: on each suitable structure, the denotation of a term t of FLR is a function, while the intension of t is a recursor (i.e. an algorithm) which computes the denotation of t. FLR is meant to be intensionally complete, in the sense that every (intuitively understood) “algorithm” should “be” (faithfully modelled, in all its essential properties, by) the intension of some term of FLR on a suitably chosen structure.
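As a small illustration of the denotation/intension distinction (ours, not the paper's): the same recursive definition can be read two ways.

```latex
% The term t below denotes the factorial function (its denotation),
% while its intension is the recursor, i.e. the algorithm that
% computes factorial by recursion on n.
\[
t \;\equiv\; f(n) \ \text{where}\ f(n) =
\begin{cases}
1 & \text{if } n = 0,\\
n \cdot f(n-1) & \text{otherwise.}
\end{cases}
\]
```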
4

Gelsema, Tjalling. "The Logic of Aggregated Data." Acta Cybernetica 24, no. 2 (November 3, 2019): 211–48. http://dx.doi.org/10.14232/actacyb.24.2.2019.4.

Abstract:
A notion of generalization-specialization is introduced that is more expressive than the usual notion from, e.g., the UML or RDF-based languages. This notion is incorporated in a typed formal language for modeling aggregated data. Soundness with respect to a sets-and-functions semantics is shown subsequently. Finally, a notion of congruence is introduced. With it, terms in the language that have identical semantics, i.e. synonyms, can be discovered. The resulting formal language is well suited for faithfully capturing aggregated data in such a way that it can serve as the foundation for corporate metadata management in a statistical office.
5

Kutsak, Nina Yu, and Vladislav V. Podymov. "Formal Verification of Three-Valued Digital Waveforms." Modeling and Analysis of Information Systems 26, no. 3 (September 28, 2019): 332–50. http://dx.doi.org/10.18255/1818-1015-2019-3-332-350.

Abstract:
We investigate a formal verification problem (mathematically rigorous correctness checking) for digital waveforms used in the practical development of digital microelectronic devices (digital circuits) at early design stages. According to modern methodologies, a digital circuit design starts at high abstraction levels provided by hardware description languages (HDLs). One of the essential steps of an HDL-based circuit design is HDL code debugging, similar in means and importance to the same step of program development. A popular way of debugging HDL code is based on extraction and analysis of a waveform, which is a collection of plots for digital signals: functional descriptions of value changes related to selected circuit places in real time. We propose mathematical means for automating correctness checking for such waveforms based on notions and methods of formal verification against temporal logic formulae, and focus on typical features of HDL-related digital signals and corresponding (informal) properties, such as real time, three-valuedness, and the presence of signal edges. Three-valuedness means that at any given time, besides the basic logical values 0 and 1, a signal may have a special undefined value: one of the values 0 and 1, but which one of them is either not known or not important. An edge point of a signal is a time point at which the signal changes its value. The main results are mathematical notions, propositions, and algorithms which allow one to formalize and solve a formal verification problem for the considered waveforms, including: definitions for signals and waveforms that capture the mentioned typical digital signal features; a temporal logic suitable for formalizing waveform correctness properties, and a related verification problem statement; a solution technique for the verification problem, based on reduction to signal transformation and analysis; and a corresponding verification algorithm together with its correctness proof and “reasonable” complexity bounds.
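A minimal sketch of the signal model (our simplification, not the authors' definitions): a three-valued signal as a list of timed change points, with 'X' for the undefined value.

```python
# Three-valued digital signal as sorted (time, value) change points,
# value in {'0', '1', 'X'}; 'X' is the undefined value.
# Simplified sketch, not the paper's formal definitions.

def value_at(signal, t):
    """Value of the signal at time t."""
    current = 'X'          # undefined before the first change point
    for time, value in signal:
        if time > t:
            break
        current = value
    return current

def edge_points(signal):
    """Time points at which the signal changes its value."""
    edges, prev = [], None
    for time, value in signal:
        if prev is not None and value != prev:
            edges.append(time)
        prev = value
    return edges

sig = [(0, 'X'), (2, '0'), (5, '1'), (9, '0')]
print(value_at(sig, 4))   # '0'
print(edge_points(sig))   # [2, 5, 9]
```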
6

Vanderveken, Daniel. "Towards a Formal Pragmatics of Discourse." International Review of Pragmatics 5, no. 1 (2013): 34–69. http://dx.doi.org/10.1163/18773109-13050102.

Abstract:
Could we enrich speech-act theory to deal with discourse? Wittgenstein and Searle pointed out difficulties. Most conversations lack a conversational purpose, they require collective intentionality, their background is indefinitely open, irrelevant and infelicitous utterances do not prevent conversations from continuing, etc. Like Wittgenstein and Searle I am sceptical about the possibility of a general theory of all kinds of language-games. In my view, the single primary purpose of discourse pragmatics is to analyse the structure and dynamics of language-games whose type is provided with an internal conversational goal. Such games are indispensable to any kind of discourse. They have a descriptive, deliberative, declaratory or expressive conversational goal corresponding to a possible direction of fit between words and things. Logic can analyse felicity-conditions of such language-games because they are conducted according to systems of constitutive rules. Speakers often speak non-literally or non-seriously. The real units of conversation are therefore attempted illocutions, whether literal, serious or not. I will show how to construct speaker-meaning from sentence-meaning, conversational background and conversational maxims. I agree with Montague that we need the resources of formalisms (proof, model- and game-theories) and of mathematical and philosophical logic in pragmatics. I will explain how to further develop propositional and illocutionary logics, the logic of attitudes and of action in order to characterize our ability to converse. I will also compare my approach to others (Austin, Belnap, Grice, Montague, Searle, Sperber and Wilson, Kamp, Wittgenstein) as regards hypotheses, methodology and other issues.
7

Kanamori, Akihiro. "The Empty Set, The Singleton, and the Ordered Pair." Bulletin of Symbolic Logic 9, no. 3 (September 2003): 273–98. http://dx.doi.org/10.2178/bsl/1058448674.

Abstract:
For the modern set theorist the empty set Ø, the singleton {a}, and the ordered pair 〈x, y〉 are at the beginning of the systematic, axiomatic development of set theory, both as a field of mathematics and as a unifying framework for ongoing mathematics. These notions are the simplest building blocks in the abstract, generative conception of sets advanced by the initial axiomatization of Ernst Zermelo [1908a] and are quickly assimilated long before the complexities of Power Set, Replacement, and Choice are broached in the formal elaboration of the ‘set of’ operation. So it is surprising that, while these notions are unproblematic today, they were once sources of considerable concern and confusion among leading pioneers of mathematical logic like Frege, Russell, Dedekind, and Peano. In the development of modern mathematical logic out of the turbulence of 19th century logic, the emergence of the empty set, the singleton, and the ordered pair as clear and elementary set-theoretic concepts serves as a motif that reflects and illuminates larger and more significant developments in mathematical logic: the shift from the intensional to the extensional viewpoint, the development of type distinctions, the logical vs. the iterative conception of set, and the emergence of various concepts and principles as distinctively set-theoretic rather than purely logical. Here there is a loose analogy with Tarski's recursive definition of truth for formal languages: The mathematical interest lies mainly in the procedure of recursion and the attendant formal semantics in model theory, whereas the philosophical interest lies mainly in the basis of the recursion, truth and meaning at the level of basic predication. Circling back to the beginning, we shall see how central the empty set, the singleton, and the ordered pair were, after all.
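For reference, the standard set-theoretic renderings of the three notions, with Kuratowski's now-standard definition of the pair:

```latex
\[
\varnothing = \{\},\qquad
\{a\} = \{\,x \mid x = a\,\},\qquad
\langle x, y \rangle = \{\{x\}, \{x, y\}\},
\]
\[
\text{the last satisfying}\quad
\langle x, y \rangle = \langle u, v \rangle \iff x = u \wedge y = v.
\]
```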
8

LADYMAN, JAMES, ØYSTEIN LINNEBO, and RICHARD PETTIGREW. "IDENTITY AND DISCERNIBILITY IN PHILOSOPHY AND LOGIC." Review of Symbolic Logic 5, no. 1 (November 17, 2011): 162–86. http://dx.doi.org/10.1017/s1755020311000281.

Abstract:
Questions about the relation between identity and discernibility are important both in philosophy and in model theory. We show how a philosophical question about identity and discernibility can be ‘factorized’ into a philosophical question about the adequacy of a formal language to the description of the world, and a mathematical question about discernibility in this language. We provide formal definitions of various notions of discernibility and offer a complete classification of their logical relations. Some new and surprising facts are proved; for instance, that weak discernibility corresponds to discernibility in a language with constants for every object, and that weak discernibility is the most discerning nontrivial discernibility relation.
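Two of the notions classified, stated schematically in the usual way for objects a and b:

```latex
\[
\begin{aligned}
&\text{absolute discernibility:} && \exists\varphi(x)\,\bigl(\varphi(a) \wedge \neg\varphi(b)\bigr),\\
&\text{weak discernibility:}     && \exists\varphi(x,y)\,\bigl(\varphi(a,b) \wedge \neg\varphi(a,a)\bigr).
\end{aligned}
\]
```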
9

Kuzmin, Egor V. "LTL-Specification of Counter Machines." Modeling and Analysis of Information Systems 28, no. 1 (March 24, 2021): 104–19. http://dx.doi.org/10.18255/1818-1015-2021-1-104-119.

Abstract:
The article is written in support of the educational discipline “Non-classical logics”. Within the framework of this discipline, the objects of study are the basic principles and constructive elements, with the help of which the formal construction of various non-classical propositional logics takes place. Despite the abstractness of the theory of non-classical logics, in which the main attention is paid to the strict mathematical formalization of logical reasoning, there are real practical areas of application of theoretical results. In particular, languages of temporal modal logics are widely used for modeling, specification, and verification (correctness analysis) of logic control program systems. This article demonstrates, using the linear temporal logic LTL as an example, how abstract concepts of non-classical logics can be reflected in practice in the field of information technology and programming. We show the possibility of representing the behavior of a software system in the form of a set of LTL-formulas and using this representation to verify the satisfiability of program system properties through the procedure of proving the validity of logical inferences, expressed in terms of the linear temporal logic LTL. As program systems, for the specification of whose behavior the LTL logic is applied, Minsky counter machines are considered. Minsky counter machines are one of the ways to formalize the intuitive concept of an algorithm. They have the same computing power as Turing machines. A counter machine has the form of a computer program written in a high-level language, since it contains variables called counters, and conditional and unconditional jump operators that allow one to build loop constructions. It is known that any algorithm (hypothetically) can be implemented in the form of a Minsky three-counter machine.
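To make the machine model concrete, here is a toy counter-machine interpreter (our sketch; the article itself encodes such machines as LTL formulas rather than executing them):

```python
# Toy Minsky counter machine. A program maps labels to instructions:
#   ('inc', c, next)               increment counter c, go to next
#   ('jzdec', c, if_zero, if_pos)  if c == 0 jump, else decrement and jump

def run(program, label, counters, max_steps=10_000):
    for _ in range(max_steps):
        if label == 'halt':
            return counters
        op = program[label]
        if op[0] == 'inc':
            _, c, nxt = op
            counters[c] += 1
            label = nxt
        else:
            _, c, if_zero, if_pos = op
            if counters[c] == 0:
                label = if_zero
            else:
                counters[c] -= 1
                label = if_pos
    raise RuntimeError('step bound exceeded')

# Moves counter 0 into counter 1, i.e. computes c1 := c1 + c0.
add = {'L0': ('jzdec', 0, 'halt', 'L1'),
       'L1': ('inc', 1, 'L0')}
print(run(add, 'L0', {0: 3, 1: 4}))  # {0: 0, 1: 7}
```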
10

RABE, FLORIAN. "A logical framework combining model and proof theory." Mathematical Structures in Computer Science 23, no. 5 (March 1, 2013): 945–1001. http://dx.doi.org/10.1017/s0960129512000424.

Abstract:
Mathematical logic and computer science have driven the design of a growing number of logics and related formalisms such as set theories and type theories. In response to this population explosion, logical frameworks have been developed as formal meta-languages in which to represent, structure, relate and reason about logics. Research on logical frameworks has diverged into separate communities, often with conflicting backgrounds and philosophies. In particular, two of the most important logical frameworks are the framework of institutions, from the area of model theory based on category theory, and the Edinburgh Logical Framework LF, from the area of proof theory based on dependent type theory. Even though their ultimate motivations overlap – for example in applications to software verification – they have fundamentally different perspectives on logic. In the current paper, we design a logical framework that integrates the frameworks of institutions and LF in a way that combines their complementary advantages while retaining the elegance of each of them. In particular, our framework takes a balanced approach between model theory and proof theory, and permits the representation of logics in a way that comprises all major ingredients of a logic: syntax, models, satisfaction, judgments and proofs. This provides a theoretical basis for the systematic study of logics in a comprehensive logical framework. Our framework has been applied to obtain a large library of structured and machine-verified encodings of logics and logic translations.
11

AMELOOT, TOM J., JAN VAN DEN BUSSCHE, WILLIAM R. MARCZAK, PETER ALVARO, and JOSEPH M. HELLERSTEIN. "Putting logic-based distributed systems on stable grounds." Theory and Practice of Logic Programming 16, no. 4 (August 20, 2015): 378–417. http://dx.doi.org/10.1017/s1471068415000381.

Abstract:
In the Declarative Networking paradigm, Datalog-like languages are used to express distributed computations. Whereas recently formal operational semantics for these languages have been developed, a corresponding declarative semantics has been lacking so far. The challenge is to capture precisely the amount of nondeterminism that is inherent to distributed computations due to concurrency, networking delays, and asynchronous communication. This paper shows how a declarative, model-based semantics can be obtained by simply using the well-known stable model semantics for Datalog with negation. We show that the model-based semantics matches previously proposed formal operational semantics.
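The key notion is easy to state operationally: a set of atoms is a stable model iff it is the least model of the program's reduct with respect to itself. A brute-force sketch (ours, for a tiny ground program; real engines are far more refined):

```python
# Rules are (head, positive_body, negative_body).
from itertools import combinations

rules = [('p', ['q'], []),   # p :- q.
         ('q', [], ['r']),   # q :- not r.
         ('r', [], ['q'])]   # r :- not q.
atoms = {'p', 'q', 'r'}

def least_model(positive_rules):
    """Least model of a negation-free program, by fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos, _ in positive_rules:
            if head not in model and all(a in model for a in pos):
                model.add(head)
                changed = True
    return model

def is_stable(candidate):
    # Reduct: drop rules blocked by the candidate, then drop negation.
    reduct = [(h, pos, []) for h, pos, neg in rules
              if all(a not in candidate for a in neg)]
    return least_model(reduct) == candidate

for k in range(len(atoms) + 1):
    for subset in combinations(sorted(atoms), k):
        if is_stable(set(subset)):
            print(sorted(subset))  # ['r'] and ['p', 'q']
```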
12

Guo, Dakai, and Wensheng Yu. "A Comprehensive Formalization of Propositional Logic in Coq: Deduction Systems, Meta-Theorems, and Automation Tactics." Mathematics 11, no. 11 (May 29, 2023): 2504. http://dx.doi.org/10.3390/math11112504.

Abstract:
The increasing significance of theorem proving-based formalization in mathematics and computer science highlights the necessity for formalizing foundational mathematical theories. In this work, we employ the Coq interactive theorem prover to methodically formalize the language, semantics, and syntax of propositional logic, a fundamental aspect of mathematical reasoning and proof construction. We construct four Hilbert-style axiom systems and a natural deduction system for propositional logic, and establish their equivalences through meticulous proofs. Moreover, we provide formal proofs for essential meta-theorems in propositional logic, including the Deduction Theorem, Soundness Theorem, Completeness Theorem, and Compactness Theorem. Importantly, we present an exhaustive formal proof of the Completeness Theorem in this paper. To bolster the proof of the Completeness Theorem, we also formalize concepts related to mappings and countability, and deliver a formal proof of the Cantor–Bernstein–Schröder theorem. Additionally, we devise automated Coq tactics explicitly designed for the propositional logic inference system delineated in this study, enabling the automatic verification of all tautologies, all internal theorems, and the majority of syntactic and semantic inferences within the system. This research contributes a versatile and reusable Coq library for propositional logic, presenting a solid foundation for numerous applications in mathematics, such as the accurate expression and verification of properties in software programs and digital circuits. This work holds particular importance in the domains of mathematical formalization, verification of software and hardware security, and in enhancing comprehension of the principles of logical reasoning.
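For orientation, the four meta-theorems in their usual schematic forms, with ⊢ for derivability and ⊨ for semantic consequence (standard statements, not quoted from the paper):

```latex
\begin{align*}
\text{Deduction:}    &\quad \Gamma \cup \{A\} \vdash B \;\Longrightarrow\; \Gamma \vdash A \to B\\
\text{Soundness:}    &\quad \Gamma \vdash A \;\Longrightarrow\; \Gamma \vDash A\\
\text{Completeness:} &\quad \Gamma \vDash A \;\Longrightarrow\; \Gamma \vdash A\\
\text{Compactness:}  &\quad \Gamma \vDash A \;\Longrightarrow\; \Gamma_{0} \vDash A \ \text{for some finite } \Gamma_{0} \subseteq \Gamma
\end{align*}
```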
13

HUET, GÉRARD. "Special issue on ‘Logical frameworks and metalanguages’." Journal of Functional Programming 13, no. 2 (March 2003): 257–60. http://dx.doi.org/10.1017/s0956796802004549.

Abstract:
There is both a great unity and a great diversity in presentations of logic. The diversity is staggering indeed – propositional logic, first-order logic, higher-order logic belong to one classification; linear logic, intuitionistic logic, classical logic, modal and temporal logics belong to another one. Logical deduction may be presented as a Hilbert style of combinators, as a natural deduction system, as sequent calculus, as proof nets of one variety or other, etc. Logic, originally a field of philosophy, turned into algebra with Boole, and more generally into meta-mathematics with Frege and Heyting. Professional logicians such as Gödel and later Tarski studied mathematical models, consistency and completeness, computability and complexity issues, set theory and foundations, etc. Logic became a very technical area of mathematical research in the last half century, with fine-grained analysis of expressiveness of subtheories of arithmetic or set theory, detailed analysis of well-foundedness through ordinal notations, logical complexity, etc. Meanwhile, computer modelling developed a need for concrete uses of logic, first for the design of computer circuits, then more widely for increasing the reliability of software through the use of formal specifications and proofs of correctness of computer programs. This gave rise to more exotic logics, such as dynamic logic, Hoare-style logic of axiomatic semantics, logics of partial values (such as Scott's denotational semantics and Plotkin's domain theory) or of partial terms (such as Feferman's free logic), etc. The first actual attempts at mechanisation of logical reasoning through the resolution principle (automated theorem proving) had been disappointing, but their shortcomings gave rise to a considerable body of research, developing detailed knowledge about equational reasoning through canonical simplification (rewriting theory) and proofs by induction (following Boyer and Moore's successful integration of primitive recursive arithmetic within the LISP programming language). The special case of Horn clauses gave rise to a new paradigm of non-deterministic programming, called Logic Programming, developing later into Constraint Programming, blurring further the scope of logic. In order to study knowledge acquisition, researchers in artificial intelligence and computational linguistics studied exotic versions of modal logics such as Montague intensional logic, epistemic logic, dynamic logic or hybrid logic. Some others tried to capture common sense, and modeled the revision of beliefs with so-called non-monotonic logics. For the careful craftsmen of mathematical logic, this was the final outrage, and Girard gave his anathema to such “montres à moutardes”.
14

Torrens Urrutia, Adrià. "Lógica difusa para una descripción de la gramática de las lenguas naturales." Triangle, no. 16 (June 23, 2020): 73. http://dx.doi.org/10.17345/triangle16.73-81.

Abstract:
Defining natural language and its gradient phenomena forces us to look for formal tools that can represent the bases of a grammar with degrees of grammaticality. Mathematical and formal models are often used in linguistics, and yet fuzzy logic has not received the attention it deserves as a tool for explaining natural language processing. Here, we show the theoretical bases that have led us to treat natural language (NL) inputs as gradient. The bases of fuzzy logic for NL are explained here as a tool capable of defining non-discrete values, therefore gradual or fuzzy ones. A Property Grammar will give the rules of the fuzzy grammar.
15

Poythress, Vern. "A semiotic analysis of multiple systems of logic: using tagmemic theory to assess the usefulness and limitations of formal logics, and to produce a mathematical lattice model including multiple systems of logic." Semiotica 2022, no. 244 (January 1, 2022): 145–62. http://dx.doi.org/10.1515/sem-2020-0051.

Abstract:
Tagmemic theory as a semiotic theory can be used to analyze multiple systems of logic and to assess their strengths and weaknesses. This analysis constitutes an application of semiotics and also a contribution to understanding of the nature of logic within the context of human meaning. Each system of logic is best adapted to represent one portion of human rationality. Acknowledging this correlation between systems and their targets helps explain the usefulness of more than one system. Among these systems, the two-valued system of classical logic takes its place. All the systems of logic can be incorporated into a complex mathematical model that has a place for each system and that represents a larger whole in human reasoning. The model can represent why tight formal systems of logic can be applied in some contexts with great success, but in other contexts are not directly applicable. The result suggests that human reasoning is innately richer than any one formal system of logic.
16

Askar, Leskhan, Asset Kuranbek, Dinara Pernebekova, and Kamshat Kindikbaeva. "Abu Nasr Al-Farabi’s Science of Logic." Al-Farabi 74, no. 2 (June 30, 2021): 34–45. http://dx.doi.org/10.48010/2021.2/1999-5911.03.

Abstract:
In modern conditions of dynamically developing knowledge, the demand for correct thinking remains an urgent problem. Unfortunately, the course of logic, and especially the history of logic, is excluded from the educational programs of many specialties, although the value of knowing logical science, its laws, techniques and operations in the practical and theoretical work not only of the humanities but also of representatives of technical, natural and mathematical specialties can hardly be overestimated. In this article, the authors aim to fill the existing gap and present an analysis of the history of logic's formation in the format of the history of culture and the history of philosophy, using a comparative approach. In the history of logical science, a significant place is occupied by the logic of al-Farabi, who in the Middle Ages left a bright, indelible mark in the field of many sciences with his original ideas. The article analyzes the contribution made to the development of the science of logic by al-Farabi, which in turn contributes to the formation of a culture of thinking, and comes to the following conclusions: firstly, al-Farabi closely links formal logic with the science of language; however, he does not completely identify thinking and language, noting that language is ethnic while thinking is universal. Secondly, the Second Teacher considers dialectics as belonging to formal logic, seeing in it a form and means of cooperation between people, etc.
17

Brosa-Rodríguez, Antoni, M. Dolores Jiménez-López, and Adrià Torrens-Urrutia. "Exploring the complexity of natural languages: A fuzzy evaluative perspective on Greenberg universals." AIMS Mathematics 9, no. 1 (2023): 2181–214. http://dx.doi.org/10.3934/math.2024109.

Abstract:
In this paper, we introduced a fuzzy model for calculating complexity based on universality, aiming to measure the complexity of natural languages in terms of the degree of universality exhibited in their rules. We validated the model by conducting experiments on a corpus of 143 languages obtained from Universal Dependencies 2.11. To formalize the linguistic universals proposed by Greenberg, we employed the Grew tool to convert them into a formal rule representation. This formalization enables the verification of universals within the corpus. By analyzing the corpus, we extracted the occurrences of each universal in different languages. The obtained results were used to define a fuzzy model that quantifies the degree of universality and complexity of both the Greenberg universals and the languages themselves, employing the mathematical theory of evaluative expressions from fuzzy natural logic (FNL). Our analysis revealed an inversely proportional relationship between the degree of universality and the level of complexity observed in the languages. The implications of our findings extended to various applications in the theoretical analysis and computational treatment of languages. In addition, the proposed model offered insights into the nature of language complexity, providing a valuable framework for further research and exploration.
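A minimal sketch of the idea (ours; the counts and thresholds below are invented for illustration and are not the paper's data): the degree of universality of a rule is the fraction of corpus languages satisfying it, which a fuzzy evaluative scale then labels.

```python
def universality_degree(satisfying, total):
    """Fraction of corpus languages in which the universal holds."""
    return satisfying / total

def evaluative_label(degree):
    """Toy fuzzy-evaluative scale; cut-offs are illustrative only."""
    if degree >= 0.8:
        return 'extremely universal'
    if degree >= 0.5:
        return 'very universal'
    if degree >= 0.2:
        return 'somewhat universal'
    return 'rarely attested'

# e.g. a word-order universal verified in 121 of 143 treebanks
d = universality_degree(121, 143)
print(round(d, 2), evaluative_label(d))  # 0.85 extremely universal
```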
18

Heath, Joseph. "Is Language a Game?" Canadian Journal of Philosophy 26, no. 1 (March 1996): 1–28. http://dx.doi.org/10.1080/00455091.1996.10717442.

Abstract:
Recent developments in game theory have shown that the mathematical models of action so widely admired in the study of economics are in fact only particular instantiations of a more general theoretical framework. In the same way that Aristotelian logic was ‘translated’ into the more general and expressive language of predicate logic, the basic action theoretic underpinnings of modern economics have now been articulated within the more comprehensive language of game theory. But precisely because of its greater generality and expressive power, game theory has again revived the temptation to apply formal models of action to every domain of social life. This movement has been fuelled by some notable successes. Game theory has provided useful insights into the logic of collective action in the theory of public goods, and strategic models of voting have illustrated important aspects of institutional decision-making. But this extension of formal models into every area of social interaction has also encountered significant difficulties, despite the fact that contemporary decision theory has weakened its basic assumptions to the point where it teeters constantly on the brink of vacuity.
19

Gabbay, Murdoch J. "Foundations of Nominal Techniques: Logic and Semantics of Variables in Abstract Syntax." Bulletin of Symbolic Logic 17, no. 2 (June 2011): 161–229. http://dx.doi.org/10.2178/bsl/1305810911.

Abstract:
We are used to the idea that computers operate on numbers, yet another kind of data is equally important: the syntax of formal languages, with variables, binding, and alpha-equivalence. The original application of nominal techniques, and the one with greatest prominence in this paper, is to reasoning on formal syntax with variables and binding. Variables can be modelled in many ways: for instance as numbers (since we usually take countably many of them); as links (since they may ‘point’ to a binding site in the term, where they are bound); or as functions (since they often, though not always, represent ‘an unknown’). None of these models is perfect. In every case for the models above, problems arise when trying to use them as a basis for a fully formal mechanical treatment of formal language. The problems are practical—but their underlying cause may be mathematical. The issue is not whether formal syntax exists, since clearly it does, so much as what kind of mathematical structure it is. To illustrate this point by a parody, logical derivations can be modelled using a Gödel encoding (i.e., injected into the natural numbers). It would be false to conclude from this that proof-theory is a branch of number theory and can be understood in terms of, say, Peano's axioms. Similarly, as it turns out, it is false to conclude from the fact that variables can be encoded e.g., as numbers, that the theory of syntax-with-binding can be understood in terms of the theory of syntax-without-binding, plus the theory of numbers (or, taking this to a logical extreme, purely in terms of the theory of numbers). It cannot; something else is going on. What that something else is, has not yet been fully understood. In nominal techniques, variables are an instance of names, and names are data. We model names using urelemente with properties that, pleasingly enough, turn out to have been investigated by Fraenkel and Mostowski in the first half of the 20th century for a completely different purpose than modelling formal language. What makes this model really interesting is that it gives names distinctive properties which can be related to useful logic and programming principles for formal syntax. Since the initial publications, advances in the mathematics and presentation have been introduced piecemeal in the literature. This paper provides in a single accessible document an updated development of the foundations of nominal techniques. This gives the reader easy access to updated results and new proofs which they would otherwise have to search across two or more papers to find, and full proofs that in other publications may have been elided. We also include some new material not appearing elsewhere.
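One way to see what is at stake: the encodings discussed above can be made executable. Below, a sketch (ours) of alpha-equivalence checked by translating closed named terms to de Bruijn indices, i.e. the 'variables as numbers' model to which nominal techniques offer an alternative.

```python
# Lambda terms: ('var', name) | ('lam', name, body) | ('app', f, a).
# Closed terms only; a free variable raises ValueError.

def to_debruijn(term, env=()):
    kind = term[0]
    if kind == 'var':
        return ('var', env.index(term[1]))  # distance to the binder
    if kind == 'lam':
        return ('lam', to_debruijn(term[2], (term[1],) + env))
    return ('app', to_debruijn(term[1], env), to_debruijn(term[2], env))

def alpha_eq(t1, t2):
    """Alpha-equivalence: equality of de Bruijn skeletons."""
    return to_debruijn(t1) == to_debruijn(t2)

id_x = ('lam', 'x', ('var', 'x'))
id_y = ('lam', 'y', ('var', 'y'))
print(alpha_eq(id_x, id_y))  # True: \x.x and \y.y differ only in names
```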
20

Parisi, Luciana. "Interactive Computation and Artificial Epistemologies." Theory, Culture & Society 38, no. 7-8 (October 19, 2021): 33–53. http://dx.doi.org/10.1177/02632764211048548.

Abstract:
What is algorithmic thought? It is not possible to address this question without first reflecting on how the Universal Turing Machine transformed symbolic logic and brought to a halt the universality of mathematical formalism and the biocentric speciation of thought. The article draws on Sylvia Wynter’s discussion of the sociogenic principle to argue that both neurocognitive and formal models of automated cognition constitute the epistemological explanations of the origin of the human and of human sapience. Wynter’s argument will be related to Gilbert Simondon’s reflections on ‘technical mentality’ to consider how socio-techno-genic assemblages can challenge the biocentrism and the formalism of modern epistemology. This article turns to ludic logic as one possible example of techno-semiotic languages as a speculative overturning of sociogenic programming. Algorithmic rules become technique-signs coinciding not with classic formalism but with interactive localities without re-originating the universality of colonial and patriarchal cosmogony.
21

Owe, Olaf, and Gerardo Schneider. "Formal languages and analysis of contract-oriented software." Journal of Logic and Algebraic Programming 78, no. 5 (May 2009): 291–92. http://dx.doi.org/10.1016/j.jlap.2009.02.012.

22

Lupenko, Serhii A., Volodymyr V. Pasichnyk, and Nataliia E. Kunanets. "AXIOMATIC-DEDUCTIVE STRATEGY FOR IT DISCIPLINE CONTENT FORMATION." Information Technologies and Learning Tools 73, no. 5 (October 22, 2019): 149–60. http://dx.doi.org/10.33407/itlt.v73i5.2536.

Abstract:
The paper presents the axiomatic-deductive strategy of organizing the content of an academic discipline with the help of an ontological approach in e-learning systems in the field of information technologies. The authors have taken into account that the necessary property of a system of axiomatic statements is their consistency. On the basis of the axiomatic-deductive strategy, new approaches to the formation of the discipline content are proposed. It is proved that the system of true statements of an academic discipline is based on its terminology-conceptual apparatus, in particular, axiomatic statements. The developed mathematical structures that describe the axiomatic-deductive substrategy of organizing the general statements of an academic discipline and the taxonomically oriented substrategy of deploying the content of an academic discipline are presented in the article. This ensures the transition from the content form of representation of the set of statements of the academic discipline to its presentation by means of artificial languages of mathematical logic. The use of descriptive logic ensures the formalization of the procedure for mapping an axiomatic informal system into an axiomatic formal system. The mathematical structures describe and detail the abstract logical-semantic core of the academic discipline in the form of a group of axiomatic systems. It is noted that the basic core of the content of an academic discipline contains its basic concepts and judgments. This ensures a strictly logical transition from abstract general concepts and statements to the concepts and assertions of the lower level of universality and abstraction. It is noted that, in order to organize the content of an academic discipline, it is advisable to develop a taxonomically oriented substrategy based on the multiple application of operations of general concept division. The mathematical structures allow for analysis of a generalized structure of interactions between the verbal level of the description of the academic discipline subject area, the formal level of description of the subject area, and the description of the subject area at the level of computer ontology, which is implemented through formalization, interpretation, encoding and decoding in the computer-ontology development environment. As an example of the application of the proposed axiomatic-deductive strategy, elements of the glossary and taxonomies of the concepts of the discipline "Computer Logic" have been developed and embodied in the Protégé environment with the help of the OWL ontology description language.
23

RAHEEM, Tunde Rasheed, and Christianah Olajumoke SAM-KAYODE. "The Use of Truth Table, Logical Reasoning and Logic Gate in Teaching and Learning Process." International Journal of Latest Technology in Engineering Management & Applied Science 13, no. 6 (June 28, 2024): 1–12. http://dx.doi.org/10.51583/ijltemas.2024.130601.

Abstract:
The truth table is central to the content analyzed in teaching and learning logical reasoning and logic gates: it is a visual representation of the possible combinations of input and output information of Boolean propositions (in logical reasoning) and Boolean functions (in logic gates), plotted into a table. It adopts Boolean algebra for problem solving as a method or science of reasoning, or the ability to argue and convince. In the teaching and learning of logic, formal and informal reasoning tasks are used in a variety of ways, including with symbols and entirely in plain language without symbols; symbolic logics are commonly referred to as mathematical logic. This paper therefore considers the use of the truth table, logical reasoning and logic gates in the teaching and learning process. It highlights the benefits of using the truth table, compares logical reasoning and logic gate connectives, and concludes that there exist similarities, differences and peculiarities in the interactive features of the content of the truth table. The paper suggests that teachers should vividly state the similarities, differences and peculiarities in the interactive features of truth tables in logical reasoning and logic gates while incorporating thought in the teaching and learning process to avoid confusion.
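A small generator makes the shared mechanism explicit (our sketch; the same table serves a Boolean proposition in logical reasoning and a gate in circuit design):

```python
from itertools import product

def truth_table(variables, expression):
    """Print all input combinations and the expression's output."""
    print(*variables, 'out')
    for values in product([0, 1], repeat=len(variables)):
        env = dict(zip(variables, values))
        print(*values, int(expression(env)))

# XOR, read either as a connective or as a logic gate:
truth_table(['p', 'q'], lambda env: env['p'] != env['q'])
# p q out
# 0 0 0
# 0 1 1
# 1 0 1
# 1 1 0
```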
24

CRISTIÁ, MAXIMILIANO, GIANFRANCO ROSSI, and CLAUDIA FRYDMAN. "Adding partial functions to Constraint Logic Programming with sets." Theory and Practice of Logic Programming 15, no. 4-5 (July 2015): 651–65. http://dx.doi.org/10.1017/s1471068415000290.

Abstract:
Partial functions are common abstractions in formal specification notations such as Z, B and Alloy. Conversely, executable programming languages usually provide little or no support for them. In this paper we propose to add partial functions as a primitive feature to a Constraint Logic Programming (CLP) language, namely {log}. Although partial functions could be programmed on top of {log}, providing them as first-class citizens adds valuable flexibility and generality to the form of set-theoretic formulas that the language can safely deal with. In particular, the paper shows how the {log} constraint solver is naturally extended in order to accommodate for the new primitive constraints dealing with partial functions. Efficiency of the new version is empirically assessed by running a number of non-trivial set-theoretical goals involving partial functions, obtained from specifications written in Z.
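A schematic analogue in ordinary terms (ours, not {log}'s syntax): a partial function is just a set of ordered pairs subject to a functionality constraint, and treating it as a first-class set object means the solver can constrain it like any other set.

```python
def is_partial_function(pairs):
    """No domain element may map to two different images."""
    mapping = {}
    for x, y in pairs:
        if x in mapping and mapping[x] != y:
            return False
        mapping[x] = y
    return True

def apply_at(pairs, x):
    """Application; undefined points raise KeyError."""
    return dict(pairs)[x]

f = {(1, 'a'), (2, 'b')}
print(is_partial_function(f))               # True
print(apply_at(f, 2))                       # 'b'
print(is_partial_function(f | {(1, 'c')}))  # False: 1 has two images
```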
25

TOURVILLE, NICHOLAS, and ROY T. COOK. "EMBRACING THE TECHNICALITIES: EXPRESSIVE COMPLETENESS AND REVENGE." Review of Symbolic Logic 9, no. 2 (April 11, 2016): 325–58. http://dx.doi.org/10.1017/s175502031600006x.

Abstract:
The Revenge Problem threatens every approach to the semantic paradoxes that proceeds by introducing nonclassical semantic values. Given any such collection Δ of additional semantic values, one can construct a Revenge sentence: “This sentence is either false or has a value in Δ.” The Embracing Revenge view, developed independently by Roy T. Cook and Philippe Schlenker, addresses this problem by suggesting that the class of nonclassical semantic values is indefinitely extensible, with each successive Revenge sentence introducing a new ‘pathological’ semantic value into the discourse. The view is explicitly motivated in terms of the idea that every notion that seems to be expressible (e.g., “has a value in Δ”, for any definite collection of semantic values Δ) should, if at all possible, be expressible. Extant work on the Embracing Revenge view has failed to live up to this promise, since the formal languages developed within such work are expressively impoverished. We rectify this here by developing a much richer formal language, and semantics for that language, and we then prove an extremely powerful expressive completeness result for the system in question.
26

Akhmedova, L. Sh, A. A. Magomedova, and R. T. Radjabova. "Application of logical and mathematical methods for the analysis of environmental information." South of Russia: ecology, development 17, no. 4 (December 30, 2022): 206–11. http://dx.doi.org/10.18470/1992-1098-2022-4-206-211.

Abstract:
Aim. Evaluation of the possibilities of mathematical logic and logical-mathematical methods in describing complex natural systems in simple and clear constructions, as they act as a language, special research methods, and a source of ideas and concepts in natural science. Discussion. The article discusses the possibilities and advantages of logical and mathematical methods in the analysis of natural science information and, in particular, environmental data. It gives a comparative overview of logical and mathematical constructions in the formation of scientific thinking. The list of the most common mathematical and logical symbols for the formalised recording of complex ecological systems is provided, together with examples of the use of logical and mathematical formulas as formal and implicative statements in the brief recording of natural science information. Conclusion. The research conducted does not point to unambiguous advantages of using the symbols of mathematical logic over verbal presentation. However, since ecology deals with the description of extremely complex systems, including operations of formalised problem statement, the formalised structuring of ecosystems, the grouping of ecosystems according to measures of their similarity-difference or inclusion-intersection, and the classification of selected ecosystems in a certain specified group, the need for widespread application of logical and mathematical research is indisputable.
27

Khan, Wilayat, Farrukh Aslam Khan, Abdelouahid Derhab, and Adi Alhudhaif. "CoCEC: An Automatic Combinational Circuit Equivalence Checker Based on the Interactive Theorem Prover." Complexity 2021 (May 25, 2021): 1–12. http://dx.doi.org/10.1155/2021/5525539.

Abstract:
Checking the equivalence of two Boolean functions, or combinational circuits modeled as Boolean functions, is often desired when reliable and correct hardware components are required. The most common approaches to equivalence checking are based on simulation and model checking, which are constrained due to the well-known memory and state explosion problems. Furthermore, such tools are often not user-friendly, thereby making it tedious to check the equivalence of large formulas or circuits. An alternative is to use mathematical tools, called interactive theorem provers, to prove the equivalence of two circuits; however, this requires human effort and expertise to write multiple output functions and carry out interactive proof of their equivalence. In this paper, we (1) define two simple, one formal and the other informal, gate-level hardware description languages, (2) design and develop a formal automatic combinational circuit equivalence checker (CoCEC) tool, and (3) test and evaluate our tool. The tool CoCEC is based on the human-assisted theorem prover Coq, yet it checks the equivalence of circuit descriptions purely automatically through a human-friendly user interface. It either returns a machine-readable proof (term) of the circuits' equivalence or a counterexample witnessing their inequivalence. The interface enables users to enter or load two circuit descriptions written in an easy and natural style. It automatically proves, in a few seconds, the equivalence of circuits with as many as 45 variables (3.5 × 10^13 states). CoCEC has a mathematical foundation, and it is reliable, quick, and easy to use. The tool is intended to be used by digital logic circuit designers, logicians, students, and faculty during the digital logic design course.
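The underlying check is easy to state, even though CoCEC's contribution is producing a machine-checkable Coq proof rather than enumerating states. A brute-force sketch (ours) of combinational equivalence:

```python
from itertools import product

def equivalent(f, g, n_inputs):
    """Compare two n-input Boolean functions on all 2**n inputs.
    Returns (True, None) or (False, counterexample)."""
    for inputs in product([False, True], repeat=n_inputs):
        if f(*inputs) != g(*inputs):
            return False, inputs
    return True, None

def xor_direct(a, b):
    return a != b

def xor_nand(a, b):
    """XOR built from NAND gates only."""
    nand = lambda x, y: not (x and y)
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

print(equivalent(xor_direct, xor_nand, 2))  # (True, None)
```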
28

EKONG, JOSEPH T. "A Ratiocinative Study and Assessment of W. V. O. Quine’s “Criterion of Ontological Commitment”." International Journal of Philosophy 1, no. 1 (October 7, 2022): 41–58. http://dx.doi.org/10.47941/ijp.1052.

Abstract:
Purpose: This work has three main objectives: Firstly, it offers an elucidation of the notion of ontological commitment. Secondly, it assesses the adequacy of the criterion of ontological commitment for different languages. Thirdly, it offers some speculative and evaluative remarks regarding the significance of Quine’s criterion of ontological commitment. Many ontologists within the analytic tradition often appeal to Quine's criterion of ontological commitment when debating whether an assertion or theory implies the existence of a certain entity. Regarding his goal in formulating this criterion, he says that the criterion does not aim to help us discover what it is that there is, but only what a theory says there is: “I look to variables and quantification for evidence as to what a theory says that there is, not for evidence as to what there is” (Quine, 1960: 225). Its most popular formulation, using textual evidence from Quine's oeuvre, is: “To be is to be the value of a bound variable” (Quine, 1961: 15). However, this formulation is susceptible to gross misunderstanding, especially if one is influenced by the formalities and technical maneuvers of model theory. In mathematical logic, model theory is the study of the relationship between formal theories (collections of sentences in a formal language expressing statements about a mathematical structure) and their models (those structures in which the statements of the theory hold); it studies mathematical structures by considering the first-order sentences true in them and the sets definable by first-order formulas, and it offers precise definitions of truth, logical truth and consequence, meanings and modalities. Methodology: This work is expository, analytic, critical and evaluative in its methodology. Of course, there are familiar philosophical problems which are within the discursive framework of ‘ontology,’ often phrased by asking if something or some category of things are “real,” or whether “they exist,” concretely. An outstanding example is provided by the traditional problem of universals, which issues in the nominalist-realist controversy as to the real existence of universals, or of abstract entities such as classes (in the mathematical sense) or propositions (in the abstract sense, referring to the content of an assertion in abstraction from the particular words used to convey it). Results: Inasmuch as one might agree with Quine’s Criterion of Ontological Commitment, one might also opine that it is nonetheless a feature of first-order language (i.e. the language embodied in first-order logic: a symbolized reasoning process comprising relations, functions and constants, in which each sentence or statement is broken down into a subject and a predicate, the predicate modifying or defining the properties of the subject) that there should be an exact correspondence between the ontological commitments carried by a sentence and the objects that must be counted among the values of the variables in order for the sentence to be true. However, this in itself is not a reason for thinking that such a feature will generalize beyond first-order languages. It is possible for Quine’s Criterion to degenerate when the language contains atomic predicates expressing extrinsic properties.
Unique Contribution to theory, practice and policy: Based on Quine’s analysis, a theory is committed to those and only those entities that in the last analysis serve as the values of its bound variables. Thus, ordinary first-order theory commits one to an ontology only of individuals (particulars), whereas higher-order logic commits one to the existence of sets, i.e. of collections of definite and distinct entities (or, alternatively, of properties and relations). Likewise, if bound first-order variables are assumed to range over sets (as they do in set theory), a commitment to the existence of these sets is incurred. Admittedly, the precise import of Quine’s criterion of ontological commitment is not completely clear, nor is it clear in what other sense one is perhaps committed by a theory to those entities that are named or otherwise referred to in it, but not quantified over in it. However, despite its limitations, it has made it possible to measure the ontological cost of theories, an important component in deciding which theories to accept, thus offering a partial foundation for theory choice.
29

Shiyan, Taras A. "On the problem of describing semantic structures and semantic activity in formal mathematics and logic." Philosophy Journal 16, no. 2 (2023): 26–32. http://dx.doi.org/10.21146/2072-0726-2023-16-2-26-32.

Abstract:
The text considers the impossibility of abstracting away from the sense of formal constructions in logical and mathematical research. The application of the “formal methodology” is admissible only after some system of conventional notations and agreements has been accepted. The context determined by such agreements is called formal. A correlation of constructions and results obtained by formal methods within several formal contexts is impossible without a consideration of the various semantic aspects of the correlated formal constructions. The author calls such correlations intercontextual. The paper examines two examples of such intercontextual comparisons to demonstrate the necessity of taking into account different semantic components of the compared formal constructions. In the context of these conclusions, the author raises the question of the structure and origin of some senses of the “symbols” used in formal constructions and of the “sequences of symbols” constructed from them. The author identifies three main sources of the semantic load carried by formal constructions. Firstly, these are the various aspects of semiotic usage: first of all, the general cultural and general professional semiotic skills of the “interpreter”. Secondly, it is the sense given to formal constructions by verbal comments, descriptions of the construction process and the associated knowledge of the “interpreter”. Thirdly, these are the senses set by the formal constructions themselves: at the stage of defining a formal language and at the stage of constructing a formal deductive or semantic system. The author also considers the fallacy of the assumption of the existence of some universal “global intuition” associated with the very possibility of formal methodology.
30

ШКАРБАН, Інна. "LINGUISTIC ASPECT OF MODALITY IN MODERN MATH DISCOURSE IN ENGLISH." Проблеми гуманітарних наук. Серія Філологія, no. 49 (June 8, 2022): 231–36. http://dx.doi.org/10.24919/2522-4565.2022.49.33.

Abstract:
The article reveals the linguistic aspect of modality in modern math discourse in English, critically outlining a number of current problematic issues in the area, such as the distinction between epistemic modality and evidentiality, marked by the philosophical grounding of formal logic. A general review of previous scholarly work on math modality shows that it is largely based on propositional aspects of meaning. Math text corpus analysis aims to extract a set of modalities that are indispensable for formulating modal deductive reasoning. However, from a linguistic perspective, academic math discourse requires natural language premise selection in the processes of mathematical reasoning and argumentation. It is presumed that two different self-attention cognition layers operate at the same time: one is focused on proper classical symbolic logic and mathematical elements (formal language), while the other attends to natural language. Defining the semantic meanings of math discourse modality markers involves an interpretation phase. Thus, objectivity is generally associated with evidential adverbs, which are markers of evidence verification concerning the speaker’s assessment of the truth value of the proposition. Modal auxiliaries of high, medium and low modality, semi-modal verbs and conditionals involve ascribing a justification value in the set of possible logical inferences. The formal logical structure of mathematical reasoning explains the non-intuitive possibility of a deductive proof. It has been grounded that the linguistic category of modality in math discourse presupposes the universal truth of knowledge, a high level of logical formalization in propositional verification, and the formulaic nature of argumentation, i.e. a synthesis of hypothetical preconditions, theoretical knowledge and subjectivity of reasoning leading to the verification of a new hypothesis and visual exemplification of the empirical deductive processes, in particular by linguistic means of modality expression.
31

Harnik, Victor, and Michael Makkai. "Lambek's categorical proof theory and Läuchli's abstract realizability." Journal of Symbolic Logic 57, no. 1 (March 1992): 200–230. http://dx.doi.org/10.2307/2275186.

Abstract:
In this paper we give an introduction to categorical proof theory, and reinterpret, with improvements, Läuchli's work on abstract realizability restricted to propositional logic (but see [M1] for predicate logic). Partly to make some points of a foundational nature, we have included a substantial amount of background material. As a result, the paper is (we hope) readable with a knowledge of just the rudiments of category theory, the notions of category, functor, natural transformation, and the like. We start with an extended introduction giving the background, and stating what we do with a minimum of technicalities. In three publications [L1, 2, 3] published in the years 1968, 1969 and 1972, J. Lambek gave a categorical formulation of the notion of formal proof in deductive systems in certain propositional calculi. The theory is also described in the recent book [LS]. See also [Sz]. The basic motivation behind Lambek's theory was to place proof theory in the framework of modern abstract mathematics. The spirit of the latter, at least for the purposes of the present discussion, is to organize mathematical objects into mathematical structures. The specific kind of structure we will be concerned with is that of a category. In Lambek's theory, one starts with an arbitrary theory in any one of several propositional calculi. One has the (formal) proofs (deductions) in the given theory of entailments A ⇒ B, with A and B arbitrary formulas. One introduces an equivalence relation on proofs under which, in particular, equivalent proofs are proofs of the same entailment; equivalence of proofs is intended to capture the idea of the proofs being only inessentially different. One forms a category whose objects are the formulas of the underlying language of the theory, and whose arrows from A to B, with the latter arbitrary formulas, are the equivalence classes of formal proofs of A ⇒ B.
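Schematically, the construction assigns to each propositional theory a category (a sketch of the setup described above, not Lambek's full definitions):

```latex
\[
\mathrm{Ob} = \{\text{formulas}\},\qquad
\mathrm{Hom}(A, B) = \{\text{proofs of } A \Rightarrow B\}\,/\sim
\]
```

Here the relation identifies inessentially different proofs; identities are the trivial proofs of A ⇒ A, and composition chains a proof of A ⇒ B with one of B ⇒ C.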
APA, Harvard, Vancouver, ISO, and other styles
32

VENNEKENS, JOOST, MARC DENECKER, and MAURICE BRUYNOOGHE. "CP-logic: A language of causal probabilistic events and its relation to logic programming." Theory and Practice of Logic Programming 9, no. 3 (May 2009): 245–308. http://dx.doi.org/10.1017/s1471068409003767.

Full text
Abstract:
This paper develops a logical language for representing probabilistic causal laws. Our interest in such a language is two-fold. First, it can be motivated as a fundamental study of the representation of causal knowledge. Causality has an inherent dynamic aspect, which has been studied at the semantical level by Shafer in his framework of probability trees. In such a dynamic context, where the evolution of a domain over time is considered, the idea of a causal law as something which guides this evolution is quite natural. In our formalization, a set of probabilistic causal laws can be used to represent a class of probability trees in a concise, flexible and modular way. In this way, our work extends Shafer's by offering a convenient logical representation for his semantical objects. Second, this language also has relevance for the area of probabilistic logic programming. In particular, we prove that the formal semantics of a theory in our language can be equivalently defined as a probability distribution over the well-founded models of certain logic programs, rendering it formally quite similar to existing languages such as ICL or PRISM. Because we can motivate and explain our language in a completely self-contained way as a representation of probabilistic causal laws, this provides a new way of explaining the intuitions behind such probabilistic logic programs: we can say precisely which knowledge such a program expresses, in terms that are equally understandable by a non-logician. Moreover, we also obtain an additional piece of knowledge representation methodology for probabilistic logic programs, by showing how they can express probabilistic causal laws.
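To make the probability-tree reading concrete, here is a hand-rolled Python toy; the guard encoding, the example laws and the fixed cause-before-effect ordering are assumptions of this sketch, not the paper's formal semantics.

```python
from itertools import product

# Each causal law is (probability, effect, guard): if the guard already
# holds, the law fires with the given probability and adds its effect.
# Laws are listed so that causes precede their effects.
laws = [
    (0.10, "burglary", set()),
    (0.30, "storm", set()),
    (0.90, "alarm", {"burglary"}),
    (0.20, "alarm", {"storm"}),
]

def distribution(laws):
    """Enumerate every branch of the probability tree."""
    dist = {}
    for fired_vector in product([True, False], repeat=len(laws)):
        prob, state = 1.0, set()
        for (p, effect, guard), fired in zip(laws, fired_vector):
            prob *= p if fired else 1.0 - p
            if fired and guard <= state:
                state.add(effect)
        key = frozenset(state)
        dist[key] = dist.get(key, 0.0) + prob
    return dist

d = distribution(laws)
print(sum(pr for s, pr in d.items() if "alarm" in s))  # ~0.1446 (noisy-or)
```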
APA, Harvard, Vancouver, ISO, and other styles
33

Ouazar, F., M. C. Boukala, and M. Ioualalen. "Business Process Modeled with BPMN and CTL Model Checking." International Journal on Cybernetics & Informatics 12, no. 5 (August 12, 2023): 157–69. http://dx.doi.org/10.5121/ijci.2023.120513.

Full text
Abstract:
Despite the richness of the BPMN language and its advantages for specifying business processes, it remains a semi-formal language: it does not allow rigorous verification of the specifications produced with it and offers no methodological support for the verification phase. Several works have therefore been proposed with the aim of describing the semantics of the BPMN language in a mathematical formalism. In this paper we address the issue of verifying BPMN models with an approach based on model checking, focusing on soundness, fairness, and safety properties. Given a business process modeled in BPMN, a formal semantics for BPMN models based on a Kripke structure is provided for a formal verification of correctness. The properties are expressed as CTL (Computation Tree Logic) formulas, and the model checker NuSMV is then used to verify them.
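For flavour, a toy fixpoint computation of the CTL operator EF over a hypothetical three-state Kripke structure; this sketch is not the paper's BPMN semantics or its NuSMV encoding.

```python
# Hypothetical Kripke structure: states, transitions, atomic labels.
states = {"start", "task", "end"}
trans = {"start": {"task"}, "task": {"task", "end"}, "end": {"end"}}
labels = {"start": set(), "task": {"busy"}, "end": {"done"}}

def sat_EF(prop):
    """States satisfying EF prop: least fixpoint of backward reachability."""
    sat = {s for s in states if prop in labels[s]}
    changed = True
    while changed:
        changed = False
        for s in states - sat:
            if trans[s] & sat:   # some successor already satisfies prop
                sat.add(s)
                changed = True
    return sat

# A soundness-style check: from 'start', completion is reachable.
print("start" in sat_EF("done"))   # True

# AG p reduces to the complement of EF(not p); a model checker such as
# NuSMV evaluates these fixpoints symbolically rather than by the
# explicit enumeration used here.
```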
APA, Harvard, Vancouver, ISO, and other styles
34

Kozyriev, Andrii, and Ihor Shubin. "The method of linear-logical operators and logical equations in information extraction tasks." INNOVATIVE TECHNOLOGIES AND SCIENTIFIC SOLUTIONS FOR INDUSTRIES, no. 1 (27) (July 2, 2024): 81–95. http://dx.doi.org/10.30837/itssi.2024.27.081.

Full text
Abstract:
Relational and logical methods of knowledge representation play a key role in creating a mathematical basis for information systems. Predicate algebra and predicate operators are among the most effective tools for describing information in detail: they make it easy to formulate formalized information, create database queries, and simulate human activity. The growing need for reliable and efficient data selection raises the problem of deeper analysis. The subject of the study is the theory of linear logical equations based on the algebra of linear predicate operations, the formal apparatus of linear logical operators, and methods for solving logical equations in information extraction tasks. The aim of the study is to develop a method for using linear logical operators and logical equations to extract information; this approach can significantly optimize the process of extracting the necessary information, even from huge databases. The main tasks are: analysis of existing approaches to information extraction; consideration of the theory of linear logical operators; study of methods for reducing logic to an algebraic form; and analysis of logical spaces, the algebra of finite predicates, and the theory of linear logical operators. The research methods involve a systematic analysis of the mathematical structure of the algebra of finite predicates and predicate functions to identify the key elements that affect the query-formation process. A method of using linear logical operators and logical equations for information extraction is proposed. The results of the study showed that this method is a universal and adaptive tool for working with algebraic data structures: it can be applied in a wide range of information extraction tasks and proves its value as one of the possible methods of information processing. Conclusion. The paper investigates formal methods of intelligent systems, in particular ways of representing knowledge in accordance with the peculiarities of the field of application and a language that allows encoding this knowledge for storage in computer memory. The proposed method can be implemented in the development of language interfaces for automated information-access systems, in search-engine algorithms, for logical analysis of information in databases and expert systems, and in tasks of object recognition and classification.
APA, Harvard, Vancouver, ISO, and other styles
35

Ivanov, Ievgen, Mykola Nikitchenko, Andrii Kryvolap, and Artur Korniłowicz. "Simple-Named Complex-Valued Nominative Data – Definition and Basic Operations." Formalized Mathematics 25, no. 3 (October 1, 2017): 205–16. http://dx.doi.org/10.1515/forma-2017-0020.

Full text
Abstract:
Summary In this paper we give a formal definition of the notion of nominative data with simple names and complex values [15, 16, 19] and formal definitions of the basic operations on such data, including naming, denaming and overlapping, following the work [19]. The notion of nominative data plays an important role in the composition-nominative approach to program formalization [15, 16], which is a development of composition programming [18]. Both approaches are compared in [14]. The composition-nominative approach considers mathematical models of computer software and data on various levels of abstraction and generality and provides mathematical tools for reasoning about their properties. In particular, nominative data are mathematical models of data which are stored and processed in computer systems. The composition-nominative approach considers different types [14, 19] of nominative data, but all of them are based on the name-value relation. One powerful type of nominative data, suitable for representing many kinds of data commonly used in programming, like lists, multidimensional arrays, trees, tables, etc., is the type of nominative data with simple (abstract) names and complex (structured) values. The set of nominative data of a given type, together with a number of basic operations on them like naming, denaming and overlapping [19], forms an algebra which is called a data algebra. In the composition-nominative approach, computer programs which process data are modeled as partial functions which map nominative data from the carrier of a given data algebra (input data) to nominative data (output data). Such functions are also called binominative functions. Programs which evaluate conditions are modeled as partial predicates on nominative data (nominative predicates). Programming language constructs like sequential execution, branching, cycle, etc., which construct programs from existing programs, are modeled as operations which take binominative functions and predicates and produce binominative functions. Such operations are called compositions. A set of binominative functions and a set of predicates, together with appropriate compositions, form an algebra which is called a program algebra. This algebra serves as a semantic model of a programming language. For functions over nominative data a special kind of computability, called abstract computability, is introduced and complete classes of computable functions are specified [16]. For reasoning about properties of programs modeled as binominative functions, a Floyd-Hoare style logic [1, 2] is introduced and applied [12, 13, 8, 11, 9, 10]. One advantage of this approach to reasoning about programs is that it naturally handles programs which process complex data structures (which can be quite straightforwardly represented as nominative data). Also, unlike classical Floyd-Hoare logic, the mentioned logic allows reasoning about assertions which include partial pre- and post-conditions [11]. Besides modeling data processed by programs, nominative data can also be applied to modeling data processed by signal processing systems in the context of mathematical systems theory [4, 6, 7, 5, 3].
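A rough Python reading of the basic operations, with simple-named, complex-valued nominative data modelled as nested dictionaries; the right-wins, top-level semantics chosen for overlapping below is an assumption of this sketch, not necessarily the operation as defined in the paper.

```python
def naming(v, d):
    """Naming: wrap the datum d under the name v."""
    return {v: d}

def denaming(v, d):
    """Denaming: extract the component named v; a partial operation."""
    if not isinstance(d, dict) or v not in d:
        raise KeyError(f"no component named {v!r}")
    return d[v]

def overlapping(d1, d2):
    """Overlapping: d1 updated by the top-level components of d2."""
    out = dict(d1)
    out.update(d2)
    return out

# Complex values let nominative data encode records, tables, trees, ...
point = overlapping(naming("x", 1), naming("y", 2))   # {'x': 1, 'y': 2}
row = naming("p", point)
print(denaming("x", denaming("p", row)))              # 1
```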
APA, Harvard, Vancouver, ISO, and other styles
36

Blikle, Andrzej. "Three-Valued Predicates for Software Specification and Validation." Fundamenta Informaticae 14, no. 4 (April 1, 1991): 387–410. http://dx.doi.org/10.3233/fi-1991-14402.

Full text
Abstract:
Partial functions, hence also partial predicates, cannot be avoided in algorithms. However, in spite of the fact that partial functions were formally introduced into the theory of software very early, partial predicates are still not commonly recognized. In many programming and software-specification languages, partial Boolean expressions are treated in a rather simplistic way: the evaluation of a Boolean sub-expression to an error leads to the evaluation of the hosting Boolean expression to an error and, in consequence, to the abortion of the whole program. This technique is known as eager evaluation of expressions. A more practical approach to the evaluation of expressions – gaining more interest today among both theoreticians and programming-language designers – is lazy evaluation. Lazily evaluated Boolean expressions correspond to (non-strict) three-valued predicates where the third value represents both an error and undefinedness. On the semantic ground this leads to a three-valued propositional calculus, three-valued quantifiers and an appropriate logic. This paper is a survey-essay devoted to the discussion and comparison of a few three-valued propositional and predicate calculi, and to the discussion of the author's claim that a two-valued logic, rather than a three-valued logic, is suitable for the treatment of programs with three-valued Boolean expressions. The paper is written in a formal but not in a formalized style. All discussion is carried out on a semantic ground: we talk about predicates (functions) and a semantic consequence relation rather than about expressions and inference rules. However, the paper is followed by more formalized works which carry our discussion further on a formalized ground, and where corresponding formal logics are constructed and discussed.
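For concreteness, a minimal sketch of the strong Kleene connectives with None playing the role of the third value (error/undefinedness); this is one standard calculus among those the paper compares, not a summary of its argument.

```python
def not3(a):
    """Three-valued negation: the third value (None) is absorbing."""
    return None if a is None else (not a)

def and3(a, b):
    """Non-strict conjunction: False wins even against an error."""
    if a is False or b is False:
        return False
    return None if (a is None or b is None) else True

def or3(a, b):
    """Non-strict disjunction: True wins even against an error."""
    if a is True or b is True:
        return True
    return None if (a is None or b is None) else False

# Lazy evaluation of `x != 0 and 1/x > 1` never aborts:
x = 0
guard = and3(x != 0, None if x == 0 else 1 / x > 1)
print(guard)   # False -- the error in the right operand is masked
```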
APA, Harvard, Vancouver, ISO, and other styles
37

Triantafyllidis, Charalampos P., and Lazaros G. Papageorgiou. "An integrated platform for intuitive mathematical programming modeling using LaTeX." PeerJ Computer Science 4 (September 10, 2018): e161. http://dx.doi.org/10.7717/peerj-cs.161.

Full text
Abstract:
This paper presents a novel prototype platform that uses the LaTeX mark-up language, commonly used to typeset mathematical content, as an input language for modeling optimization problems of various classes. The platform converts the LaTeX model into a formal Algebraic Modeling Language (AML) representation based on Pyomo through a parsing engine written in Python, and solves it either via the NEOS server or via locally installed solvers, using a friendly Graphical User Interface (GUI). The distinct advantages of our approach can be summarized as: (i) simplification and speed-up of the model design and development process, (ii) non-commercial character, (iii) cross-platform support, (iv) easier detection of typos and logic errors in the description of the models, and (v) minimization of the working knowledge of programming and AMLs needed to perform mathematical programming modeling. Overall, this is a presentation of a complete workable scheme for using LaTeX for mathematical programming modeling, which assists in furthering our ability to reproduce and replicate scientific work.
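As an illustration of the target representation, here is hand-written Pyomo for the LaTeX model max 3x + 2y subject to x + y <= 4 with x, y >= 0; this is a sketch of the kind of AML code such a platform would emit, not the parser's actual output.

```python
from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                           NonNegativeReals, maximize, SolverFactory)

model = ConcreteModel()
model.x = Var(domain=NonNegativeReals)
model.y = Var(domain=NonNegativeReals)
model.obj = Objective(expr=3 * model.x + 2 * model.y, sense=maximize)
model.c1 = Constraint(expr=model.x + model.y <= 4)

# Solving requires a locally installed solver, or the NEOS server that
# the platform also supports:
# SolverFactory("glpk").solve(model)
```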
APA, Harvard, Vancouver, ISO, and other styles
38

Ovsyak, V. K., O. V. Ovsyak, and J. V. Petruszka. "ORDER AND ORDERING IN DISCRETE MATHEMATICS AND INFORMATICS." Ukrainian Journal of Information Technology 3, no. 1 (2021): 37–43. http://dx.doi.org/10.23939/ujit2021.03.037.

Full text
Abstract:
The available means of ordering and sorting in some important areas of discrete mathematics and computer science are studied, namely in set theory, classical mathematical logic, proof theory, graph theory, the Post method, the system of algorithmic algebras, and algorithmic languages of object-oriented and assembly programming. The Cartesian product of sets, ordered pairs and ordered n-tuples are presented, together with the set-theoretic descriptions of an ordered pair given by Wiener, Hausdorff and Kuratowski. The requirements on relations that order sets are described. The importance of ordering in classical mathematical logic and proof theory is illustrated by examples of calculating the truth values of logical formulas and of the formal derivation of a formula on the basis of inference rules and substitution rules. Ordering in graph theory is shown by the example of a block diagram of the Euclidean algorithm, designed to find the greatest common divisor of two natural numbers (see the Python sketch below). The ordering and sorting of instructions formed by two, three and four ordered fields, and the ordering of instructions in a Post-method program, are described. It is shown that such a program is formed by numbered instructions with unique instruction numbers and contains a single instruction with number 1. The means of the system of algorithmic algebras, which are used to perform ordering and sorting in algorithm theory, are illustrated. The operations of the system of algorithmic algebras are presented, which include Boolean algebra operations generalized to a three-digit alphabet and the operator operations of an operator algebra. The properties of the composition operation, which is intended to describe the ordering of the operators of the operator algebra in the system of algorithmic algebras, are described. The orderings executed by means of algorithmic programming languages are demonstrated by a hypothetical application in the modern object-oriented programming language C#: a program must contain exactly one Main() method, from which execution begins, and an ARM microprocessor assembly program must have exactly one ENTRY directive, from which execution begins.
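For illustration, the Euclidean algorithm from the abstract's block-diagram example in a few lines of Python; the strict ordering of the update steps is precisely what the studied notion of ordering captures.

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor of two natural numbers."""
    while b != 0:
        a, b = b, a % b   # the order of these two updates is essential
    return a

print(gcd(48, 18))   # 6
```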
APA, Harvard, Vancouver, ISO, and other styles
39

Borisov, Evgeny. "Analytic Philosophy." Philosophical anthropology 7, no. 1 (2021): 143–67. http://dx.doi.org/10.21146/2414-3715-2021-7-1-143-167.

Full text
Abstract:
The paper provides an overview of the most fundamental ideas of analytic philosophy throughout its history, from the beginning of the 20th century up to now. The history of analytic philosophy is divided into two stages – the early and the contemporary one. The main distinguishing features of early analytic philosophy are the use of mathematical logic as a tool for stating and solving philosophical problems, and a critical attitude toward 'metaphysics', i.e., traditional and contemporary non-analytic philosophical theories. The genesis of analytic philosophy was closely related to the revolution in logic that led to the rise of mathematical logic, and it is no coincidence that some founders of the analytic tradition (first of all Frege, Russell, and Carnap) were also prominent logicians. (There were also authors and schools within early analytic philosophy whose research was based on less formal tools, such as classical logic and linguistic methods of analysing language; ordinary language philosophy is an example of this type of philosophy.) Using the new logic as a philosophical tool led to a huge number of new ideas and generated a new type of philosophical criticism that was implemented in a number of projects of 'overcoming metaphysics'. These features constituted the methodological and thematic profile of early analytic philosophy. In contrast to the early stage, contemporary analytic philosophy cannot be characterized by a prevailing method or a set of main research topics; its characteristic features are rather of a historical, institutional, and stylistic nature. In the paper, early analytic philosophy is represented by Frege, Russell, the early Wittgenstein, the Vienna Circle (Schlick, Carnap, etc.), and ordinary language philosophy (the later Wittgenstein, Ryle, Austin, and Searle). Contemporary analytic philosophy is represented by Quine and the direct reference theory in the philosophy of language (Kripke, Donnellan, Kaplan, and Putnam).
APA, Harvard, Vancouver, ISO, and other styles
40

Gurjanov, A. V., D. A. Zakoldaev, I. O. Zharinov, and O. O. Zharinov. "The Industry 4.0 technological and information processes cyber-modelling." Journal of Physics: Conference Series 2094, no. 4 (November 1, 2021): 042062. http://dx.doi.org/10.1088/1742-6596/2094/4/042062.

Full text
Abstract:
Cyber-modelling is the process of simulating information models that describe, in mathematical and formal-logic languages (phenomenon models), how the interaction mechanisms of cyber-physical systems are combined with different control laws and parameter values. The complexity of the equations represented at the different levels of the cyber-physical production system hierarchy, and of the inequations of algebraic, logical, end-subtraction, vector and matrix form in discrete and continuous time, is captured by an aggregated number in the control loop of the industrial automation elements. Cyber-modelling is performed for static and dynamic processes and for equipment states monitored in a virtual environment that records the technological data actual in a given time interval. It is carried out on integrated computing systems running in parallel with the physical production processes of item manufacturing. Because model time runs faster than the physical processes, it is possible to predict the corrections that modify the control signals and phase variables of the cyber-physical systems united in an assembly conveyor. The advantage of cyber-modelling is an expanded number of cycles for optimizing the technological processes, which are computed by integrated computing systems using the method of successive approximation. The authors describe the cyber-modelling technology and propose information models based on phenomenon descriptions of cyber-physical production processes in terms of general control theory, computation and communication for hierarchical control structures.
APA, Harvard, Vancouver, ISO, and other styles
41

Метешкін, Костянтин Олександрович, and Максим Анатолійович Кухар. "Аналіз можливості формалізації земельних відносин" [Analysis of the possibility of formalizing land relations]. Radioelectronic and Computer Systems, no. 2 (February 10, 2017): 33–37. http://dx.doi.org/10.32620/reks.2017.2.05.

Full text
Abstract:
This article gives examples of formally representing certain categories of land relations in set-theoretic language, in the language of the predicate calculus, and in the language of category theory. The necessity of applying mathematical methods to formalize land relations in Ukraine is explained. The resulting formalism will in the future serve as the foundation for a decision-support system for resolving problems in the field of land management. The need for such a system arises from the imperfection of the contemporary Ukrainian system of land relations, whose framework does not allow the implementation of the rules and regulations of the land legislation of Ukraine.
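By way of illustration only, a land-relation constraint of the kind the article advocates could be written in the language of predicate calculus as follows; the predicates are hypothetical, not taken from the paper.

```latex
% Hypothetical constraint: every parcel has exactly one registered owner.
\forall p \,\bigl( \mathrm{Parcel}(p) \rightarrow
  \exists x \,\bigl( \mathrm{Owns}(x, p) \wedge
  \forall y \,( \mathrm{Owns}(y, p) \rightarrow y = x ) \bigr) \bigr)
```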
APA, Harvard, Vancouver, ISO, and other styles
42

Gazzari, René. "Formal Theories of Occurrences and Substitutions." Bulletin of Symbolic Logic 28, no. 2 (June 2022): 261–63. http://dx.doi.org/10.1017/bsl.2021.53.

Full text
Abstract:
Gazzari provides a mathematical theory of occurrences and of substitutions, which are a generalisation of occurrences constituting substitution functions. The dissertation focusses on term occurrences in terms of a first-order language, but the methods and results obtained there can easily be carried over to arbitrary kinds of occurrences in arbitrary kinds of languages. The aim of the dissertation is twofold: first, Gazzari intends to provide an adequate formal representation of philosophically relevant concepts (not only of occurrences and substitutions, but also of substitution functions, of calculations, as well as of intuitively given properties of the discussed entities) and to improve in this way our understanding of these concepts; second, he intends to provide a formal exploration of the introduced concepts, including the detailed development of the methods needed for their adequate treatment. The dissertation serves as a methodological fundament for consecutive research on topics demanding a precise treatment of occurrences and as a foundation for all scientific work dealing with occurrences only informally; the formal investigations are complemented by a brief survey of the development of the notion of occurrences in mathematics, philosophy and computer science.

The notion of occurrences. Occurrences are determined by three aspects: an occurrence is always an occurrence of a syntactic entity (its shape) in a syntactic entity (its context) at a specific position. Context and shape can be any meaningful combination of well-known syntactic entities such as, in logic, terms, formulae or formula trees. Gazzari's crucial idea is to represent the position of occurrences by nominal forms, essentially as introduced by Schütte [2]. The nominal forms are a generalisation of standard syntactic entities in which so-called nominal symbols $*_k$ may occur. The position of an occurrence is obtained by eliminating the intended shape in the context, which means replacing the intended shape by suitable nominal symbols.

Standard occurrences. The central tool of the theory of nominal terms (nominal forms generalising standard terms) is the general substitution function mapping a nominal term $\mathtt{t}$ and a sequence $\vec{\mathtt{t}}$ of them to the result $\mathtt{t}[\vec{\mathtt{t}}]$ of replacing simultaneously the nominal symbols $*_k$ in the first argument by the respective entries $\mathtt{t}_k$ of the second argument. A triple $\mathfrak{o}=\langle t,s,\mathtt{t}\rangle$ is a standard occurrence if an application of the general substitution function to the position $\mathtt{t}$ and the shape $s$ results in the context $t$ of that occurrence. As $*_0$ can occur more than once in $\mathtt{t}$, arbitrarily many single occurrences in the context $t$ of the common shape $s$ can be subsumed in $\mathfrak{o}$. Gazzari illustrates the appropriateness of his approach by solving typical problems (counting formally the number of specific occurrences, deciding whether an occurrence lies within another) which are not solvable without a good theory of occurrences.

Multi-shape occurrences. The multi-shape occurrences $\mathfrak{o}=\langle t,\vec s,\mathtt{t}\rangle$ are the generalisation of standard occurrences, where the shape $\vec s$ is a sequence of standard terms. Such occurrences subsume arbitrary non-overlapping single occurrences in the context $t$. Gazzari addresses the non-trivial identity of such occurrences and their independence. The latter represents formally the idea of non-overlapping occurrences and is a far-reaching generalisation of disjointness as discussed by Huet with respect to single occurrences. Independent occurrences can be merged into one occurrence; an occurrence can be split up into independent occurrences.

Substitutions. A substitution $\mathbf s=\langle t,\vec s,\mathtt{t},\vec s\,',t'\rangle$ satisfies that both $\mathfrak{o}=\langle t,\vec s,\mathtt{t}\rangle$ and $\mathfrak{o}'=\langle t',\vec s\,',\mathtt{t}\rangle$ are occurrences such that the shapes have the same length. Such a substitution represents the replacement of $\vec s$ in $t$ at $\mathtt{t}$ by $\vec s\,'$, resulting in $t'$. This means that a substitution is understood as a process and not as a (specific type of a) function. Identity and independence are addressed again, using and extending the methods developed for occurrences; as before, independent substitutions can be merged, and substitutions can be split up into sequences of independent substitutions. Substitutions are used to represent formally calculations (as found in everyday mathematics) and to investigate them. Sets of substitutions turn out to be set-theoretic functions mapping the affected occurrences $\mathfrak{o}$ and the inserted shapes $\vec s\,'$ to the result $t'$ of a substitution $\mathbf s$. Such sets are called explicit substitution functions. In order to qualify functions which are usually understood as substitution functions (and which are not formulated in a theory of occurrences) as substitution functions, Gazzari develops the concept of an explication method transforming such functions into explicit substitution functions. The appropriateness and the (philosophical) limitations of this concept are illustrated with example functions.

Conclusion. Gazzari's theory of occurrences is strong (not restricted to single occurrences), canonical (nominal forms are a canonical generalisation of the underlying syntactic entities) and general (presupposing the grammar for the underlying syntactic entities, suitable nominal forms are easily defined and the theory of occurrences is immediately carried over). Another advantage is a kind of methodological pureness: positions are generalised syntactic entities (and not extraneous objects such as sequences of natural numbers) and can be treated, in particular, with the well-known methods developed for the underlying syntactic entities.

Abstract prepared by René Gazzari.
E-mail: rene.gazzari@uni-tuebingen.de
URL: http://doi.org/10.15496/publikation-47553
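A small Python sketch of nominal terms and the general substitution function; the tuple encoding of terms and the ("*", k) representation of the nominal symbols $*_k$ are choices made for this sketch, not Gazzari's notation.

```python
def subst(t, ts):
    """General substitution t[ts]: replace each nominal symbol *_k by ts[k]."""
    if isinstance(t, tuple) and t and t[0] == "*":
        return ts[t[1]]                                  # nominal symbol *_k
    if isinstance(t, tuple):                             # compound term
        return (t[0],) + tuple(subst(s, ts) for s in t[1:])
    return t                                             # constant / variable

# The occurrence of the shape c in the context f(c, g(c)) at position
# f(*_0, g(c)) picks out only the first c:
position = ("f", ("*", 0), ("g", "c"))
print(subst(position, ["c"]))    # ('f', 'c', ('g', 'c')) -- the context
```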
APA, Harvard, Vancouver, ISO, and other styles
43

Qiao, Zebo, and Jianjun Yin. "Fuzzy Deep Medical Diagnostic System: Gray Relation Framework and the Guiding Functionalities for the Professional Sports Club Social Responsibility." Journal of Medical Imaging and Health Informatics 10, no. 5 (May 1, 2020): 1084–90. http://dx.doi.org/10.1166/jmihi.2020.2891.

Full text
Abstract:
A fuzzy deep medical diagnostic system based on a grey-relation framework, with guiding functionalities for the social responsibility of professional sports clubs, is proposed in this paper. Medical high technology has two features, namely formal logic and mathematics: it uses formal logic to build its theoretical system, which requires that the principles of medical science and technology be defined clearly in concept and that the reasoning be rigorous and logical, while its mathematics requires the pursuit of precision and a mathematical language to reveal the internal relations between the presented images. For mild diseases with tissue damage, medical instruments often detect nothing, while the patient's own symptoms and feelings are often more accurate than the instrument. Inspired by this, this paper integrates a deep-learning model to construct the intelligent diagnostic system. Grey relational analysis is designed to improve the traditional CNN model, and the revised algorithms are combined with a sensitive-data-analysis framework. An application scenario concerning the social responsibility of professional sports clubs is also demonstrated. Experimental results prove the effectiveness of the designed system: the diagnostic accuracy reaches 98.38%, which outperforms the other state-of-the-art methodologies.
APA, Harvard, Vancouver, ISO, and other styles
44

DEAN, WALTER. "INCOMPLETENESS VIA PARADOX AND COMPLETENESS." Review of Symbolic Logic 13, no. 3 (May 23, 2019): 541–92. http://dx.doi.org/10.1017/s1755020319000212.

Full text
Abstract:
This paper explores the relationship borne by the traditional paradoxes of set theory and semantics to formal incompleteness phenomena. A central tool is the application of the Arithmetized Completeness Theorem to systems of second-order arithmetic and set theory in which various "paradoxical notions" for first-order languages can be formalized. I will first discuss the setting in which this result was originally presented by Hilbert & Bernays (1939) and also how it was later adapted by Kreisel (1950) and Wang (1955) in order to obtain formal undecidability results. A generalization of this method will then be presented whereby Russell's paradox, a variant of Mirimanoff's paradox, the Liar, and the Grelling–Nelson paradox may be uniformly transformed into incompleteness theorems. Some additional observations are then framed relating these results to the unification of the set theoretic and semantic paradoxes, the intensionality of arithmetization (in the sense of Feferman, 1960), and axiomatic theories of truth.
APA, Harvard, Vancouver, ISO, and other styles
45

Ruthrof, Horst. "On the Inscrutability of Logic in Certain Natural Language Contexts." Public Journal of Semiotics 4, no. 2 (February 1, 2013): 104–21. http://dx.doi.org/10.37693/pjos.2013.4.8844.

Full text
Abstract:
The paper opens by defining 'logical universality' as the retention of the propositional content of expressions under any enunciative circumstances. Universality in this sense, the paper claims, cannot be demonstrated in the same manner across different discursive domains and sign systems. Unlike in geometry, arithmetic, and algebraic and mathematical logic, where logical universality can be shown to be non-controversial, the concept of universality becomes problematic as soon as natural-language terms and syntax are employed. The paper shows that the main reasons for this difficulty lie in the extensional features of natural language, which cannot be adequately captured by intensional means. Intensional descriptions are claimed to apply only to semiotically homogeneous sign systems of a formal kind. Natural-language expressions, in contrast, are semiotically heterogeneous, or heterosemiotic, characterised as they are by quasi-perceptual ingredients. Nevertheless, the paper argues, there are three cases in which logical universality can be demonstrated to hold in spite of natural language being employed, one of which is strictly technical language. In contrast, culturally fully saturated natural-language use is shown to escape the constraints of logical universality as defined, on the grounds that some of its essential features, such as referential background, reference, and deixis, especially in its implicit form, effectively undermine the retention of identical propositional contents across cultures and time.
APA, Harvard, Vancouver, ISO, and other styles
46

Power, A. J., and Charles Wells. "A formalism for the specification of essentially-algebraic structures in 2-categories." Mathematical Structures in Computer Science 2, no. 1 (March 1992): 1–28. http://dx.doi.org/10.1017/s0960129500001110.

Full text
Abstract:
A type of higher-order two-dimensional sketch is defined which has models in suitable 2-categories. It has as special cases the ordinary sketches of Ehresmann and certain previously defined generalizations of one-dimensional sketches. These sketches allow the specification of constructions in 2-categories such as weighted limits, as well as higher-order constructions such as exponential objects and subobject classifiers, that cannot be sketched by limits and colimits. These sketches are designed to be the basis of a category-based methodology for the description of functional programming languages, complete with rewrite rules giving the operational semantics, that is independent of the usual specification methods based on formal languages and symbolic logic. A definition of ‘path grammar’, generalizing the usual notion of grammar, is given as a step towards this goal.
APA, Harvard, Vancouver, ISO, and other styles
47

Erbsen, Andres, Jade Philipoom, Dustin Jamner, Ashley Lin, Samuel Gruetter, Clément Pit-Claudel, and Adam Chlipala. "Foundational Integration Verification of a Cryptographic Server." Proceedings of the ACM on Programming Languages 8, PLDI (June 20, 2024): 1704–29. http://dx.doi.org/10.1145/3656446.

Full text
Abstract:
We present verification of a bare-metal server built using diverse implementation techniques and languages against a whole-system input-output specification in terms of machine code, network packets, and mathematical specifications of elliptic-curve cryptography. We used very different formal-reasoning techniques throughout the stack, ranging from computer algebra, symbolic execution, and verification-condition generation to interactive verification of functional programs including compilers for C-like and functional languages. All these component specifications and domain-specific reasoning techniques are defined and justified against common foundations in the Coq proof assistant. Connecting these components is a minimalistic specification style based on functional programs and assertions over simple objects, omnisemantics for program execution, and basic separation logic for memory layout. This design enables us to bring the components together in a top-level correctness theorem that can be audited without understanding or trusting the internal interfaces and tools. Our case study is a simple cryptographic server for flipping of a bit of state through public-key authenticated network messages, and its proof shows total functional correctness including static bounds on memory usage. This paper also describes our experiences with the specific verification tools we build upon, along with detailed analysis of reasons behind the widely varying levels of productivity we experienced between combinations of tools and tasks.
APA, Harvard, Vancouver, ISO, and other styles
48

Dolgorukov, Vitaly V., and Vera A. Shumilina. "What Is Formal Philosophy?" Epistemology & Philosophy of Science 58, no. 1 (2021): 235–41. http://dx.doi.org/10.5840/eps202158120.

Full text
Abstract:
The paper focuses on a review of the current literature on formal philosophy. Special attention is paid to the review of the book «Introduction to Formal Philosophy» [Hansson, Hendricks, 2018]. The book is a consistent introduction to the problems of formal philosophy, a research tradition that relies on precise mathematical tools to study traditional philosophical problems. The methods of formal philosophy are successfully applied not only to problems of ontology, epistemology and the philosophy of language but are also relevant to problems of ethics, axiology and social philosophy. The book demonstrates that it is not correct to identify formal philosophy with another area of study – philosophical logic – since formal philosophy uses not only logical methods of analysis but also the tools of game theory, decision theory, probability theory, Bayesian statistics, and other theories. Although the book has a propaedeutic character, it also contains some open problems. These include the aggregation of the opinions of a group under a conflicting base of premises in the theory of public choice, and there remain open problems in the interpretation of Arrow's impossibility theorem, among others. Certainly, formalization is not in itself a general solution to a particular philosophical problem, but only a tool that allows a problem to be formulated in a more rigorous and precise way, which sometimes reveals unexpected consequences, implicit contradictions and new solutions. Despite the importance of the concept of coherence in ethics, decision theory, the philosophy of law, Bayesian epistemology and the philosophy of science, the existing formalizations of the concept of coherence are highly specialized to epistemology, and researchers recognize the lack of relevant explanatory models. Overall, the book is an excellent introduction to the field of formal philosophy, which provides a general overview of its different aspects and the opportunity to study particular research topics by means of the extensive bibliography accompanying each of the chapters.
APA, Harvard, Vancouver, ISO, and other styles
49

INCLEZAN, DANIELA, and MICHAEL GELFOND. "Modular action language." Theory and Practice of Logic Programming 16, no. 2 (July 6, 2015): 189–235. http://dx.doi.org/10.1017/s1471068415000095.

Full text
Abstract:
The paper introduces a new modular action language, $\mathcal{ALM}$, and illustrates the methodology of its use. It is based on the approach of Gelfond and Lifschitz (1993, Journal of Logic Programming 17, 2–4, 301–321; 1998, Electronic Transactions on AI 3, 16, 193–210) in which a high-level action language is used as a front end for a logic programming system description. The resulting logic programming representation is used to perform various computational tasks. The methodology based on existing action languages works well for small and even medium size systems, but is not meant to deal with larger systems that require structuring of knowledge. $\mathcal{ALM}$ is meant to remedy this problem. Structuring of knowledge in $\mathcal{ALM}$ is supported by the concepts of module (a formal description of a specific piece of knowledge packaged as a unit), module hierarchy, and library, and by the division of a system description of $\mathcal{ALM}$ into two parts: theory and structure. A theory consists of one or more modules with a common theme, possibly organized into a module hierarchy based on a dependency relation. It contains declarations of sorts, attributes, and properties of the domain together with axioms describing them. Structures are used to describe the domain's objects. These features, together with the means for defining classes of a domain as special cases of previously defined ones, facilitate the stepwise development, testing, and readability of a knowledge base, as well as the creation of knowledge representation libraries.
APA, Harvard, Vancouver, ISO, and other styles
50

Glushkova, Todorka, Vanya Ivanova, and Boyan Zlatanov. "Beyond Traditional Assessment: A Fuzzy Logic-Infused Hybrid Approach to Equitable Proficiency Evaluation via Online Practice Tests." Mathematics 12, no. 3 (January 24, 2024): 371. http://dx.doi.org/10.3390/math12030371.

Full text
Abstract:
This article presents a hybrid approach to assessing students’ foreign language proficiency in a cyber–physical educational environment. It focuses on the advantages of the integrated assessment of student knowledge by considering the impact of automatic assessment, learners’ independent work, and their achievements to date. An assessment approach is described using the mathematical theory of fuzzy functions, which are employed to ensure the fair evaluation of students. The largest possible number of students whose reevaluation of test results will not affect the overall performance of the student group is automatically determined. The study also models the assessment process in the cyber–physical educational environment through the formal semantics of calculus of context-aware ambients (CCAs).
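As a generic illustration of the fuzzy ingredients involved, a triangular membership function and a weighted blend of an online-test score with prior coursework; this toy is an assumption of the sketch, not the authors' model or their CCA semantics.

```python
def triangular(x, a, b, c):
    """Degree to which score x belongs to a fuzzy set peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def hybrid_score(test, coursework, w_test=0.7):
    """Illustrative blend of automatic assessment with prior achievement."""
    return w_test * test + (1 - w_test) * coursework

s = hybrid_score(test=78, coursework=85)          # 80.1
print(round(triangular(s, a=60, b=75, c=90), 2))  # 0.66: fairly 'good'
```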
APA, Harvard, Vancouver, ISO, and other styles