
Journal articles on the topic 'Semantic theory ; syntax ; natural language'


Consult the top 50 journal articles for your research on the topic 'Semantic theory ; syntax ; natural language.'


1

GAO, XIAOYU, HU YUE, L. LI, and QINGSHI GAO. "SEMANTIC-PARSING BASED ON SEMANTIC UNITS THEORY — A NEW APPROACH TO NATURAL LANGUAGES PROCESSING." International Journal of Pattern Recognition and Artificial Intelligence 22, no. 07 (November 2008): 1447–59. http://dx.doi.org/10.1142/s0218001408006818.

Abstract:
The syntax of different natural languages differs, and so does their parsing, which leads to differences in the structures of their parsing-trees. The reason that sentences in different natural languages can be translated into each other is that they have the same meaning. This paper discusses a new kind of sentence parsing, called semantic-parsing, based on semantic units theory. In this theory, a sentence of a natural language is not regarded as words and phrases arranged linearly; rather, it is taken to consist of semantic units with or without type-parameters. In this parsing approach the syntax-parsing-tree and the semantic-parsing-tree are isomorphic, and the structure-trees of sentences in all natural languages can be put into correspondence.
2

Al-Janabi, Adel, Ehsan Ali Kareem, and Radhwan Hussein Abdulzhraa Al Sagheer. "Encapsulation of semantic description with syntactic components for the Arabic language." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 2 (May 1, 2021): 961. http://dx.doi.org/10.11591/ijeecs.v22.i2.pp961-967.

Abstract:
The work presents new theoretical equipment for the representation of natural languages (NL) in computers. The linguistic components of morphology, semantics, and syntax are presented as parts of an integrated structure and data system. The presented theory of language offers a new method of learning a language by separating the fields of semantics and syntax.
3

Boleda, Gemma. "Distributional Semantics and Linguistic Theory." Annual Review of Linguistics 6, no. 1 (January 14, 2020): 213–34. http://dx.doi.org/10.1146/annurev-linguistics-011619-030303.

Abstract:
Distributional semantics provides multidimensional, graded, empirically induced word representations that successfully capture many aspects of meaning in natural languages, as shown by a large body of research in computational linguistics; yet, its impact in theoretical linguistics has so far been limited. This review provides a critical discussion of the literature on distributional semantics, with an emphasis on methods and results that are relevant for theoretical linguistics, in three areas: semantic change, polysemy and composition, and the grammar–semantics interface (specifically, the interface of semantics with syntax and with derivational morphology). The goal of this review is to foster greater cross-fertilization of theoretical and computational approaches to language as a means to advance our collective knowledge of how it works.
4

Ana, I. Wayan. "MAKNA “MEMANCING” BAHASA BALI DIALEK DESA LEMBONGAN: KAJIAN METABAHASA SEMANTIK ALAMI." KULTURISTIK: Jurnal Bahasa dan Budaya 1, no. 1 (July 7, 2017): 12. http://dx.doi.org/10.22225/kulturistik.1.1.213.

Abstract:
[Title: The Meaning of “Memancing” in the Balinese Language of Lembongan Village Dialect: A Study of Natural Semantic Metalanguage] The word “memancing” (fishing) can be found in most languages of the world, and it is unique in the Balinese Language of Lembongan Village Dialect in that it has various hyponyms. This research is intended to recognize and explain the meaning of “memancing” in the Balinese of Lembongan Village Dialect and to preserve lexicons that are almost extinct due to the rapid development of tourism. To analyze the problems of this research, the NSM theory is applied through the concepts of semantic primitives, polysemy, allolexy, and universal syntax. The phases of the NSM analysis of the meaning of “memancing” in BBDL are configuration of components, to find distinctive features, and explication, to study the information on the meaning of “memancing”, which includes the entity to which the action is applied, the equipment used, the distinctive movement, and the expected proceeds. The analysis finds seven lexicons expressing the meaning of “memancing” with distinctive features, namely: memelas, ngerumik, ngulur, melok, muduk, maidang and nyogonang.
5

SHUANG-YUN, YAO, WANG YU-HONG, and SHEN WEI. "THEORY OF PIVOTAL CLAUSE AND CHINESE LANGUAGE PROCESSING." New Mathematics and Natural Computation 09, no. 02 (July 2013): 207–35. http://dx.doi.org/10.1142/s1793005713400048.

Abstract:
This paper reports a study on the application of the theory of pivotal clause in Chinese language processing. Three fundamental characteristics of Chinese grammar are briefly introduced: (i) grammatical simplification; (ii) grammatical compatibility; (iii) full exploitation of syntactic position. These characteristics pose a great many difficulties for Chinese language processing. Based on these features, the paper argues that, among the six influential theories in modern Chinese linguistics, the theory of pivotal clause is the best suited to natural language processing, because the clause, as the center of all grammatical units, can both make up sentence groups and discourse and is closely related to morphology as well as syntax. As an important theory for describing Chinese grammar, the theory of pivotal clause is in line both with the characteristics of the theme-oriented Chinese language and with the international trend in linguistics, and it is of significance for the linguistic study of Chinese as well as for Chinese language processing. The paper also exhibits specific applications of the theory of pivotal clause in Chinese language processing through two case studies, one on the segmentation and tagging of Chinese, the other on automatic syntactic and semantic analysis of discourse coherence.
6

Kotek, Hadas. "Dissociating intervention effects from superiority in English wh-questions." Linguistic Review 34, no. 2 (October 26, 2017): 397–417. http://dx.doi.org/10.1515/tlr-2017-0005.

Abstract:
Abstract In wh-questions, intervention effects are detected whenever certain elements – focus-sensitive operators, negative elements, and quantifiers – c-command an in-situ wh-word. Pesetsky (2000, Phrasal movement and its kin. Cambridge, MA: MIT Press) presents a comprehensive study of intervention effects in English multiple wh-questions, arguing that intervention correlates with superiority: superiority-violating questions are subject to intervention effects, while superiority-obeying questions are immune from such effects. This description has been adopted as an explanandum in most recent work on intervention, such as Beck (2006, Intervention effects follow from focus interpretation. Natural Language Semantics 14. 1–56) and Cable (2010, The Grammar of Q: Q-particles, wh-movement, and pied-piping. Oxford University Press), a.o. In this paper, I show instead that intervention effects in English questions correlate with the available LF positions for wh-in-situ and the intervener, but not with superiority. The grammar allows for several different ways of repairing intervention configurations, including wh-movement, scrambling, Quantifier Raising, and reconstruction. Intervention effects are observed when none of these repair strategies are applicable, and there is no way of avoiding the intervention configuration – regardless of superiority. Nonetheless, I show that these results are consistent with the syntax proposed for English questions in Pesetsky (2000, Phrasal movement and its kin. Cambridge, MA: MIT Press) and with the semantic theory of intervention effects in Beck (2006, Intervention effects follow from focus interpretation. Natural Language Semantics 14. 1–56).
7

GRINSTEAD, John. "Interface Delay." Journal of Child Language 48, no. 5 (July 13, 2021): 888–906. http://dx.doi.org/10.1017/s0305000921000477.

Abstract:
Interface Delay is a theory of syntactic development, which attempts to explain an array of constructions that are slow to develop, which are characterized by being sensitive to discourse-pragmatic considerations of the type associated with the natural semantic class of definites. The theory claims that neither syntax itself, nor the discourse-pragmatic abilities related to executive function and theory of mind themselves are slow to develop. Rather, the claim is that the nexus or interface between the two cognitive domains is slow to develop. We review the development of subjects in child Spanish as an example of this delayed growth trajectory. Further, we review evidence that a delay in the development of tense causes concomitant delays in the seemingly unrelated phenomena of non-nominative case subject pronoun use and un-inverted wh- questions.
8

Moschitti, Alessandro, Daniele Pighin, and Roberto Basili. "Tree Kernels for Semantic Role Labeling." Computational Linguistics 34, no. 2 (June 2008): 193–224. http://dx.doi.org/10.1162/coli.2008.34.2.193.

Abstract:
The availability of large scale data sets of manually annotated predicate-argument structures has recently favored the use of machine learning approaches to the design of automated semantic role labeling (SRL) systems. The main research in this area relates to the design choices for feature representation and for effective decompositions of the task in different learning models. Regarding the former choice, structural properties of full syntactic parses are largely employed as they represent ways to encode different principles suggested by the linking theory between syntax and semantics. The latter choice relates to several learning schemes over global views of the parses. For example, re-ranking stages operating over alternative predicate-argument sequences of the same sentence have shown to be very effective. In this article, we propose several kernel functions to model parse tree properties in kernel-based machines, for example, perceptrons or support vector machines. In particular, we define different kinds of tree kernels as general approaches to feature engineering in SRL. Moreover, we extensively experiment with such kernels to investigate their contribution to individual stages of an SRL architecture both in isolation and in combination with other traditional manually coded features. The results for boundary recognition, classification, and re-ranking stages provide systematic evidence about the significant impact of tree kernels on the overall accuracy, especially when the amount of training data is small. As a conclusive result, tree kernels allow for a general and easily portable feature engineering method which is applicable to a large family of natural language processing tasks.
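The kernels at issue operate on constituency-parse fragments. As a rough illustration of the underlying idea only, the sketch below implements the classic Collins–Duffy subset-tree kernel on which this line of work builds; it is not the authors' SRL kernels, and the tuple encoding of trees and the decay factor `lam` are assumptions of the sketch.

```python
# A minimal subset-tree kernel sketch: trees are nested tuples
# (label, child1, child2, ...); leaves are plain strings.

def _production(node):
    # The production rule at a node: its label plus the labels of its children.
    return (node[0], tuple(c if isinstance(c, str) else c[0] for c in node[1:]))

def _delta(n1, n2, lam):
    # Number of common tree fragments rooted at n1 and n2, decayed by lam.
    if isinstance(n1, str) or isinstance(n2, str):
        return 0.0
    if _production(n1) != _production(n2):
        return 0.0
    children1, children2 = n1[1:], n2[1:]
    if all(isinstance(c, str) for c in children1):      # pre-terminal nodes
        return lam
    result = lam
    for c1, c2 in zip(children1, children2):
        result *= 1.0 + _delta(c1, c2, lam)
    return result

def tree_kernel(t1, t2, lam=0.4):
    # Sum delta over all pairs of nodes drawn from the two trees.
    def nodes(t):
        if isinstance(t, str):
            return []
        out = [t]
        for c in t[1:]:
            out.extend(nodes(c))
        return out
    return sum(_delta(a, b, lam) for a in nodes(t1) for b in nodes(t2))

# Toy parse fragments for "Mary bought a cat" vs "Mary bought a dog".
t1 = ("S", ("NP", ("NNP", "Mary")),
      ("VP", ("VBD", "bought"), ("NP", ("DT", "a"), ("NN", "cat"))))
t2 = ("S", ("NP", ("NNP", "Mary")),
      ("VP", ("VBD", "bought"), ("NP", ("DT", "a"), ("NN", "dog"))))
print(tree_kernel(t1, t2))
```

In an SRL architecture such a kernel would be plugged into a kernel machine (for example an SVM or perceptron) in place of an explicit feature vector.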
9

McCawley, James D. "The biological side of Otto Jespersen’s linguistic thought." Historiographia Linguistica 19, no. 1 (January 1, 1992): 97–110. http://dx.doi.org/10.1075/hl.19.1.06mcc.

Abstract:
Summary Central ideas of Darwin’s theory of natural selection figure prominently in the work of Otto Jespersen (1860–1943). As early as 1886, Jespersen treated linguistic change in Darwinian terms: variation in the pronunciation and meaning of the various units, and factors that raise or lower a variety’s viability. His critique of the neogrammarian principle of exceptionlessness of sound change includes the point that phonologically parallel words often differ in the relative viability of their variants. By 1904, Jespersen was using ‘functional load’ in explaining differences in how much resistance each language offers to various natural phonetic tendencies. He argued that conformity to a sound-symbolic generalization raises a form’s viability and can thus exempt some words from sound changes and accelerate the adoption of novel words and of novel meanings for existing words. Natural selection figures even in Jespersen’s papers on international auxiliary languages, as in his account of why bil, the winner in a contest for a word for ‘automobile’, spread so rapidly in Scandinavia. Jespersen’s speculative scenario for language origins is in terms of Darwinian ‘preadaptation’: conventionalized sound/meaning correspondences can arise in numerous ways prior to the development of anything like a language (Jespersen argues that singing, in all its diverse social functions, developed prior to language), and a language would develop not ex nihilo but by members of a human community segmenting and imposing arbitrary semantic analyses on some of this large body of meaningful sound, and starting to combine the pieces in novel ways, as modern children do anyway (Jespersen argues) in the course of acquiring a language. Jespersen thereby vindicated his unpopular conclusion that early human languages had highly irregular morphology and syntax.
10

Jackendoff, Ray. "Précis of Foundations of Language: Brain, Meaning, Grammar, Evolution." Behavioral and Brain Sciences 26, no. 6 (December 2003): 651–65. http://dx.doi.org/10.1017/s0140525x03000153.

Abstract:
The goal of this study is to reintegrate the theory of generative grammar into the cognitive sciences. Generative grammar was right to focus on the child's acquisition of language as its central problem, leading to the hypothesis of an innate Universal Grammar. However, generative grammar was mistaken in assuming that the syntactic component is the sole source of combinatoriality, and that everything else is “interpretive.” The proper approach is a parallel architecture, in which phonology, syntax, and semantics are autonomous generative systems linked by interface components. The parallel architecture leads to an integration within linguistics, and to a far better integration with the rest of cognitive neuroscience. It fits naturally into the larger architecture of the mind/brain and permits a properly mentalistic theory of semantics. It results in a view of linguistic performance in which the rules of grammar are directly involved in processing. Finally, it leads to a natural account of the incremental evolution of the language capacity.
11

Gabbay, Murdoch J. "Foundations of Nominal Techniques: Logic and Semantics of Variables in Abstract Syntax." Bulletin of Symbolic Logic 17, no. 2 (June 2011): 161–229. http://dx.doi.org/10.2178/bsl/1305810911.

Abstract:
We are used to the idea that computers operate on numbers, yet another kind of data is equally important: the syntax of formal languages, with variables, binding, and alpha-equivalence. The original application of nominal techniques, and the one with greatest prominence in this paper, is to reasoning on formal syntax with variables and binding. Variables can be modelled in many ways: for instance as numbers (since we usually take countably many of them); as links (since they may ‘point’ to a binding site in the term, where they are bound); or as functions (since they often, though not always, represent ‘an unknown’). None of these models is perfect. In every case for the models above, problems arise when trying to use them as a basis for a fully formal mechanical treatment of formal language. The problems are practical—but their underlying cause may be mathematical. The issue is not whether formal syntax exists, since clearly it does, so much as what kind of mathematical structure it is. To illustrate this point by a parody, logical derivations can be modelled using a Gödel encoding (i.e., injected into the natural numbers). It would be false to conclude from this that proof-theory is a branch of number theory and can be understood in terms of, say, Peano's axioms. Similarly, as it turns out, it is false to conclude from the fact that variables can be encoded e.g., as numbers, that the theory of syntax-with-binding can be understood in terms of the theory of syntax-without-binding, plus the theory of numbers (or, taking this to a logical extreme, purely in terms of the theory of numbers). It cannot; something else is going on. What that something else is, has not yet been fully understood. In nominal techniques, variables are an instance of names, and names are data. We model names using urelemente with properties that, pleasingly enough, turn out to have been investigated by Fraenkel and Mostowski in the first half of the 20th century for a completely different purpose than modelling formal language. What makes this model really interesting is that it gives names distinctive properties which can be related to useful logic and programming principles for formal syntax. Since the initial publications, advances in the mathematics and presentation have been introduced piecemeal in the literature. This paper provides in a single accessible document an updated development of the foundations of nominal techniques. This gives the reader easy access to updated results and new proofs which they would otherwise have to search across two or more papers to find, and full proofs that in other publications may have been elided. We also include some new material not appearing elsewhere.
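To make the "names are data acted on by swappings" idea concrete, here is a small toy sketch of my own (not the paper's Fraenkel–Mostowski construction): alpha-equivalence of λ-terms is decided by swapping each bound name with a fresh one; the fresh names n0, n1, ... are assumed not to occur in the input terms.

```python
# Terms: ("var", a) | ("app", t1, t2) | ("lam", a, body); names are strings.
import itertools

def swap(a, b, t):
    # Apply the name transposition (a b) to every name occurring in t.
    kind = t[0]
    if kind == "var":
        x = t[1]
        return ("var", b if x == a else a if x == b else x)
    if kind == "app":
        return ("app", swap(a, b, t[1]), swap(a, b, t[2]))
    x, body = t[1], t[2]
    return ("lam", b if x == a else a if x == b else x, swap(a, b, body))

_fresh = ("n%d" % i for i in itertools.count())   # assumed fresh for user terms

def alpha_eq(t, u):
    # Structural equality up to renaming of bound names.
    if t[0] != u[0]:
        return False
    if t[0] == "var":
        return t[1] == u[1]
    if t[0] == "app":
        return alpha_eq(t[1], u[1]) and alpha_eq(t[2], u[2])
    c = next(_fresh)                              # one fresh name for both binders
    return alpha_eq(swap(t[1], c, t[2]), swap(u[1], c, u[2]))

# \x. x y  is alpha-equivalent to  \z. z y, but not to  \z. z z
t1 = ("lam", "x", ("app", ("var", "x"), ("var", "y")))
t2 = ("lam", "z", ("app", ("var", "z"), ("var", "y")))
t3 = ("lam", "z", ("app", ("var", "z"), ("var", "z")))
print(alpha_eq(t1, t2), alpha_eq(t1, t3))         # True False
```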
12

Ran, Xiaohui. "Question Understanding from the Perspective of Context Theory." Journal of Research in Philosophy and History 4, no. 2 (June 18, 2021): p28. http://dx.doi.org/10.22158/jrph.v4n2p28.

Abstract:
This paper uses context theory to study questions in natural language. In syntax, questions can be classified into polar questions, alternative questions, concealed questions, and inquisitive questions. In semantics, they can be divided into polar questions and inquisitive questions. Only inquisitive questions, with the characteristics of inquisitiveness, informativeness, compliance, and transparency, need to be studied by context theory. There are three levels of question context: question-answer facts, background knowledge, and question presupposition. The question context composes the possible world in which the question is situated. Question understanding is a function that maps the question through the possible worlds: the set of propositions consisting of the different possible worlds of the question context and the set of propositions consisting of the different possible answers to the question are mapped onto each other, yielding different answers in the different possible worlds of the same question.
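As a minimal sketch of this possible-worlds picture (my own illustration, not the author's formalism): the question context supplies a set of possible worlds, and a question is understood via the grouping of those worlds induced by the answers.

```python
# A question maps each world in the context to the answer true in it;
# "understanding" the question amounts to the induced partition of worlds.

def partition_by_answer(worlds, answer_in):
    # Group worlds by the answer the question receives in them.
    cells = {}
    for w in worlds:
        cells.setdefault(answer_in(w), set()).add(w)
    return cells

# Toy context: four worlds differing in who is at the door.
worlds = {"w1": "Ann", "w2": "Ann", "w3": "Bob", "w4": "nobody"}

# Polar question "Is anyone at the door?" vs wh-question "Who is at the door?"
polar = partition_by_answer(worlds, lambda w: worlds[w] != "nobody")
wh    = partition_by_answer(worlds, lambda w: worlds[w])

print(polar)  # {True: {'w1', 'w2', 'w3'}, False: {'w4'}}
print(wh)     # {'Ann': {'w1', 'w2'}, 'Bob': {'w3'}, 'nobody': {'w4'}}
```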
13

Pineda, Luis, and Gabriela Garza. "A Model for Multimodal Reference Resolution." Computational Linguistics 26, no. 2 (June 2000): 139–93. http://dx.doi.org/10.1162/089120100561665.

Abstract:
An important aspect of the interpretation of multimodal messages is the ability to identify when the same object in the world is the referent of symbols in different modalities. To understand the caption of a picture, for instance, one needs to identify the graphical symbols that are referred to by names and pronouns in the natural language text. One way to think of this problem is in terms of the notion of anaphora; however, unlike linguistic anaphoric inference, in which antecedents for pronouns are selected from a linguistic context, in the interpretation of the textual part of multimodal messages the antecedents are selected from a graphical context. Under this view, resolving multimodal references is like resolving anaphora across modalities. Another way to see the same problem is to look at pronouns in texts about drawings as deictic. In this second view, the context of interpretation of a natural language term is defined as a set of expressions of a graphical language with well-defined syntax and semantics. Natural language and graphical terms are thought of as standing in a relation of translation similar to the translation relation that holds between natural languages. In this paper a theory based on this second view is presented. In this theory, the relations between multimodal representation and spatial deixis, on the one hand, and multimodal reasoning and deictic inference, on the other, are discussed. An integrated model of anaphoric and deictic resolution in the context of the interpretation of multimodal discourse is also advanced.
14

Filimon, Rosina Caterina. "Decoding the Musical Message via the Structural Analogy between Verbal and Musical Language." Artes. Journal of Musicology 18, no. 1 (March 1, 2018): 151–60. http://dx.doi.org/10.2478/ajm-2018-0009.

Abstract:
The topic approached in this paper aims to identify the structural similarities between the verbal and the musical language and to highlight the process of decoding the musical message through the structural analogy between them. The process of musical perception and musical decoding involves physiological, psychological and aesthetic phenomena. Besides receiving the sound waves, it implies complex cognitive processes being activated, whose aim is to decode the musical material at cerebral level. Starting from the research methods in cognitive psychology, music researchers redefine the process of musical perception in a series of papers in musical cognitive psychology. In the case of the analogy between language and music, deciphering the musical structure and its perception are due, according to researchers, to several common structural configurations. A significant model for the description of the musical structure is Noam Chomsky’s generative-transformational model. This model claims that, at a deep level, all languages have the same syntactic structure, on account of innate anatomical and physiological structures which became specialized as a consequence of the universal nature of certain mechanisms of the human intellect. Chomsky’s studies, supported by sophisticated experimental devices, computerised analyses and algorithmic models, have identified the syntax of the musical message, as well as the rules and principles that underlie the processing of sound-related information by the listener; this syntax and these rules and principles show surprising similarities with verbal language. The musicologist Heinrich Schenker, 20 years ahead of Chomsky, considers that there is a parallel between the analysis of natural language and that of the musical structure, and has developed his own theory on the structure of music. Schenker’s structural analysis is based on the idea that tonal music is organized hierarchically, in a layering of structural levels. Thus, spoken language and music are governed by common rules: phonology, syntax and semantics. Fred Lerdahl and Ray Jackendoff develop a musical grammar where a set of generating rules are defined to explain the hierarchical structure of tonal music. The authors of the generative theory propose the hypothesis of a musical grammar based on two types of rules, which take into account the conscious and unconscious principles that govern the organization of musical perception. The structural analogy between verbal and musical language consists of several common elements. Among these are the hierarchical organization of both fields and governance by the same rules – phonology, syntax, semantics – and, as a consequence of the universal nature of certain mechanisms of the human intellect, decoding the transmitted message is accomplished thanks to universal innate structures that are biologically inherited. Also, following Chomsky’s linguistic model, a musical grammar is configured, governed by well-formedness rules and preference rules. Thus, a musical piece is not perceived as a stream of disordered sounds, but is deconstructed, developed and assimilated at cerebral level by means of pre-existing cognitive schemes.
15

Kramsch, Claire. "A New Field of Research: SLA-Applied Linguistics." PMLA/Publications of the Modern Language Association of America 115, no. 7 (December 2000): 1978–81. http://dx.doi.org/10.2307/463621.

Abstract:
Second language acquisition research (sla) is the systematic exploration of the conditions that make the acquisition of a foreign language possible, both in natural and in instructional settings. Its objects of study are the biological, linguistic, psychological, and emotional makeup of language learners and the educational, social, and institutional context of learning and teaching. Whereas language as a linguistic system is studied through the metalanguage of linguistics (phonology, syntax, and semantics), language learning, as psycholinguistic process and sociolinguistic discourse, is researched through the metadiscourse of applied linguistics: psycho- and sociolinguistics, anthropological and educational linguistics, discourse analysis, pragmatics, stylistics, and composition and literacy studies. These fields illuminate what it means to learn to speak, read, write, and interact in a foreign language, what it means to appropriate for oneself the national idiom of communities that share a history and a culture that are different from one's own. SLA provides the applied linguistic metadiscourse for the practice of language learning and teaching.
16

Pervukhina, Svetlana Vladimirovna, Gyulnara Vladimirovna Basenko, Irina Gennadjevna Ryabtseva, and Elena Evgenyevna Sakharova. "Approaches to Text Simplification: Can Computer Technologies Outdo a Human Mind?" GEMA Online® Journal of Language Studies 21, no. 3 (August 30, 2021): 37–51. http://dx.doi.org/10.17576/gema-2021-2103-03.

Abstract:
Narrowly specialized information is addressed to a limited circle of professionals though it provokes interest among people without specialized education. This gives rise to a need for the popularization of scientific information. This process is carried out through simplified texts as a kind of secondary texts that are directly aimed at the addressee. Age, language proficiency and background knowledge are the main features which are usually taken into consideration by the author of the secondary text who makes changes in the text composition, as well as in its pragmatics, semantics and syntax. This article analyses traditional approaches to text simplification, computer simplification and summarization. The authors compare human-authored simplification of literary texts with the newest trends in computer simplification to promote further development of machine simplification tools. It has been found that the samples of simplified scientific texts seem to be more natural than the samples of simplified literary texts since technical background knowledge can be processed with machine tools. The authors have come to the conclusion that literary and technical texts should imply different approaches for adaptation and simplification. In addition, personal readers’ experience plays a great part in finding the implications in literary texts. In this respect it might be reasonable to create separate engines for simplifying and adapting texts from diverse spheres of knowledge. Keywords: Text Simplification; Natural Language Processing (NLP); Pragmatic Adaptation; Professional Communication; Literary Texts
17

Montrul, Silvina. "INTRODUCTION." Studies in Second Language Acquisition 23, no. 2 (June 2001): 145–51. http://dx.doi.org/10.1017/s0272263101002017.

Abstract:
Due to the recognition of the centrality of the lexicon for SLA theory (see the 1987 thematic issue of SSLA, edited by Susan Gass), the last few years have witnessed an increased interest in understanding lexical knowledge. As Gass (1999) reminded us, learning vocabulary in a second language is a complex task that involves much more than learning sound-meaning pairings; it also involves learning how lexical information is morphologically expressed and syntactically constrained. The present issue provides a natural sequel to the 1999 SSLA thematic issue, “Incidental L2 Vocabulary Acquisition,” by addressing some of the questions raised in that volume, in particular the questions related to the intimate relationship between syntax and semantics at the lexical interface. This issue is devoted to the L2 acquisition of verb meaning and argument structure crosslinguistically, and it explores in detail the nature of linguistic systems that L2 learners acquire in this particular domain. The six central articles offer a coherent approach to the topic, using linguistic theory to help us understand the characteristics of learner grammars. Until recently, linguistic approaches to SLA have placed a strong emphasis on understanding the acquisition of functional categories, for example, and the acquisition of the lexicon has received less attention. Understanding how the lexico-syntactic interface is mentally represented, and how it evolves during the second language acquisition process, is crucial for developing an adequate theory of L2 knowledge in general, as well as for informing theories of the lexicon.
18

Джарбо Сaмер Омар. "The Semantics-Pragmatics Interface: The Case of the Singular Feminine Demonstrative in Jordanian Arabic." East European Journal of Psycholinguistics 4, no. 1 (June 27, 2017): 63–75. http://dx.doi.org/10.29038/eejpl.2017.4.1.jar.

Abstract:
The aim in this study is to investigate the interface between semantics and pragmatics in relation to the use of the indexical demonstrative ‘haay’ ‘this-S.F.’ in Jordanian Arabic (JA). It is argued here that an analysis of meaning in relation to context-sensitivity inherent in the use of ‘haay’ can give evidence to the view that semantic and pragmatic processes can be distinguished from each other. I have found that the meaning of ‘haay’ consists of three distinct levels: linguistic, semantic, and pragmatic meaning. The denotational and conventional senses of ‘haay’ comprise its linguistic meaning, its semantic meaning is generated when any of the variables in the linguistic meaning is selected in relation to 'narrow context', the pragmatic meaning depends on relating the semantic meaning to an entity in the physical context of interaction. The results of this study support the view that the boundary between semantics and pragmatics can be distinctively demarcated.
19

Slabakova, Roumyana. "Semantic Theory and Second Language Acquisition." Annual Review of Applied Linguistics 30 (March 2010): 231–47. http://dx.doi.org/10.1017/s0267190510000139.

Abstract:
The article identifies four different types of meaning situated in different modules of language. Such a modular view of language architecture suggests that there may be differential difficulties of acquisition for the different modules. It is argued that second language (L2) acquisition of meaning involves acquiring interpretive mismatches at the first and second language (L1-L2) syntax-semantics interfaces. In acquiring meaning, learners face two types of learning situations. One situation where the sentence syntax presents less difficulty but different pieces of functional morphology subsume different primitives of meaning is dubbed simple syntax–complex semantics. Another type of learning situation is exemplified in less frequent, dispreferred, or syntactically complex sentences where the sentential semantics offers no mismatch; these are labeled complex syntax–simple semantics. Studies representative of these learning situations are reviewed. The issues of importance of explicit instruction with respect to interpretive properties and the effect of the native language are addressed. Studies looking at acquisition of language-specific discourse properties and universal pragmatics are also reviewed. These representative studies and numerous other studies on the L2 acquisition of meaning point to no visible barrier to ultimate success in the acquisition of semantics and pragmatics.
20

Seuren, Pieter. "Essentials of Semantic Syntax." Cadernos de Linguística 2, no. 1 (January 28, 2021): 01–20. http://dx.doi.org/10.25189/2675-4916.2021.v2.n1.id290.

Abstract:
Semantic Syntax (SeSyn), originally called Generative Semantics, is an offshoot of Chomskyan generative grammar (ChoGG), rejected by Chomsky and his school in the late 1960s. SeSyn is the theory of algorithmical grammars producing the well-formed sentences of a language L from the corresponding semantic input, the Semantic Analysis (SA), represented as a traditional tree structure diagram in a specific formal language of incremental predicate logic with quantifying and qualifying operators (including the truth functions), and with all lexical items filled in. A SeSyn-type grammar is thus by definition transformational, but not generative. The SA originates in cognition in a manner that is still largely mysterious, but its actual form can be distilled from the Surface Structure (SS) of the sentences of L following the principles set out in SeSyn. In this presentation we provide a more or less technical résumé of the SeSyn theory. A comparison is made with ChoGG-type grammars, which are rejected on account of their intrinsic unsuitability as a cognitive-realist grammar model. The ChoGG model follows the pattern of a 1930s neopositivist Carnap-type grammar for formal logical languages. Such grammars are random sentence generators, whereas, obviously, (nonpathological) humans are not. A ChoGG-type grammar is fundamentally irreconcilable with a mentalist-realist theory of grammar. The body of the paper consists in a demonstration of the production of an English and a French sentence, the latter containing a classic instance of the cyclic rule of Predicate Raising (PR), essential in the general theory of clausal complementation yet steadfastly repudiated in ChoGG for reasons that have never been clarified. The processes and categories defined in SeSyn are effortlessly recognised in languages all over the world, whether indigenous or languages of a dominant culture—taking into account language-specific values for the general theoretical parameters involved. This property makes SeSyn particularly relevant for linguistic typology, which now ranks as the most promising branch of linguistics but has so far conspicuously lacked an adequate theoretical basis.
21

Ernst, Thomas. "The Syntax of Adverbials." Annual Review of Linguistics 6, no. 1 (January 14, 2020): 89–109. http://dx.doi.org/10.1146/annurev-linguistics-011619-030334.

Abstract:
After explicit phrase structure rules were abandoned in government–binding theory, some account of the distribution of adverbials became necessary. This review surveys two current theories. The first, often called the scopal theory, posits that the main factor is semantics: In general, adverbials can appear wherever they cause no violation of semantic well-formedness. Purely syntactic and morphological factors play a role, but it is a relatively minor one. Though the scopal theory predicts a significant range of adverbial distribution correctly, much of its underlying semantic analysis remains to be developed in explicit terms. The second theory discussed in this review, the cartographic theory, takes syntax as central, proposing that adverbials are individually licensed by dedicated functional heads, arranged in a rigid hierarchy by Universal Grammar. This approach has some empirical successes but also a number of problems; thus, the scopal theory is more likely to represent the right direction.
22

Hannan, John. "Extended natural semantics." Journal of Functional Programming 3, no. 2 (April 1993): 123–52. http://dx.doi.org/10.1017/s0956796800000666.

Abstract:
We extend the definition of natural semantics to include simply typed λ-terms, instead of first-order terms, for representing programs, and to include inference rules for the introduction and discharge of hypotheses and eigenvariables. This extension, which we call extended natural semantics, affords a higher-level notion of abstract syntax for representing programs and suitable mechanisms for manipulating this syntax. We present several examples of semantic specifications for a simple functional programming language and demonstrate how we achieve simple and elegant manipulations of bound variables in functional programs. All the examples have been implemented and tested in λProlog, a higher-order logic programming language that supports all of the features of extended natural semantics.
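For readers unfamiliar with the style, a big-step ("natural semantics") specification can be read as an evaluator. The toy sketch below gives that flavour in Python rather than λProlog, so the higher-order abstract syntax and the hypothesis/eigenvariable machinery the paper is actually about are not represented; the term constructors and environment handling are assumptions of the sketch.

```python
# A tiny big-step evaluator for a minimal functional language.

def eval_(term, env):
    kind = term[0]
    if kind == "lit":                  # numbers evaluate to themselves
        return term[1]
    if kind == "var":                  # look the variable up in the environment
        return env[term[1]]
    if kind == "lam":                  # a function closes over its environment
        _, x, body = term
        return ("closure", x, body, env)
    if kind == "app":                  # evaluate function and argument, then the body
        _, f, a = term
        _, x, body, cenv = eval_(f, env)
        return eval_(body, {**cenv, x: eval_(a, env)})
    if kind == "add":
        return eval_(term[1], env) + eval_(term[2], env)
    raise ValueError("unknown term: %r" % (term,))

# (\x. x + 1) 41  ==>  42
prog = ("app", ("lam", "x", ("add", ("var", "x"), ("lit", 1))), ("lit", 41))
print(eval_(prog, {}))
```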
23

Tong, Jiahong, Zhigang Wu, Qi Liu, Liang Du, Liangde Xu, and Jingxing Yu. "Application of natural language understanding in Chinese power dispatching centre." E3S Web of Conferences 182 (2020): 02002. http://dx.doi.org/10.1051/e3sconf/202018202002.

Abstract:
It is difficult for a computer to understand unstructured Chinese text, which is an obstacle to the further application of artificial intelligence in the power dispatch centre. Understanding the orders given by human dispatchers is the premise for machine-human collaboration in power system operation. Towards the understanding of dispatching texts, this paper proposes a textual semantic analysis framework with active learning of semantic structure knowledge. First, the words are vectorized with Skip-gram models and a hierarchical clustering algorithm is designed to detect sentence patterns. Then a knowledge base is set up by converting the sentence structures into regular expressions. In application, a proprietary semantic framework is defined to extract important device information and to fill the semantic slots using dependency syntax. The application shows that Chinese texts describing the operation mode switching process can be understood accurately by the computer program.
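Only the last step of this pipeline lends itself to a compact illustration. The sketch below uses invented English patterns and device names in place of the paper's Chinese sentence patterns: a detected sentence pattern is stored as a regular expression whose named groups act as semantic slots for a dispatching order.

```python
# Illustrative fragment only (invented pattern and device names).
import re

# Toy "knowledge base": one sentence pattern for switching-operation orders.
KNOWLEDGE_BASE = [
    re.compile(r"^(?:please\s+)?(?P<action>open|close)\s+(?P<device>breaker\s+\S+)$", re.I),
]

def parse_order(text):
    # Match the dispatcher's order against each stored pattern and return
    # the filled semantic slots, or None if no pattern applies.
    for pattern in KNOWLEDGE_BASE:
        m = pattern.match(text.strip())
        if m:
            return m.groupdict()
    return None

print(parse_order("Please open breaker 201A"))  # {'action': 'open', 'device': 'breaker 201A'}
print(parse_order("report line loading"))       # None
```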
24

Li, Zhi Jiang, Jin Kai Li, and Wu Xia Ning. "Research on Chinese Natural Language Query Interface to Database Based on Syntax and Semantic." Applied Mechanics and Materials 731 (January 2015): 237–41. http://dx.doi.org/10.4028/www.scientific.net/amm.731.237.

Abstract:
Querying a database requires a professional query language, which is difficult for ordinary users to grasp. A natural language query interface (NLQI) can solve this problem with better human-computer interaction and intelligence. However, current natural language understanding technology is not mature: even though the structure of query sentences is fixed, it is still difficult to convert them accurately into a database query language the computer can execute. Therefore, a method of designing an NLQI based on syntactic and semantic analysis is proposed in this paper. First, maximum forward and maximum reverse matching algorithms are used to analyse sentences against part-of-speech (POS) templates, which reveal the syntactic relationships among the words. After syntactic analysis, query conditions and targets are identified. Then, individual conditions and targets are converted into standard forms according to the relationships among entity, property, connecting verb and unit. Finally, these standard forms are transformed into SQL statements. An experiment is carried out on a printing ERP system. The results show that the proposed method can meet the needs of natural language querying and handles combined, elliptical and other complex sentences well.
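A much-simplified sketch of the final transformation step (standard forms to SQL) is given below; the table, columns and keyword mapping are my own assumptions, not the paper's POS templates or its Chinese analysis.

```python
# Turn a recognised query target and a single recognised condition into SQL.
import sqlite3

SCHEMA_KEYWORDS = {"customer": "customer", "quantity": "quantity", "product": "product"}

def to_sql(target_word, condition_column, condition_value):
    # Map the recognised words onto the 'orders' table; the condition value is
    # passed as a bound parameter to avoid SQL injection.
    target = SCHEMA_KEYWORDS[target_word]
    column = SCHEMA_KEYWORDS[condition_column]
    return f"SELECT {target} FROM orders WHERE {column} = ?", (condition_value,)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, product TEXT, quantity INTEGER)")
conn.execute("INSERT INTO orders VALUES ('ACME', 'poster', 500)")

# "Which customer ordered posters?"  ->  target: customer, condition: product = 'poster'
sql, params = to_sql("customer", "product", "poster")
print(conn.execute(sql, params).fetchall())   # [('ACME',)]
```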
25

Lee, Ming Che, Jia Wei Chang, and Tung Cheng Hsieh. "A Grammar-Based Semantic Similarity Algorithm for Natural Language Sentences." Scientific World Journal 2014 (2014): 1–17. http://dx.doi.org/10.1155/2014/437162.

Abstract:
This paper presents a grammar- and semantic-corpus-based similarity algorithm for natural language sentences. Natural language, in opposition to “artificial language” such as computer programming languages, is the language used by the general public for daily communication. Traditional information retrieval approaches, such as vector models, LSA, HAL, or even the ontology-based approaches that include concept similarity comparison instead of co-occurring terms/words, may not always determine the best match when there is no obvious relation or concept overlap between two natural language sentences. This paper proposes a sentence similarity algorithm that takes advantage of a corpus-based ontology and grammatical rules to overcome these problems. Experiments on two well-known benchmarks demonstrate that the proposed algorithm achieves a significant performance improvement on sentences/short texts with arbitrary syntax and structure.
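As a bare-bones sketch of the general recipe such sentence-similarity methods follow (not the authors' corpus-based ontology and grammatical rules): build vectors over the joint word set of the two sentences and combine a semantic term with a word-order term. Lexical similarity is crudely reduced to exact string match here, and the weight `alpha` is an assumption of the sketch.

```python
import math

def _vector(words, joint):
    return [1.0 if w in words else 0.0 for w in joint]

def _cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def sentence_similarity(s1, s2, alpha=0.8):
    w1, w2 = s1.lower().split(), s2.lower().split()
    joint = sorted(set(w1) | set(w2))
    semantic = _cosine(_vector(set(w1), joint), _vector(set(w2), joint))
    # Word-order term: compare the position of each joint word in both sentences.
    r1 = [w1.index(w) + 1 if w in w1 else 0 for w in joint]
    r2 = [w2.index(w) + 1 if w in w2 else 0 for w in joint]
    order = 1.0 - (math.sqrt(sum((a - b) ** 2 for a, b in zip(r1, r2))) /
                   (math.sqrt(sum((a + b) ** 2 for a, b in zip(r1, r2))) or 1.0))
    return alpha * semantic + (1 - alpha) * order

print(sentence_similarity("the dog bit the man", "the man bit the dog"))   # high
print(sentence_similarity("the dog bit the man", "stocks fell sharply today"))  # 0.0
```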
26

Makhmudova, S. M. "Formal Features of the Structural-Semantic Categories of One-Part Sentences in the Rutul Language." Язык и текст 7, no. 2 (2020): 91–101. http://dx.doi.org/10.17759/langt.2020070210.

Abstract:
While syntax attracts the attention of an increasing number of linguists worldwide and syntactic theories are built on the material of most natural languages, the syntax of the Dagestani languages remains a kind of lacuna, not known even at a simple descriptive level. One of the most poorly studied areas of Dagestani syntax is one-part sentences, which have been analyzed neither in terms of composition nor in terms of forms of expression. In the existing research on the syntax of Dagestani languages, one-part sentences are either not considered at all or are considered very superficially. Meanwhile, one-part sentences are a fairly well-developed constructive class in Dagestani languages. This work attempts to analyze one-part sentences in the Rutul language, whose syntax has not been subjected to special research until now.
27

Zhang, Xia, Liu Yuan Chen, and Xin Yan Zhu. "A Natural Language User Demand Semantic Model for Remote Sensing Image Retrieval." Applied Mechanics and Materials 241-244 (December 2012): 2897–900. http://dx.doi.org/10.4028/www.scientific.net/amm.241-244.2897.

Abstract:
Remote sensing (RS) images can be applied in many domains. Most research on RS image retrieval aims to meet the demands of professional users. However, there are also demands for RS images from non-professional users, who state their requests in natural language (NL) rather than by filling in professional request forms. Several problems need to be solved to implement RS image retrieval based on NL user demand. The objective of this research was to propose a user demand semantic model that translates NL user demand into value requirements. Based on a large body of material from the application domains, the syntax and semantics of NL user demand were analyzed, and semantic relationships were summarized on the basis of this analysis. A user demand semantic model was then proposed and built with an ontology. It can be concluded that the proposed semantic model may support RS image retrieval based on NL user demand.
28

Мірончук, Тетяна, and Наталія Одарчук. "Іллокуція англомовного дискурсу виправдання (на прикладі творів сучасної художньої англійської та американської прози)." East European Journal of Psycholinguistics 3, no. 2 (December 22, 2016): 69–81. http://dx.doi.org/10.29038/eejpl.2016.3.2.mir.

Abstract:
The article investigates the illocutionary potential of English everyday discourse of justification by comparing the illocutionary characteristics of the speech acts that occur most frequently in it. Drawing on modelled constructs of the content of justification, the speaker's defensive intention is identified as the precondition for generating justification discourse. A review of the speech-act classifications available in the literature shows that the dominant illocutionary force of justification discourse combines the components of informing and persuading, typically realised by constative and assertive acts. The speech act of justification itself is defined as a creditive with an embedded perlocution of delivering a verdict of justification, which regulates interaction between the interlocutors.
APA, Harvard, Vancouver, ISO, and other styles
29

Mutanen, Arto. "Relativity of Visual Communication." Coactivity: Philosophy, Communication 24, no. 1 (March 31, 2016): 24–35. http://dx.doi.org/10.3846/cpc.2016.240.

Full text
Abstract:
Communication is sharing and conveying information. In visual communication, it is especially visual messages that have to be formulated and interpreted. The interpretation is relative to a method of information presentation, which is a human construction. This also holds in the case of visual languages. The notions of syntax and semantics for visual languages are not so well founded as they are for natural languages. Visual languages are both syntactically and semantically dense. The density is connected to the compositionality of the (pictorial) languages. In the paper, Charles Sanders Peirce’s theory of signs is used to characterize visual languages. This allows us to relate visual languages to natural languages. The foundation of information presentation methods for visual languages is the logic of perception, but only if perception is understood as propositional perception. This allows us to better understand the relativity of information presentation methods, and hence to evaluate the cultural relativity of visual communication.
APA, Harvard, Vancouver, ISO, and other styles
30

Zhang, Yudong, Wenhao Zheng, and Ming Li. "Learning Uniform Semantic Features for Natural Language and Programming Language Globally, Locally and Sequentially." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5845–52. http://dx.doi.org/10.1609/aaai.v33i01.33015845.

Full text
Abstract:
Semantic feature learning for natural language and programming language is a preliminary step in addressing many software mining tasks. Many existing methods leverage information in lexicon and syntax to learn features for textual data. However, such information is inadequate to represent the entire semantics in either text sentence or code snippet. This motivates us to propose a new approach to learn semantic features for both languages, through extracting three levels of information, namely global, local and sequential information, from textual data. For tasks involving both modalities, we project the data of both types into a uniform feature space so that the complementary knowledge in between can be utilized in their representation. In this paper, we build a novel and general-purpose feature learning framework called UniEmbed, to uniformly learn comprehensive semantic representation for both natural language and programming language. Experimental results on three real-world software mining tasks show that UniEmbed outperforms state-of-the-art models in feature learning and prove the capacity and effectiveness of our model.
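The paper is only abstracted here; as a purely illustrative aside, the sketch below shows the general idea of projecting two modalities (natural-language and code tokens) into one shared feature space, using plain PyTorch. The module names, dimensions, and the crude pooling choices are assumptions for illustration, not the authors' UniEmbed architecture.

```python
# Minimal, hypothetical sketch of projecting natural-language and code tokens into a
# shared feature space (illustrative only; not the authors' UniEmbed model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedSpaceEncoder(nn.Module):
    """Encode one modality (text or code tokens) into a shared vector space."""
    def __init__(self, vocab_size: int, embed_dim: int = 64, shared_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Sequential information via an LSTM; a crude "global" summary via mean pooling.
        self.lstm = nn.LSTM(embed_dim, embed_dim, batch_first=True)
        self.project = nn.Linear(2 * embed_dim, shared_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        emb = self.embed(token_ids)          # (batch, seq, embed_dim)
        seq_out, _ = self.lstm(emb)          # sequential features
        sequential = seq_out[:, -1, :]       # last hidden state
        global_feat = emb.mean(dim=1)        # global summary
        return self.project(torch.cat([sequential, global_feat], dim=-1))

# Two encoders, one per modality, mapped into the same space so that the similarity
# between a sentence and a code snippet can be scored directly.
text_enc = SharedSpaceEncoder(vocab_size=1000)
code_enc = SharedSpaceEncoder(vocab_size=500)
text_ids = torch.randint(0, 1000, (2, 12))
code_ids = torch.randint(0, 500, (2, 30))
similarity = F.cosine_similarity(text_enc(text_ids), code_enc(code_ids))
print(similarity.shape)  # torch.Size([2])
```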
APA, Harvard, Vancouver, ISO, and other styles
31

Steedman, Mark. "Information Structure and the Syntax-Phonology Interface." Linguistic Inquiry 31, no. 4 (October 2000): 649–89. http://dx.doi.org/10.1162/002438900554505.

Full text
Abstract:
The article proposes a theory of grammar relating syntax, discourse semantics, and intonational prosody. The full range of English intonational tunes distinguished by Beckman and Pierrehumbert (1986) and their semantic interpretation in terms of focus and information structure are discussed, including “discontinuous” themes and rhemes. The theory extends an earlier account based on Combinatory Categorial Grammar, which directly pairs phonological and logical forms without intermediary representational levels.
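For readers unfamiliar with Combinatory Categorial Grammar, the account builds on two basic combination rules, forward and backward application. The toy below shows only that cancellation step, with an invented mini-lexicon and naive string matching; it is not Steedman's grammar fragment or his prosodic machinery.

```python
# Toy illustration of CCG forward and backward application; the lexicon is invented.
def strip_parens(cat: str) -> str:
    return cat[1:-1] if cat.startswith("(") and cat.endswith(")") else cat

def forward_apply(left: str, right: str):
    """X/Y followed by Y yields X (naive string matching, no nested-argument parsing)."""
    if "/" in left:
        x, y = left.rsplit("/", 1)
        if y == right:
            return strip_parens(x)
    return None

def backward_apply(left: str, right: str):
    """Y followed by X\\Y yields X."""
    if "\\" in right:
        x, y = right.rsplit("\\", 1)
        if y == left:
            return strip_parens(x)
    return None

LEXICON = {"Anna": "NP", "married": "(S\\NP)/NP", "Manny": "NP"}
vp = forward_apply(LEXICON["married"], LEXICON["Manny"])  # (S\NP)/NP + NP -> S\NP
print(vp)                                   # S\NP
print(backward_apply(LEXICON["Anna"], vp))  # NP + S\NP -> S
```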
APA, Harvard, Vancouver, ISO, and other styles
32

Nikanne, Urpo. "Locative Case Adjuncts in Finnish: Notes on Syntactico-Semantic Interface." Nordic Journal of Linguistics 20, no. 2 (December 1997): 155–78. http://dx.doi.org/10.1017/s0332586500004091.

Full text
Abstract:
This paper discusses Finnish depictive and resultative secondary predicates. The analysis is based on a condition of licensing which can be seen as a version of the principle of Full Interpretation: “A well-formed syntactic structure is licensed only if it can be linked to a well-formed conceptual structure.” The suggested non-trivial theory of the interface between syntax and semantics allows us to avoid unnecessary complexity in syntax, such as small clause and PRO, in secondary predicate adjuncts.
APA, Harvard, Vancouver, ISO, and other styles
33

Orešnik, Janez. "Transitivity in natural syntax : ergative languages." Linguistica 49, no. 1 (December 29, 2009): 65–93. http://dx.doi.org/10.4312/linguistica.49.1.65-93.

Full text
Abstract:
The paper implements the framework of Natural Syntax and treats various phenomena bearing on transitivity using the language material of ergative languages. In each case one ergative and one antipassive construction are compared, and certain properties of such pairs are predicted. A novel point of the paper is that it is necessary to distinguish between less and more transitive antipassive constructions. In the more transitive ones the agent and the patient are coded with the ergative case, the absolutive case, the nominative case, or the patient is integrated (at least to some extent) into the corresponding verb. More transitive antipassive constructions and the corresponding ergative constructions remain transitive. Because transitivity represents an unnatural environment, the alignment of the corresponding naturalness values is chiastic. The remaining antipassive constructions are less transitive, so that any pair consisting of such a construction and of the corresponding ergative construction withdraws from the unnatural environment of transitivity, and hence the alignment of the corresponding naturalness values is parallel. Another unnatural environment is represented by the patient just in case its syntactic, not semantic, properties are treated. Consequently, the alignment of the corresponding naturalness values is chiastic. The paper discusses 18 ergative languages, mostly from the Caucasus and the Pacific Ocean.
APA, Harvard, Vancouver, ISO, and other styles
34

Roy, Deb. "Learning visually grounded words and syntax of natural spoken language." Evolution of Communication 4, no. 1 (December 31, 2001): 33–56. http://dx.doi.org/10.1075/eoc.4.1.04roy.

Full text
Abstract:
Properties of the physical world have shaped human evolutionary design and given rise to physically grounded mental representations. These grounded representations provide the foundation for higher level cognitive processes including language. Most natural language processing machines to date lack grounding. This paper advocates the creation of physically grounded language learning machines as a path toward scalable systems which can conceptualize and communicate about the world in human-like ways. As steps in this direction, two experimental language acquisition systems are presented. The first system, CELL, is able to learn acoustic word forms and associated shape and color categories from fluent untranscribed speech paired with video camera images. In evaluations, CELL has successfully learned from spontaneous infant-directed speech. A version of CELL has been implemented in a robotic embodiment which can verbally interact with human partners. The second system, DESCRIBER, acquires a visually-grounded model of natural language which it uses to generate spoken descriptions of objects in visual scenes. Input to DESCRIBER’s learning algorithm consists of computer generated scenes paired with natural language descriptions produced by a human teacher. DESCRIBER learns a three-level language model which encodes syntactic and semantic properties of phrases, word classes, and words. The system learns from a simple ‘show-and-tell’ procedure, and once trained, is able to generate semantically appropriate, contextualized, and syntactically well-formed descriptions of objects in novel scenes.
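As a rough illustration of the cross-modal association idea behind CELL (not the actual system, which learns from untranscribed speech paired with camera images), the snippet below counts word/visual-category co-occurrences over a few invented paired observations and keeps the strongest pairing per word.

```python
# Toy sketch of cross-modal word-to-category association; data and category names
# are invented for illustration. Real systems filter pairings by mutual information.
from collections import Counter, defaultdict

observations = [
    ("look at the red ball", {"red", "ball_shape"}),
    ("a red cup", {"red", "cup_shape"}),
    ("the blue ball rolls", {"blue", "ball_shape"}),
]

cooccurrence = defaultdict(Counter)
for utterance, visual_categories in observations:
    for word in utterance.split():
        for category in visual_categories:
            cooccurrence[word][category] += 1

# Keep the best-matching visual category for words seen more than once.
lexicon = {
    word: counts.most_common(1)[0][0]
    for word, counts in cooccurrence.items()
    if sum(counts.values()) > 1
}
print(lexicon.get("red"), lexicon.get("ball"))  # red ball_shape
```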
APA, Harvard, Vancouver, ISO, and other styles
35

Choudhary, Jaytrilok, and Deepak Singh Tomar. "Semi-Automated Ontology building through Natural Language Processing." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 13, no. 8 (August 23, 2014): 4738–46. http://dx.doi.org/10.24297/ijct.v13i8.7072.

Full text
Abstract:
Ontology is a backbone of the semantic web and is used for domain knowledge representation. An ontology provides the platform for effective extraction of information. Usually, an ontology is developed manually, but manual ontology construction requires a lot of effort from domain experts; it is also time-consuming and costly. Thus, an approach to building an ontology in a semi-automated manner has been proposed. The proposed approach extracts concepts automatically from the open directory Dmoz. The Stanford Parser is used to parse natural language syntax and extract the parts of speech, which are used to form the relationships among the concepts. The experimental results show a fair degree of accuracy, which may be improved in the future with a more sophisticated approach.
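As a rough sketch of the POS-based extraction step described above (illustrative only, using NLTK's off-the-shelf tagger as a stand-in for the Stanford Parser), one could collect nouns as candidate concepts and verbs as candidate relation labels:

```python
# Rough illustration of POS-based concept extraction for ontology building.
# NLTK's tagger stands in for the Stanford Parser; resource names may vary
# across NLTK versions, so download the tagger model once if it is missing.
import nltk

nltk.download("averaged_perceptron_tagger", quiet=True)

def extract_concepts(sentence: str):
    """Collect nouns as candidate ontology concepts and verbs as candidate relation labels."""
    tagged = nltk.pos_tag(sentence.split())
    concepts = [word.lower() for word, tag in tagged if tag.startswith("NN")]
    relations = [word.lower() for word, tag in tagged if tag.startswith("VB")]
    return concepts, relations

print(extract_concepts("An ontology describes the concepts and relations of a domain"))
# (['ontology', 'concepts', 'relations', 'domain'], ['describes'])
```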
APA, Harvard, Vancouver, ISO, and other styles
36

Sukaton, Ounu. "Semantics in Natural Language Processing and Language Teaching." ELS Journal on Interdisciplinary Studies in Humanities 2, no. 1 (March 28, 2019): 58–65. http://dx.doi.org/10.34050/els-jish.v2i1.6008.

Full text
Abstract:
Semantics is one of the key elements in the study of language. This article’s main objective is to describe and evaluate how semantic theories are implemented in the field of Natural Language Processing as well as Language Teaching. There are two branches of semantic theories that this article focuses on: Formal Semantics and Natural Semantic Metalanguage. The strengths and weaknesses of the theories mentioned are discussed. The most versatile theory is suggested along with future improvements.
APA, Harvard, Vancouver, ISO, and other styles
37

Wu, Zhaofeng, Hao Peng, and Noah A. Smith. "Infusing Finetuning with Semantic Dependencies." Transactions of the Association for Computational Linguistics 9 (2021): 226–42. http://dx.doi.org/10.1162/tacl_a_00363.

Full text
Abstract:
For natural language processing systems, two kinds of evidence support the use of text representations from neural language models “pretrained” on large unannotated corpora: performance on application-inspired benchmarks (Peters et al., 2018, inter alia), and the emergence of syntactic abstractions in those representations (Tenney et al., 2019, inter alia). On the other hand, the lack of grounded supervision calls into question how well these representations can ever capture meaning (Bender and Koller, 2020). We apply novel probes to recent language models, specifically focusing on predicate-argument structure as operationalized by semantic dependencies (Ivanova et al., 2012), and find that, unlike syntax, semantics is not brought to the surface by today’s pretrained models. We then use convolutional graph encoders to explicitly incorporate semantic parses into task-specific finetuning, yielding benefits to natural language understanding (NLU) tasks in the GLUE benchmark. This approach demonstrates the potential for general-purpose (rather than task-specific) linguistic supervision, above and beyond conventional pretraining and finetuning. Several diagnostics help to localize the benefits of our approach.
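The paper's encoder is not reproduced here; as an illustrative sketch of the general idea of mixing token representations along semantic-dependency edges, a single toy graph-convolution step in plain PyTorch might look like this (dimensions and the single edge shown are invented):

```python
# Toy sketch: one graph-convolution step that mixes token representations along
# semantic-dependency edges (illustrative only; not the paper's actual encoder).
import torch
import torch.nn as nn

class DependencyGraphConv(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, token_states: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq, dim); adjacency: (batch, seq, seq), 1 where an edge holds.
        # Row-normalize so each token averages over its graph neighbours (plus itself).
        adj = adjacency + torch.eye(adjacency.size(-1)).unsqueeze(0)
        adj = adj / adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        neighbour_mix = torch.bmm(adj, token_states)
        return torch.relu(self.linear(neighbour_mix))

batch, seq, dim = 2, 5, 16
states = torch.randn(batch, seq, dim)   # e.g., outputs of a pretrained encoder
edges = torch.zeros(batch, seq, seq)
edges[:, 0, 2] = 1.0                    # a hypothetical predicate-argument edge
enriched = DependencyGraphConv(dim)(states, edges)
print(enriched.shape)                   # torch.Size([2, 5, 16])
```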
APA, Harvard, Vancouver, ISO, and other styles
38

Manca, Vicenzo, and María Dolores Jiménez López. "GNS: Abstract Syntax for Natural Languages." Triangle, no. 8 (June 29, 2018): 55. http://dx.doi.org/10.17345/triangle8.55-79.

Full text
Abstract:
This paper presents an overview of General Natural Syntax (GNS), a formal theory of general explicative power that generalizes and formalizes syntactic concepts in order to oer a general notion of syntax that is independent of any particular language.
APA, Harvard, Vancouver, ISO, and other styles
39

James, Praveen Edward, Mun Hou Kit, Chockalingam Aravind Vaithilingam, and Alan Tan Wee Chiat. "An Integrated Process Based Natural Language Processing System." Journal of Computational and Theoretical Nanoscience 17, no. 4 (April 1, 2020): 1842–46. http://dx.doi.org/10.1166/jctn.2020.8451.

Full text
Abstract:
Natural Language Processing (NLP) systems involve Natural Language Understanding (NLU), Dialogue Management (DM) and Natural Language Generation (NLG). The purpose of this work is to integrate learning from examples with rule-based processing to design an NLP system. The design involves a three-stage processing framework, which combines syntactic generation, semantic extraction and a strong rule-based control. The syntactic generator generates syntax by aligning sentences with Part-of-Speech (POS) tags, limited by the number of words in the lexicon. The semantic extractor extracts meaningful keywords from the queries raised. The above two modules are controlled through generalized rules by the rule-based controller module. The system is evaluated under different domains. The results reveal that the accuracy of the system is 92.33% on average. The design process is simple, and the processing time is 2.12 seconds, which is minimal compared to similar statistical models. The performance of an NLP tool in a certain task can be estimated by the quality of its predictions on the classification of unseen data. The results reveal performance similar to that of existing systems, indicating that the system can be used for similar tasks. The system supports a vocabulary of about 700 words and can be used as an NLP module in a spoken dialogue system for various domains or task areas.
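A minimal, hypothetical sketch of the three-stage split described above (syntactic generation, semantic keyword extraction, rule-based control) is given below; the lexicon, domain keywords, and rule are invented for illustration and do not reflect the authors' system.

```python
# Hypothetical three-stage pipeline sketch: POS alignment, keyword extraction,
# and a rule-based controller. All rules, lexicons and domain words are assumptions.
TOY_LEXICON = {
    "book": "VERB", "cancel": "VERB", "a": "DET", "the": "DET",
    "flight": "NOUN", "ticket": "NOUN", "to": "PREP", "paris": "NOUN",
}
DOMAIN_KEYWORDS = {"flight", "ticket", "paris"}

def syntactic_generator(utterance: str):
    """Stage 1: align each word with a POS tag from a small closed lexicon."""
    return [(w, TOY_LEXICON.get(w, "UNK")) for w in utterance.lower().split()]

def semantic_extractor(tagged):
    """Stage 2: keep content words that match the domain keyword list."""
    return [w for w, tag in tagged if w in DOMAIN_KEYWORDS]

def rule_based_controller(tagged, keywords):
    """Stage 3: a generalized rule decides what the dialogue system should do next."""
    verbs = [w for w, tag in tagged if tag == "VERB"]
    if verbs and keywords:
        return f"action={verbs[0]}, slots={keywords}"
    return "clarify"  # fall back when no rule fires

tagged = syntactic_generator("Book a flight to Paris")
print(rule_based_controller(tagged, semantic_extractor(tagged)))
# action=book, slots=['flight', 'paris']
```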
APA, Harvard, Vancouver, ISO, and other styles
40

THOMPSON, ELLEN. "Temporal dependency and the syntax of subjects." Journal of Linguistics 37, no. 2 (July 2001): 287–312. http://dx.doi.org/10.1017/s0022226701008854.

Full text
Abstract:
This article explores the interface between the syntactic and semantic representation of natural language with respect to the interpretation of time. The main claim of the paper is that the semantic relationship of temporal dependency requires syntactic locality at LF. Based on this claim, I explore the syntax and semantics of gerundive relative clauses. I argue that since gerundive relatives are temporally dependent on the tense of the main clause, they need to be local with a temporal element of the main clause at LF. I show that gerundive relatives receive different temporal interpretations depending on their syntactic position at LF. This analysis sheds light on the behavior of gerundive relatives in constructions involving coordination, existential there, scope of quantificational and cardinality adverbials, extraposition, presuppositionality effects and binding-theoretic reconstruction effects.
APA, Harvard, Vancouver, ISO, and other styles
41

Amirbekova Aigul Baydebekkyzy, Khabiyeva Almagul, Soltanbekova Alfia, and Taubaldiyev Meirambek. "COMPATIBILITY OF LANGUAGE UNITS IN THE KAZAKH LANGUAGE AND THEIR CAPABILITY." Science Review, no. 7(24) (September 30, 2019): 36–39. http://dx.doi.org/10.31435/rsglobal_sr/30092019/6682.

Full text
Abstract:
The article deals with the concept of valency as a phenomenon lying at the confluence of syntax and lexical semantics. The paper also presents the types of valency and the directions in which the theory of valency is considered. Valency in the broad sense of the word refers to the capacity of a language unit to enter into communication with other units of a particular order. The objectivity and the scientific and practical significance of the theory of valency are determined by the lexical-semantic potential of the word. Semantic valency is based on the logical semes of the word's semantics. These semes are consistent with the logical semes of another word's meaning; as a result, the given word demonstrates a combining capability with that word. This is considered to be its semantic valency. We have attempted to identify and investigate a peculiar kind of valency in the Kazakh language. We use the concepts of valency and compatibility as synonyms, although in a number of works they are distinguished.
APA, Harvard, Vancouver, ISO, and other styles
42

جمعة جمعة, مؤيد, and زينب محمود الكواز. "Syntax-Semantics Interface in Linguistic Theory." Al-Adab Journal 1, no. 122 (December 9, 2018): 41–66. http://dx.doi.org/10.31973/aj.v1i122.233.

Full text
Abstract:
According to a process called selected focusing, the linguist, in order to produce a coherent statement or an adequate description, has to focus on one aspect of a language and exclude the others. Yet such isolation is only an artificial element. A layman or a child does not have the least idea about the various levels of language. Yet he is very well equipped with the grammatical, structural, and semantic tools that help him instantly identify the ill-formed or meaningless sentences of his native language, since language is learned and taught as a whole. With regard to the syntax-semantics interface in the linguistic literature, two opposite mainstreams have been found: a syntactically-oriented perspective (Chomsky 1957, 65, 79, 81, Cullicover 1976, Radford 1988, Horrock 1987, and Haegman 1992), modified and supported later on by the Optimality Theory approach (henceforth OT) established by Alan Prince and Paul Smolensky (1993), and a semantically-oriented one in its two facets, the generative and the interpretive (Jerrold J. Katz & Jerry A. Fodor: 1963, George Lakoff 1963), developed in some of its aspects by Charles Fillmore's case grammar (1968). Furthermore, a great deal of effort has been made, in line with these two opposite approaches, to produce experimental psycholinguistic and neurolinguistic studies that support or reject one or both of them (Millar & Mckean 1964, Savin & Perchonock 1965, and Clifton & Odom 1966, Gleason, J. & Ratner, N. 1993, Friederici, Angela D., & Jürgen Weissenborn 2007). The early generative transformational approach went too far in insisting that the syntactic aspect has an autonomous character and should be dealt with in isolation from semantics; others argue that the two are interrelated and cannot be separated. Some linguists, such as the generative semanticists, consider semantics more basic in grammatical description than syntax, whereas others hold a totally reversed approach, assuming that semantics cannot be described and should be considered an extra-linguistic element. This paper is an attempt to shed some light on this serious linguistic controversy and to arrive at some general outlines that might help linguistic theorists, second/foreign language teachers and students to establish a scientific scheme in dealing with language.
APA, Harvard, Vancouver, ISO, and other styles
43

Wybraniec-Skardowska, Urszula. "On Language Adequacy." Studies in Logic, Grammar and Rhetoric 40, no. 1 (March 1, 2015): 257–92. http://dx.doi.org/10.1515/slgr-2015-0013.

Full text
Abstract:
The paper concentrates on the problem of adequate reflection of fragments of reality via expressions of language and inter-subjective knowledge about these fragments, called here, in brief, language adequacy. This problem is formulated in several aspects, the most general one being: the compatibility of the language syntax with its bi-level semantics: intensional and extensional. In this paper, various aspects of language adequacy find their logical explication on the ground of the formal-logical theory of syntax T of any categorial language L generated by the so-called classical categorial grammar, and also on the ground of its extension to the bi-level, intensional and extensional semantic-pragmatic theory ST for L. In T, according to the token-type distinction of Ch. S. Peirce, L is characterized first as a language of well-formed expression-tokens (wfe-tokens) - material, concrete objects - and then as a language of wfe-types - abstract objects, classes of wfe-tokens. In ST the semantic-pragmatic notions of meaning and interpretation for wfe-types of L of intensional semantics and the notion of denotation of extensional semantics for wfe-types and constituents of knowledge are formalized. These notions allow formulating a postulate (an axiom of categorial adequacy) from which follow all the most important conditions of the language adequacy, including the above, and a structural one connected with three principles of compositionality.
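As a purely illustrative aside on the compositionality principle the paper appeals to (not the author's formal theories T or ST), the toy below computes the denotation of a complex expression by applying the denotation of its functor to the denotation of its argument; the two-word language is an assumption for illustration.

```python
# Toy illustration of compositionality: the denotation of a complex expression is
# the functional application of the denotations of its parts (mini-language invented).
DENOTATION = {
    "Socrates": "socrates",                             # an individual constant
    "is_mortal": lambda x: x in {"socrates", "plato"},  # a one-place predicate
}

def denote(expr):
    """Map a syntactic object (word or (functor, argument) pair) to its denotation."""
    if isinstance(expr, str):
        return DENOTATION[expr]
    functor, argument = expr
    return denote(functor)(denote(argument))            # the compositional step

# The syntax pairs the predicate with its subject; semantics follows the same structure.
print(denote(("is_mortal", "Socrates")))  # True
```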
APA, Harvard, Vancouver, ISO, and other styles
44

Anderson, John M. "Structuralism and Autonomy: From Saussure to Chomsky." Historiographia Linguistica International Journal for the History of the Language Sciences 32, no. 1-2 (2005): 117–48. http://dx.doi.org/10.1075/hl.32.1-2.06and.

Full text
Abstract:
Structuralism sought to introduce various kinds of autonomy into the study of language, including the autonomy of that study itself. The basis for this was the insistence on categorial autonomy, whereby categories are identified language-internally (whether in a particular language or in language generally). In relation to phonology, categorial autonomy has generally been tempered by grounding: the categories correlate (at least prototypically) with substance, phonetic properties. This is manifested in the idea of ‘natural classes’ in generative phonology, for instance. Usually, however, and particularly since Bloomfield, no such grounding (in meaning) has been attributed to syntax. This attitude culminates in the principle of the autonomy of syntax which was put forward in transformational-generative grammar. Such an attitude can be contrasted not merely with most pre-structural linguistics but also, in its severity, with other developments in structuralism. In present-day terms, the groundedness of syntax assumes that only the behaviour of semantically typical members of a category determines its basic syntax, and this syntax reflects the semantic properties; groundedness filters out potential syntactic analyses that are incompatible with this.
APA, Harvard, Vancouver, ISO, and other styles
45

Anderson, John M. "Structuralism and Autonomy." Historiographia Linguistica 32, no. 1-2 (June 8, 2005): 117–48. http://dx.doi.org/10.1075/hl.32.2.06and.

Full text
Abstract:
Structuralism sought to introduce various kinds of autonomy into the study of language, including the autonomy of that study itself. The basis for this was the insistence on categorial autonomy, whereby categories are identified language-internally (whether in a particular language or in language generally). In relation to phonology, categorial autonomy has generally been tempered by grounding: the categories correlate (at least prototypically) with substance, phonetic properties. This is manifested in the idea of ‘natural classes’ in generative phonology, for instance. Usually, however, and particularly since Bloomfield, no such grounding (in meaning) has been attributed to syntax. This attitude culminates in the principle of the autonomy of syntax which was put forward in transformational-generative grammar. Such an attitude can be contrasted not merely with most pre-structural linguistics but also, in its severity, with other developments in structuralism. In present-day terms, the groundedness of syntax assumes that only the behaviour of semantically typical members of a category determines its basic syntax, and this syntax reflects the semantic properties; groundedness filters out potential syntactic analyses that are incompatible with this.
APA, Harvard, Vancouver, ISO, and other styles
46

Penz, Yuri, and Ana Maria Tramunt Ibaños. "Tardis & Tame: an Essay on Natural Language Meaning and Metaphysics." Scripta 24, no. 51 (September 23, 2020): 71–102. http://dx.doi.org/10.5752/p.2358-3428.2020v24n51p71-102.

Full text
Abstract:
This paper theoretically approaches the relationship between the instances of time and space insofar as natural language conveys their manifestation, focusing on meaning, mainly represented by the subdiscipline of Semantics, along with some insights on Syntax and Pragmatics. The ontology designed at the outset is mainly composed of categories of TAME (tense, aspect, mood and evidentiality/eventology) instantiated by linguistic phenomena that yield anchoring, displacement and aboutness properties. To cover this wide range of linguistic manifestations, the scope takes in the verbal semantics of Brazilian Portuguese (henceforth BrP), seeking to handle the lexical nature of entries in this language and their metaphysical counterpart in meaning. This sort of approach intends to illustrate the proper balance for the formal device of a semantic component regarding these language parameters on TAME and the correlation of their principles in human language, by means of what this paper intends to coin as TARDIS. The paper comprises three sections: a) theoretical, introducing the properties of each category of TAME throughout the history of Linguistics and Semantics; b) methodological, characterizing the lexical/metaphysical dualism for Formal Semantic approaches and their correlation to time and space and some other non-logical privileged concepts entertained by semantic knowledge; c) epistemological and analytical, considering BrP as the parameter for TARDIS’ properties conveyed by natural language meaning throughout different categories of TAME and their correlation to some principles which seem to concern human cognition, focusing mainly on modality.
APA, Harvard, Vancouver, ISO, and other styles
47

Egli, Urs. "Stoic syntax and semantics." Historiographia Linguistica 13, no. 2-3 (January 1, 1986): 281–306. http://dx.doi.org/10.1075/hl.13.2-3.09egl.

Full text
Abstract:
The Stoic theory of loquia (lekta) contained a fairly explicit statement of formation rules. It is argued that one type of rule was called syntaxis (combination or phrase structure rule) by Chrysippus (e.g., “a subject in the nominative case and a complete predicate form a statement”). Two other types of rule were assignments of words to lexical categories (“Dion is a Noun Phrase”) and subsumption rules (“Every elementary statement is a statement”), often formulated in the form of subdivisions of concepts. A fourth type of rule seems to have been the class of transformations (enklisis, e.g., “A statement transformed by the preterite transformation is a statement”). Every syntactic rule was accompanied by a semantic interpretation according to a version of the compositionality principle familiar in modern times since Frege and elaborated by Montague and his followers. Though the concrete example of a syntax was a fairly elaborate version of some sort of Montague type or definite clause grammar, there was no effort to introduce a theory of grammar in the style of Chomsky. But the texts show awareness of the problem of the infinity of structure generated and of the concept of structural ambiguity. The Stoic system has been transformed into the formulation of the Word and Paradigm Grammar of the technical grammarians – “transformation” (enklisis) was the historical antecedent of paragôgê, declinatio, “inflection”, etc. Some formulations have survived into modern times, e.g., the notion of government, for which Stoic type formulations like “a deficient predicate can be combined with a subject in the accusative case to form a complete predicate” are a historical antecedent.
APA, Harvard, Vancouver, ISO, and other styles
48

Amirbekova, A. "THE VALENCE PROPERTY OF NEOLOGISMS IN KAZAKH LANGUAGE." BULLETIN Series of Philological Sciences 73, no. 3 (July 15, 2020): 14–21. http://dx.doi.org/10.51889/2020-3.1728-7804.02.

Full text
Abstract:
The article deals with the concept of valency as a phenomenon lying at the confluence of syntax and lexical semantics. The paper also presents the types of valency and the directions in which the theory of valency is considered. Valency in the broad sense of the word refers to the capacity of a language unit to enter into communication with other units of a particular order. The objectivity and the scientific and practical significance of the theory of valency are determined by the lexical-semantic potential of the word. Semantic valency is based on the logical semes of the word's semantics. These semes are consistent with the logical semes of another word's meaning; as a result, the given word demonstrates a combining capability with that word. This is considered to be its semantic valency. We have attempted to identify and investigate a peculiar kind of valency in the Kazakh language.
APA, Harvard, Vancouver, ISO, and other styles
49

Long, Congjun, Xuewen Zhou, and Maoke Zhou. "Recognition of Tibetan Maximal-length Noun Phrases Based on Syntax Tree." ACM Transactions on Asian and Low-Resource Language Information Processing 20, no. 2 (March 30, 2021): 1–13. http://dx.doi.org/10.1145/3423324.

Full text
Abstract:
Frequently corresponding to syntactic components, the Maximal-length Noun Phrase (MNP) possesses abundant syntactic and semantic information and plays a certain semantic role in sentences. Recognition of MNPs plays an important role in Natural Language Processing and lays the foundation for analyzing and understanding sentence structure and semantics. By comparing the essence of different MNPs, this article defines the MNP in the Tibetan language from the perspective of the syntax tree. A total of 6,038 sentences are extracted from the syntax tree corpus; the structure types, boundary features, and frequencies of MNPs are analyzed; and the MNPs are recognized by applying a sequence tagging model and a syntactic analysis model. The accuracy, recall, and F1 score of the recognition results of the sequence tagging model are 87.14%, 84.72%, and 85.92%, respectively. The accuracy, recall, and F1 score of the recognition results of the syntactic analysis model are 87.66%, 87.63%, and 87.65%, respectively.
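As an illustration of how sequence-tagging output is typically scored at the phrase level (not the authors' code; the tags and spans below are invented), BIO tags can be converted to spans and compared against gold spans:

```python
# Small illustration of span-level scoring for phrase recognition via BIO tags,
# in the spirit of the sequence-tagging evaluation reported above (tags are made up).
def bio_to_spans(tags):
    """Convert a BIO tag sequence into (start, end_exclusive) phrase spans."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "B":
            if start is not None:
                spans.append((start, i))
            start = i
        elif tag == "O":
            if start is not None:
                spans.append((start, i))
                start = None
    if start is not None:
        spans.append((start, len(tags)))
    return spans

def precision_recall_f1(gold_tags, predicted_tags):
    gold, pred = set(bio_to_spans(gold_tags)), set(bio_to_spans(predicted_tags))
    correct = len(gold & pred)
    precision = correct / len(pred) if pred else 0.0
    recall = correct / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

gold = ["B", "I", "O", "B", "I", "I", "O"]
pred = ["B", "I", "O", "B", "O", "O", "O"]
print(precision_recall_f1(gold, pred))  # (0.5, 0.5, 0.5)
```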
APA, Harvard, Vancouver, ISO, and other styles
50

Baud, R. H., A. M. Rassinoux, J. C. Wagner, C. Lovis, C. Juge, L. L. Alpay, P. A. Michel, P. Degoulet, and J. R. Scherrer. "Representing Clinical Narratives Using Conceptual Graphs." Methods of Information in Medicine 34, no. 01/02 (1995): 176–86. http://dx.doi.org/10.1055/s-0038-1634586.

Full text
Abstract:
The analysis of medical narratives and the generation of natural language expressions are strongly dependent on the existence of an adequate representation language. Such a language has to be expressive enough to handle the complexity of human reasoning in the domain. Sowa’s Conceptual Graphs (CG) are an answer, and this paper presents a multilingual implementation, using French, English and German. Current developments demonstrate the feasibility of an approach to natural language understanding where semantic aspects are dominant, in contrast to syntax-driven methods. The basic idea is to aggregate blocks of words according to semantic compatibility rules, following a method called Proximity Processing. The CG representation is gradually built, starting from single words in a semantic lexicon, to finally give a complete representation of the sentence in the form of a single CG. The process is dependent on specific rules of the medical domain, and for this reason is largely controlled by the declarative knowledge of the medical Linguistic Knowledge Base.
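As a minimal data-structure sketch in the spirit of Sowa's conceptual graphs (illustrative only, not the multilingual system described above), a clinical phrase can be represented as concept nodes linked by named relations:

```python
# Minimal data-structure sketch of a Sowa-style conceptual graph for a clinical phrase,
# with concept nodes linked by named relations (illustrative only).
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Concept:
    type_label: str       # e.g., "Fracture"
    referent: str = "*"   # "*" stands for a generic (unspecified) referent

@dataclass
class ConceptualGraph:
    relations: list = field(default_factory=list)  # (source, relation_name, target)

    def add(self, source: Concept, relation: str, target: Concept):
        self.relations.append((source, relation, target))

    def __str__(self):
        return "\n".join(
            f"[{s.type_label}: {s.referent}] -({r})-> [{t.type_label}: {t.referent}]"
            for s, r, t in self.relations
        )

# "fracture of the left femur" as a tiny graph.
graph = ConceptualGraph()
fracture, femur = Concept("Fracture"), Concept("Femur")
graph.add(fracture, "location", femur)
graph.add(femur, "laterality", Concept("Left"))
print(graph)
```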
APA, Harvard, Vancouver, ISO, and other styles
