
Journal articles on the topic 'Probabilistic grammar'



Consult the top 50 journal articles for your research on the topic 'Probabilistic grammar.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Nitay, Dolav, Dana Fisman, and Michal Ziv-Ukelson. "Learning of Structurally Unambiguous Probabilistic Grammars." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 10 (2021): 9170–78. http://dx.doi.org/10.1609/aaai.v35i10.17107.

Full text
Abstract:
The problem of identifying a probabilistic context-free grammar has two aspects: the first is determining the grammar's topology (the rules of the grammar) and the second is estimating probabilistic weights for each rule. Given the hardness results for learning context-free grammars in general, and probabilistic grammars in particular, most of the literature has concentrated on the second problem. In this work we address the first problem. We restrict attention to structurally unambiguous weighted context-free grammars (SUWCFG) and provide a query learning algorithm for structurally unambiguous …
2

KROTOV, ALEXANDER, MARK HEPPLE, ROBERT GAIZAUSKAS, and YORICK WILKS. "Evaluating two methods for Treebank grammar compaction." Natural Language Engineering 5, no. 4 (1999): 377–94. http://dx.doi.org/10.1017/s1351324900002308.

Full text
Abstract:
Treebanks, such as the Penn Treebank, provide a basis for the automatic creation of broad coverage grammars. In the simplest case, rules can simply be ‘read off’ the parse-annotations of the corpus, producing either a simple or probabilistic context-free grammar. Such grammars, however, can be very large, presenting problems for the subsequent computational costs of parsing under the grammar. In this paper, we explore ways by which a treebank grammar can be reduced in size or ‘compacted’, which involve the use of two kinds of technique: (i) thresholding of rules by their number of occurrences; …
3

Szmrecsanyi, Benedikt. "Diachronic Probabilistic Grammar." English Language and Linguistics 19, no. 3 (2013): 41–68. http://dx.doi.org/10.17960/ell.2013.19.3.002.

Full text
4

Daland, Robert. "Long words in maximum entropy phonotactic grammars." Phonology 32, no. 3 (2015): 353–83. http://dx.doi.org/10.1017/s0952675715000251.

Full text
Abstract:
A phonotactic grammar assigns a well-formedness score to all possible surface forms. This paper considers whether phonotactic grammars should be probabilistic, and gives several arguments that they need to be. Hayes & Wilson (2008) demonstrate the promise of a maximum entropy Harmonic Grammar as a probabilistic phonotactic grammar. This paper points out a theoretical issue with maxent phonotactic grammars: they are not guaranteed to assign a well-defined probability distribution, because sequences that contain arbitrary repetitions of unmarked sequences may be underpenalised. …
5

Shih, Stephanie S. "Constraint conjunction in weighted probabilistic grammar." Phonology 34, no. 2 (2017): 243–68. http://dx.doi.org/10.1017/s0952675717000136.

Full text
Abstract:
This paper examines a key difference between constraint conjunction and constraint weight additivity, arguing that the two do not have the same empirical coverage. In particular, constraint conjunction in weighted probabilistic grammar allows for superadditive constraint interaction, where the effect of violating two constraints goes beyond the additive combination of the two constraints’ weights alone. A case study from parasitic tone harmony in Dioula d'Odienné demonstrates superadditive local and long-distance segmental feature similarities that increase the likelihood of tone harmony. …
6

CASACUBERTA, FRANCISCO. "GROWTH TRANSFORMATIONS FOR PROBABILISTIC FUNCTIONS OF STOCHASTIC GRAMMARS." International Journal of Pattern Recognition and Artificial Intelligence 10, no. 03 (1996): 183–201. http://dx.doi.org/10.1142/s0218001496000153.

Full text
Abstract:
Stochastic Grammars are the most usual models in Syntactic Pattern Recognition. Both components of a Stochastic Grammar, the characteristic grammar and the probabilities attached to the rules, can be learnt automatically from training samples. In this paper, first a review of some algorithms is presented to infer the probabilistic component of Stochastic Regular and Context-Free Grammars under the framework of the Growth Transformations. On the other hand, with Stochastic Grammars, the patterns must be represented as strings over a finite set of symbols. However, the most natural representation …
7

Han, Young S., and Key-Sun Choi. "Best parse parsing with Earley's and Inside algorithms on probabilistic RTN." Natural Language Engineering 1, no. 2 (1995): 147–61. http://dx.doi.org/10.1017/s1351324900000127.

Full text
Abstract:
Inside parsing is a best parse parsing method based on the Inside algorithm that is often used in estimating probabilistic parameters of stochastic context-free grammars. It gives a best parse in O(N³G³) time, where N is the input size and G is the grammar size. The Earley algorithm can be made to return best parses with the same complexity in N. By way of experiments, we show that Inside parsing can be more efficient than Earley parsing with a sufficiently large grammar and sufficiently short input sentences. For instance, Inside parsing is better with sentences of 16 or fewer words for a grammar …
8

Kita, Kenji. "Mixture Probabilistic Context-Free Grammar." Journal of Natural Language Processing 3, no. 4 (1996): 103–13. http://dx.doi.org/10.5715/jnlp.3.4_103.

Full text
9

DAI, Yin-Tang, Cheng-Rong WU, Sheng-Xiang MA, and Yi-Ping ZHONG. "Hierarchically Classified Probabilistic Grammar Parsing." Journal of Software 22, no. 2 (2011): 245–57. http://dx.doi.org/10.3724/sp.j.1001.2011.03809.

Full text
10

Arthi, K., and Kamala Krithivasan. "Probabilistic Parallel Communicating Grammar Systems." International Journal of Computer Mathematics 79, no. 1 (2002): 1–26. http://dx.doi.org/10.1080/00207160211914.

Full text
11

Lioutikov, Rudolf, Guilherme Maeda, Filipe Veiga, Kristian Kersting, and Jan Peters. "Learning attribute grammars for movement primitive sequencing." International Journal of Robotics Research 39, no. 1 (2019): 21–38. http://dx.doi.org/10.1177/0278364919868279.

Full text
Abstract:
Movement primitives are a well-studied and widely applied concept in modern robotics. However, composing primitives out of an existing library has been shown to be a challenging problem. We propose the use of probabilistic context-free grammars to sequence a series of primitives to generate complex robot policies from a given library of primitives. The rule-based nature of formal grammars allows an intuitive encoding of hierarchically structured tasks. This hierarchical concept strongly connects with the way robot policies can be learned, organized, and re-used. However, the induction of context-free …
12

BENDER, EMILY M. "Socially meaningful syntactic variation in sign-based grammar." English Language and Linguistics 11, no. 2 (2007): 347–81. http://dx.doi.org/10.1017/s1360674307002286.

Full text
Abstract:
In this article, I investigate the implications of socially meaningful sociolinguistic variation for competence grammar, working from the point of view of HPSG as a kind of performance-plausible sign-based grammar. Taking data from African American Vernacular English variable copula absence as a case study, I argue that syntactic constraints and social meaning are intertwined. I present an overview of the literature on social meaning, discuss what grammars are models of, and argue that in order to model socially meaningful variation, competence grammars need to be extended to include social meaning …
13

Winters, Thomas, Giuseppe Marra, Robin Manhaeve, and Luc De Raedt. "DeepStochLog: Neural Stochastic Logic Programming." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 9 (2022): 10090–100. http://dx.doi.org/10.1609/aaai.v36i9.21248.

Full text
Abstract:
Recent advances in neural-symbolic learning, such as DeepProbLog, extend probabilistic logic programs with neural predicates. Like graphical models, these probabilistic logic programs define a probability distribution over possible worlds, for which inference is computationally hard. We propose DeepStochLog, an alternative neural-symbolic framework based on stochastic definite clause grammars, a kind of stochastic logic program. More specifically, we introduce neural grammar rules into stochastic definite clause grammars to create a framework that can be trained end-to-end. We show that inference …
14

Mukhtar, Neelam, Mohammad Abid Khan, and Fatima TuzZuhra. "Probabilistic Context Free Grammar for Urdu." Linguistics and Literature Review 2, no. 2 (2016): 109–16. http://dx.doi.org/10.32350/llr.22.04.

Full text
15

Whiting, Mark E., Jonathan Cagan, and Philip LeDuc. "Efficient probabilistic grammar induction for design." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 32, no. 2 (2018): 177–88. http://dx.doi.org/10.1017/s0890060417000464.

Full text
Abstract:
The use of grammars in design and analysis has been set back by the lack of automated ways to induce them from arbitrarily structured datasets. Machine translation methods provide a construct for inducing grammars from coded data which have been extended to be used for design through pre-coded design data. This work introduces a four-step process for inducing grammars from un-coded structured datasets which can constitute a wide variety of data types, including many used in design. The method includes: (1) extracting objects from the data, (2) forming structures from objects, (3) …
16

Nederhof, Mark-Jan. "A General Technique to Train Language Models on Language Models." Computational Linguistics 31, no. 2 (2005): 173–85. http://dx.doi.org/10.1162/0891201054223986.

Full text
Abstract:
We show that under certain conditions, a language model can be trained on the basis of a second language model. The main instance of the technique trains a finite automaton on the basis of a probabilistic context-free grammar, such that the Kullback-Leibler distance between grammar and trained automaton is provably minimal. This is a substantial generalization of an existing algorithm to train an n-gram model on the basis of a probabilistic context-free grammar.
17

SRINIVAS, B. "Explanation-based learning and finite state transducers: applications to parsing lexicalized tree adjoining grammars." Natural Language Engineering 2, no. 4 (1996): 367–68. http://dx.doi.org/10.1017/s1351324997001642.

Full text
Abstract:
There are currently two philosophies for building grammars and parsers: hand-crafted, wide coverage grammars; and statistically induced grammars and parsers. Aside from the methodological differences in grammar construction, the linguistic knowledge which is overt in the rules of hand-crafted grammars is hidden in the statistics derived by probabilistic methods, which means that generalizations are also hidden and the full training process must be repeated for each domain. Although hand-crafted wide coverage grammars are portable, they can be made more efficient when applied to limited domains, …
18

Jin, Lifeng, Finale Doshi-Velez, Timothy Miller, William Schuler, and Lane Schwartz. "Unsupervised Grammar Induction with Depth-bounded PCFG." Transactions of the Association for Computational Linguistics 6 (December 2018): 211–24. http://dx.doi.org/10.1162/tacl_a_00016.

Full text
Abstract:
There has been recent interest in applying cognitively- or empirically-motivated bounds on recursion depth to limit the search space of grammar induction models (Ponvert et al., 2011; Noji and Johnson, 2016; Shain et al., 2016). This work extends this depth-bounding approach to probabilistic context-free grammar induction (DB-PCFG), which has a smaller parameter space than hierarchical sequence models, and therefore more fully exploits the space reductions of depth-bounding. Results for this model on grammar acquisition from transcribed child-directed speech and newswire text exceed or are comparable …
19

Claes, Jeroen. "Probabilistic Grammar: The view from Cognitive Sociolinguistics." Glossa: a journal of general linguistics 2, no. 1 (2017): 62. http://dx.doi.org/10.5334/gjgl.298.

Full text
20

Lindemann, Eric. "Virtual instrument player using probabilistic gesture grammar." Journal of the Acoustical Society of America 130, no. 4 (2011): 2432. http://dx.doi.org/10.1121/1.3654750.

Full text
21

BRESNAN, JOAN, ASHWINI DEO, and DEVYANI SHARMA. "Typology in variation: a probabilistic approach to be and n't in the Survey of English Dialects." English Language and Linguistics 11, no. 2 (2007): 301–46. http://dx.doi.org/10.1017/s1360674307002274.

Full text
Abstract:
Variation within grammars is a reflection of variation between grammars. Subject agreement and synthetic negation for the verb be show extraordinary local variation in the Survey of English Dialects (Orton et al., 1962–71). Extracting partial grammars of individuals, we confirm leveling patterns across person, number, and negation (Ihalainen, 1991; Cheshire, Edwards & Whittle, 1993; Cheshire, 1996). We find that individual variation bears striking structural resemblances to invariant dialect paradigms, and also reflects typologically observed markedness properties (Aissen, 1999). …
22

Keller, Frank, and Ash Asudeh. "Probabilistic Learning Algorithms and Optimality Theory." Linguistic Inquiry 33, no. 2 (2002): 225–44. http://dx.doi.org/10.1162/002438902317406704.

Full text
Abstract:
This article provides a critical assessment of the Gradual Learning Algorithm (GLA) for probabilistic optimality-theoretic (OT) grammars proposed by Boersma and Hayes (2001). We discuss the limitations of a standard algorithm for OT learning and outline how the GLA attempts to overcome these limitations. We point out a number of serious shortcomings with the GLA: (a) A methodological problem is that the GLA has not been tested on unseen data, which is standard practice in computational language learning. (b) We provide counterexamples, that is, attested data sets that the GLA is not able to learn …
23

Kato, Yoshihide, Shigeki Matsubara, Katsuhiko Toyama, and Yasuyoshi Inagaki. "Incremental Parsing Based on Probabilistic Context Free Grammar." IEEJ Transactions on Electronics, Information and Systems 122, no. 12 (2002): 2109–19. http://dx.doi.org/10.1541/ieejeiss1987.122.12_2109.

Full text
24

Khoufi, Nabil, Chafik Aloulou, and Lamia Hadrich Belguith. "Parsing Arabic using induced probabilistic context free grammar." International Journal of Speech Technology 19, no. 2 (2015): 313–23. http://dx.doi.org/10.1007/s10772-015-9300-x.

Full text
25

Liu, Tianqiang, Siddhartha Chaudhuri, Vladimir G. Kim, Qixing Huang, Niloy J. Mitra, and Thomas Funkhouser. "Creating consistent scene graphs using a probabilistic grammar." ACM Transactions on Graphics 33, no. 6 (2014): 1–12. http://dx.doi.org/10.1145/2661229.2661243.

Full text
26

Li, Dan, Disheng Hu, Yuke Sun, and Yingsong Hu. "3D scene reconstruction using a texture probabilistic grammar." Multimedia Tools and Applications 77, no. 21 (2018): 28417–40. http://dx.doi.org/10.1007/s11042-018-6052-z.

Full text
27

Clark, Alexander, and Nathanaël Fijalkow. "Consistent Unsupervised Estimators for Anchored PCFGs." Transactions of the Association for Computational Linguistics 8 (July 2020): 409–22. http://dx.doi.org/10.1162/tacl_a_00323.

Full text
Abstract:
Learning probabilistic context-free grammars (PCFGs) from strings is a classic problem in computational linguistics since Horning (1969). Here we present an algorithm based on distributional learning that is a consistent estimator for a large class of PCFGs that satisfy certain natural conditions including being anchored (Stratos et al., 2016). We proceed via a reparameterization of (top-down) PCFGs that we call a bottom-up weighted context-free grammar. We show that if the grammar is anchored and satisfies additional restrictions on its ambiguity, then the parameters can be directly related …
28

Carter, Ronald, and Michael McCarthy. "The English get-passive in spoken discourse: description and implications for an interpersonal grammar." English Language and Linguistics 3, no. 1 (1999): 41–58. http://dx.doi.org/10.1017/s136067439900012x.

Full text
Abstract:
Using a 1.5-million-word sample from the CANCODE spoken English corpus, we present a description of the get-passive in informal spoken British English. Previous studies of the get-passive are reviewed, and their focus on contextual and interpersonal meanings is noted. A number of related structures are then considered and the possibility of a passive gradient is discussed. The corpus sample contains 139 get-passives of the type X get + past participle (by Y) (e.g. He got killed), of which 124 occur in contexts interpreted as adversative or problematic from the speaker's viewpoint. Very few examples …
29

Auger, Julie. "Phonological variation and Optimality Theory: Evidence from word-initial vowel epenthesis in Vimeu Picard." Language Variation and Change 13, no. 3 (2001): 253–303. http://dx.doi.org/10.1017/s0954394501133016.

Full text
Abstract:
One striking feature of Vimeu Picard concerns the regular insertion of epenthetic vowels in order to break up consonant clusters and to syllabify word-initial and word-final consonants. This corpus-based study focuses on word-initial epenthesis. It provides quantitative evidence that vowel epenthesis applies categorically in some environments and variably in others. Probabilistic analysis demonstrates that the variable pattern is constrained by a complex interplay of linguistic factors. Following Labov (1972a, 1972b) and Anttila and Cho (1998), I interpret such intricate grammatical conditioning …
30

Khoufi, Nabil, Chafik Aloulou, and Lamia Hadrich Belguith. "Arabic Probabilistic Context Free Grammar Induction from a Treebank." Research in Computing Science 90, no. 1 (2015): 77–86. http://dx.doi.org/10.13053/rcs-90-1-6.

Full text
31

Ren, Gang, Zhe Wen, Xuchen Yang, Cheng Shu, Fangyu Ke, and Mark Bocko. "Representation of musical performance “grammar” using probabilistic graphical models." Journal of the Acoustical Society of America 134, no. 5 (2013): 3995. http://dx.doi.org/10.1121/1.4830573.

Full text
32

vor der Brück, Tim. "A Probabilistic Approach to Error Detection&Correction for Tree-Mapping Grammars." Prague Bulletin of Mathematical Linguistics 111, no. 1 (2018): 97–112. http://dx.doi.org/10.2478/pralin-2018-0009.

Full text
Abstract:
Rule-based natural language generation denotes the process of converting a semantic input structure into a surface representation by means of a grammar. In the following, we assume that this grammar is handcrafted and not automatically created, for instance by a deep neural network. Such a grammar might comprise a large set of rules. A single error in these rules can already have a large impact on the quality of the generated sentences, potentially causing even a complete failure of the entire generation process. Searching for errors in these rules can be quite tedious and time-consuming …
33

Zhu, Long, Yuanhao Chen, and A. Yuille. "Unsupervised Learning of Probabilistic Grammar-Markov Models for Object Categories." IEEE Transactions on Pattern Analysis and Machine Intelligence 31, no. 1 (2009): 114–28. http://dx.doi.org/10.1109/tpami.2008.67.

Full text
34

Szmrecsanyi, Benedikt, Jason Grafmiller, Benedikt Heller, and Melanie Röthlisberger. "Around the world in three alternations." English World-Wide 37, no. 2 (2016): 109–37. http://dx.doi.org/10.1075/eww.37.2.01szm.

Full text
Abstract:
We sketch a project that marries probabilistic grammar research to scholarship on World Englishes, thus synthesizing two previously rather disjoint lines of research into one unifying project with a coherent focus. This synthesis is hoped to advance usage-based theoretical linguistics by adopting a large-scale comparative and sociolinguistically responsible perspective on grammatical variation. To highlight the descriptive and theoretical benefits of the approach, we present case studies of three syntactic alternations (the particle placement, genitive, and dative alternations) in four varieties …
35

Dimopoulos, Alexandros C., Christos Pavlatos, and George Papakonstantinou. "Hardware Inexact Grammar Parser." International Journal of Pattern Recognition and Artificial Intelligence 31, no. 11 (2017): 1759025. http://dx.doi.org/10.1142/s021800141759025x.

Full text
Abstract:
In this paper, a platform is presented that, given a Stochastic Context-Free Grammar (SCFG), automatically outputs the description of a parser in synthesizable Hardware Description Language (HDL) which can be downloaded onto an FPGA (Field Programmable Gate Array) board. Although the proposed methodology can be used for various inexact models, the probabilistic model is analyzed in detail and the extension to other inexact schemes is described. Context-Free Grammars (CFG) are augmented with attributes which represent the probability values. Initially, a methodology is proposed based on the fact …
36

Goldsmith, John. "Unsupervised Learning of the Morphology of a Natural Language." Computational Linguistics 27, no. 2 (2001): 153–98. http://dx.doi.org/10.1162/089120101750300490.

Full text
Abstract:
This study reports the results of using minimum description length (MDL) analysis to model unsupervised learning of the morphological segmentation of European languages, using corpora ranging in size from 5,000 words to 500,000 words. We develop a set of heuristics that rapidly develop a probabilistic morphological grammar, and use MDL as our primary tool to determine whether the modifications proposed by the heuristics will be adopted or not. The resulting grammar matches well the analysis that would be developed by a human morphologist. In the final section, we discuss the relationship of the …
37

Nguyen, Dang Tuan, Kiet Van Nguyen, and Tin Trung Pham. "Implementing A Subcategorized Probabilistic Definite Clause Grammar for Vietnamese Sentence Parsing." International Journal on Natural Language Computing 2, no. 4 (2013): 1–19. http://dx.doi.org/10.5121/ijnlc.2013.2401.

Full text
38

Hasegawa, Yoshihiko, and Hitoshi Iba. "Estimation of Distribution Algorithm Based on Probabilistic Grammar with Latent Annotations." Transactions of the Japanese Society for Artificial Intelligence 23 (2008): 13–26. http://dx.doi.org/10.1527/tjsai.23.13.

Full text
39

Mazur, Zygmunt, and Janusz Pec. "The Use of Context-Free Probabilistic Grammar to Anonymise Statistical Data." Cybernetics and Systems 51, no. 2 (2020): 177–91. http://dx.doi.org/10.1080/01969722.2019.1705551.

Full text
40

Wong, Pak-Kan, Kwong-Sak Leung, and Man-Leung Wong. "Probabilistic grammar-based neuroevolution for physiological signal classification of ventricular tachycardia." Expert Systems with Applications 135 (November 2019): 237–48. http://dx.doi.org/10.1016/j.eswa.2019.06.012.

Full text
41

Chauve, Cedric, Julien Courtiel, and Yann Ponty. "Counting, Generating, Analyzing and Sampling Tree Alignments." International Journal of Foundations of Computer Science 29, no. 05 (2018): 741–67. http://dx.doi.org/10.1142/s0129054118420030.

Full text
Abstract:
Pairwise ordered tree alignments are combinatorial objects that appear in important applications, such as RNA secondary structure comparison. However, the usual representation of tree alignments as supertrees is ambiguous, i.e. two distinct supertrees may induce identical sets of matches between identical pairs of trees. This ambiguity is uninformative, and detrimental to any probabilistic analysis. In this work, we consider tree alignments up to equivalence. Our first result is a precise asymptotic enumeration of tree alignments, obtained from a context-free grammar by means of basic analytic combinatorics …
42

Veras, Rafael, Christopher Collins, and Julie Thorpe. "A Large-Scale Analysis of the Semantic Password Model and Linguistic Patterns in Passwords." ACM Transactions on Privacy and Security 24, no. 3 (2021): 1–21. http://dx.doi.org/10.1145/3448608.

Full text
Abstract:
In this article, we present a thorough evaluation of semantic password grammars. We report multifactorial experiments that test the impact of sample size, probability smoothing, and linguistic information on password cracking. The semantic grammars are compared with state-of-the-art probabilistic context-free grammar (PCFG) and neural network models, and tested in cross-validation and A vs. B scenarios. We present results that reveal the contributions of part-of-speech (syntactic) and semantic patterns, and suggest that the former are more consequential to the security of passwords. Our results …
43

Fraser, Alexander, Helmut Schmid, Richárd Farkas, Renjing Wang, and Hinrich Schütze. "Knowledge Sources for Constituent Parsing of German, a Morphologically Rich and Less-Configurational Language." Computational Linguistics 39, no. 1 (2013): 57–85. http://dx.doi.org/10.1162/coli_a_00135.

Full text
Abstract:
We study constituent parsing of German, a morphologically rich and less-configurational language. We use a probabilistic context-free treebank grammar that has been adapted to the morphologically rich properties of German by markovization and special features added to its productions. We evaluate the impact of adding lexical knowledge. Then we examine both monolingual and bilingual approaches to parse reranking. Our reranking parser is the new state of the art in constituency parsing of the TIGER Treebank. We perform an analysis, concluding with lessons learned, which apply to parsing …
44

ten Cate, Carel, and Kazuo Okanoya. "Revisiting the syntactic abilities of non-human animals: natural vocalizations and artificial grammar learning." Philosophical Transactions of the Royal Society B: Biological Sciences 367, no. 1598 (2012): 1984–94. http://dx.doi.org/10.1098/rstb.2012.0055.

Full text
Abstract:
The domain of syntax is seen as the core of the language faculty and as the most critical difference between animal vocalizations and language. We review evidence from spontaneously produced vocalizations as well as from perceptual experiments using artificial grammars to analyse animal syntactic abilities, i.e. abilities to produce and perceive patterns following abstract rules. Animal vocalizations consist of vocal units (elements) that are combined in a species-specific way to create higher order strings that in turn can be produced in different patterns. While these patterns differ between …
45

SHIRAI, KIYOAKI, TAKENOBU TOKUNAGA, and HOZUMI TANAKA. "Automatic Extraction of Japanese Probabilistic Context Free Grammar From a Bracketed Corpus." Journal of Natural Language Processing 4, no. 1 (1997): 125–46. http://dx.doi.org/10.5715/jnlp.4.125.

Full text
46

Bod, Rens. "From Exemplar to Grammar: A Probabilistic Analogy-Based Model of Language Learning." Cognitive Science 33, no. 5 (2009): 752–93. http://dx.doi.org/10.1111/j.1551-6709.2009.01031.x.

Full text
47

Demberg, Vera, Frank Keller, and Alexander Koller. "Incremental, Predictive Parsing with Psycholinguistically Motivated Tree-Adjoining Grammar." Computational Linguistics 39, no. 4 (2013): 1025–66. http://dx.doi.org/10.1162/coli_a_00160.

Full text
Abstract:
Psycholinguistic research shows that key properties of the human sentence processor are incrementality, connectedness (partial structures contain no unattached nodes), and prediction (upcoming syntactic structure is anticipated). There is currently no broad-coverage parsing model with these properties, however. In this article, we present the first broad-coverage probabilistic parser for PLTAG, a variant of TAG that supports all three requirements. We train our parser on a TAG-transformed version of the Penn Treebank and show that it achieves performance comparable to existing TAG parsers that …
48

Jarosz, Gaja. "Learning with hidden structure in Optimality Theory and Harmonic Grammar: beyond Robust Interpretive Parsing." Phonology 30, no. 1 (2013): 27–71. http://dx.doi.org/10.1017/s0952675713000031.

Full text
Abstract:
This paper explores the relative merits of constraint ranking vs. weighting in the context of a major outstanding learnability problem in phonology: learning in the face of hidden structure. Specifically, the paper examines a well-known approach to the structural ambiguity problem, Robust Interpretive Parsing (RIP; Tesar & Smolensky 1998), focusing on its stochastic extension first described by Boersma (2003). Two related problems with the stochastic formulation of RIP are revealed, rooted in a failure to take full advantage of probabilistic information available in the learner's grammar. …
49

NAIGLES, LETITIA R. "Comprehension matters: a commentary on ‘A multiple process solution to the logical problem of language acquisition’." Journal of Child Language 31, no. 4 (2004): 936–40. http://dx.doi.org/10.1017/s0305000904006403.

Full text
Abstract:
MacWhinney (2004) has provided a clear and welcome synthesis of many strands of the recent research addressing the logical problem of first language acquisition from a non-nativist or non-generative grammar framework. The strand that I will comment on is the one MacWhinney calls the ‘pivot’ of his proposal, namely, that acquiring a grammar is primarily a function of learning ITEM-BASED PATTERNS (e.g. pp. 23–29, 41, passim). These item-based patterns serve a number of dominant roles within MacWhinney's proposal, including enforcing children's conservatism (thereby greatly reducing their overgeneralization …
50

Garcia, Guilherme D. "Weight gradience and stress in Portuguese." Phonology 34, no. 1 (2017): 41–79. http://dx.doi.org/10.1017/s0952675717000033.

Full text
Abstract:
This paper examines the role of weight in stress assignment in the Portuguese lexicon, and proposes a probabilistic approach to stress. I show that weight effects are gradient, and weaken monotonically as we move away from the right edge of the word. Such effects depend on the position of a syllable in the word, as well as on the number of segments the syllable contains. The probabilistic model proposed in this paper is based on a single predictor, namely weight, and yields more accurate results than a categorical analysis, where weight is treated as binary. Finally, I discuss implications for …