Academic literature on the topic 'Probabilistic grammar'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Probabilistic grammar.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Probabilistic grammar"

1

Nitay, Dolav, Dana Fisman, and Michal Ziv-Ukelson. "Learning of Structurally Unambiguous Probabilistic Grammars." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 10 (2021): 9170–78. http://dx.doi.org/10.1609/aaai.v35i10.17107.

Full text
Abstract:
The problem of identifying a probabilistic context-free grammar has two aspects: the first is determining the grammar's topology (the rules of the grammar) and the second is estimating probabilistic weights for each rule. Given the hardness results for learning context-free grammars in general, and probabilistic grammars in particular, most of the literature has concentrated on the second problem. In this work we address the first problem. We restrict attention to structurally unambiguous weighted context-free grammars (SUWCFG) and provide a query learning algorithm for structurally unambiguous
APA, Harvard, Vancouver, ISO, and other styles
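The abstract above splits PCFG identification into learning the grammar's topology and estimating rule weights. When parsed derivations are observed, the second step has a simple closed-form maximum-likelihood solution: normalise rule counts per left-hand side. A minimal sketch with hypothetical counts (the grammar and numbers are illustrative, not from the paper):

```python
from collections import defaultdict

# Hypothetical counts of rule occurrences in a parsed corpus,
# keyed by (left-hand side, right-hand side).
rule_counts = {
    ("S", ("NP", "VP")): 90,
    ("S", ("VP",)): 10,
    ("NP", ("Det", "N")): 60,
    ("NP", ("N",)): 40,
}

# Maximum-likelihood estimate: normalise counts per left-hand side,
# so the probabilities of all rules sharing an LHS sum to 1.
totals = defaultdict(int)
for (lhs, _), count in rule_counts.items():
    totals[lhs] += count

probs = {rule: count / totals[rule[0]] for rule, count in rule_counts.items()}

print(probs[("S", ("NP", "VP"))])  # 0.9
```

Each left-hand side then defines a proper probability distribution over its expansions, which is what makes the weighted grammar probabilistic.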
2

Krotov, Alexander, Mark Hepple, Robert Gaizauskas, and Yorick Wilks. "Evaluating two methods for Treebank grammar compaction." Natural Language Engineering 5, no. 4 (1999): 377–94. http://dx.doi.org/10.1017/s1351324900002308.

Full text
Abstract:
Treebanks, such as the Penn Treebank, provide a basis for the automatic creation of broad coverage grammars. In the simplest case, rules can simply be ‘read off’ the parse-annotations of the corpus, producing either a simple or probabilistic context-free grammar. Such grammars, however, can be very large, presenting problems for the subsequent computational costs of parsing under the grammar. In this paper, we explore ways by which a treebank grammar can be reduced in size or ‘compacted’, which involve the use of two kinds of technique: (i) thresholding of rules by their number of occurrences;
3

Szmrecsanyi, Benedikt. "Diachronic Probabilistic Grammar." English Language and Linguistics 19, no. 3 (2013): 41–68. http://dx.doi.org/10.17960/ell.2013.19.3.002.

Full text
4

Daland, Robert. "Long words in maximum entropy phonotactic grammars." Phonology 32, no. 3 (2015): 353–83. http://dx.doi.org/10.1017/s0952675715000251.

Full text
Abstract:
A phonotactic grammar assigns a well-formedness score to all possible surface forms. This paper considers whether phonotactic grammars should be probabilistic, and gives several arguments that they need to be. Hayes & Wilson (2008) demonstrate the promise of a maximum entropy Harmonic Grammar as a probabilistic phonotactic grammar. This paper points out a theoretical issue with maxent phonotactic grammars: they are not guaranteed to assign a well-defined probability distribution, because sequences that contain arbitrary repetitions of unmarked sequences may be underpenalised. The paper mot
5

Shih, Stephanie S. "Constraint conjunction in weighted probabilistic grammar." Phonology 34, no. 2 (2017): 243–68. http://dx.doi.org/10.1017/s0952675717000136.

Full text
Abstract:
This paper examines a key difference between constraint conjunction and constraint weight additivity, arguing that the two do not have the same empirical coverage. In particular, constraint conjunction in weighted probabilistic grammar allows for superadditive constraint interaction, where the effect of violating two constraints goes beyond the additive combination of the two constraints’ weights alone. A case study from parasitic tone harmony in Dioula d'Odienné demonstrates superadditive local and long-distance segmental feature similarities that increase the likelihood of tone harmony. Supe
6

Casacuberta, Francisco. "Growth Transformations for Probabilistic Functions of Stochastic Grammars." International Journal of Pattern Recognition and Artificial Intelligence 10, no. 03 (1996): 183–201. http://dx.doi.org/10.1142/s0218001496000153.

Full text
Abstract:
Stochastic Grammars are the most usual models in Syntactic Pattern Recognition. Both components of a Stochastic Grammar, the characteristic grammar and the probabilities attached to the rules, can be learnt automatically from training samples. In this paper, first a review of some algorithms are presented to infer the probabilistic component of Stochastic Regular and Context-Free Grammars under the framework of the Growth Transformations. On the other hand, with Stochastic Grammars, the patterns must be represented as strings over a finite set of symbols. However, the most natural representati
7

Han, Young S., and Key-Sun Choi. "Best parse parsing with Earley's and Inside algorithms on probabilistic RTN." Natural Language Engineering 1, no. 2 (1995): 147–61. http://dx.doi.org/10.1017/s1351324900000127.

Full text
Abstract:
Inside parsing is a best parse parsing method based on the Inside algorithm that is often used in estimating probabilistic parameters of stochastic context-free grammars. It gives a best parse in O(N³G³) time, where N is the input size and G is the grammar size. The Earley algorithm can be made to return best parses with the same complexity in N. By way of experiments, we show that Inside parsing can be more efficient than Earley parsing with a sufficiently large grammar and sufficiently short input sentences. For instance, Inside parsing is better with sentences of 16 or less words for a gram
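The Inside algorithm referenced in the abstract above fills a chart of span probabilities bottom-up: the probability that a nonterminal derives a substring is summed over all split points and binary rules. A minimal sketch for a toy grammar in Chomsky normal form (the grammar and sentence are illustrative assumptions, not from the paper):

```python
from collections import defaultdict

# Toy CNF grammar (illustrative):  S -> S S (0.4),  S -> 'a' (0.6)
binary = {("S", "S", "S"): 0.4}   # (lhs, left child, right child) -> prob
lexical = {("S", "a"): 0.6}       # (lhs, terminal) -> prob

def inside(words, start="S"):
    n = len(words)
    # chart[(i, j)][X] = probability that nonterminal X derives words[i:j]
    chart = defaultdict(lambda: defaultdict(float))
    for i, w in enumerate(words):
        for (lhs, term), p in lexical.items():
            if term == w:
                chart[(i, i + 1)][lhs] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):              # split point
                for (lhs, left, right), p in binary.items():
                    chart[(i, j)][lhs] += p * chart[(i, k)][left] * chart[(k, j)][right]
    return chart[(0, n)][start]

print(inside(["a", "a", "a"]))  # ≈ 0.06912, summed over both binary-branching parses
```

Each cell sums over analyses rather than maximising; replacing the sum with a max (and recording back-pointers) turns the same recurrence into the best-parse variant the paper discusses.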
8

Kita, Kenji. "Mixture Probabilistic Context-Free Grammar." Journal of Natural Language Processing 3, no. 4 (1996): 103–13. http://dx.doi.org/10.5715/jnlp.3.4_103.

Full text
9

Dai, Yin-Tang, Cheng-Rong Wu, Sheng-Xiang Ma, and Yi-Ping Zhong. "Hierarchically Classified Probabilistic Grammar Parsing." Journal of Software 22, no. 2 (2011): 245–57. http://dx.doi.org/10.3724/sp.j.1001.2011.03809.

Full text
10

Arthi, K., and Kamala Krithivasan. "Probabilistic Parallel Communicating Grammar Systems." International Journal of Computer Mathematics 79, no. 1 (2002): 1–26. http://dx.doi.org/10.1080/00207160211914.

Full text
More sources

Dissertations / Theses on the topic "Probabilistic grammar"

1

Kwiatkowski, Thomas Mieczyslaw. "Probabilistic grammar induction from sentences and structured meanings." Thesis, University of Edinburgh, 2012. http://hdl.handle.net/1842/6190.

Full text
Abstract:
The meanings of natural language sentences may be represented as compositional logical-forms. Each word or lexicalised multiword-element has an associated logical-form representing its meaning. Full sentential logical-forms are then composed from these word logical-forms via a syntactic parse of the sentence. This thesis develops two computational systems that learn both the word-meanings and parsing model required to map sentences onto logical-forms from an example corpus of (sentence, logical-form) pairs. One of these systems is designed to provide a general purpose method of inducing semanti
2

Stüber, Torsten. "Consistency of Probabilistic Context-Free Grammars." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2012. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-86943.

Full text
Abstract:
We present an algorithm for deciding whether an arbitrary proper probabilistic context-free grammar is consistent, i.e., whether the probability that a derivation terminates is one. Our procedure has time complexity $\mathcal{O}(n^3)$ in the unit-cost model of computation. Moreover, we develop a novel characterization of consistent probabilistic context-free grammars. A simple corollary of our result is that training methods for probabilistic context-free grammars that are based on maximum-likelihood estimation always yield consistent grammars.
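The consistency property defined in the abstract above can be illustrated on a one-nonterminal grammar S → S S (probability p) | 'a' (probability 1 − p): the termination probability is the least fixed point of x = p·x² + (1 − p), approximable by iteration. This is only a toy sketch of the concept, not the paper's O(n³) decision procedure:

```python
# Termination probability of the grammar  S -> S S (p) | 'a' (1 - p),
# computed as the least fixed point of  x = p * x**2 + (1 - p),
# starting from 0 and iterating (illustrative one-nonterminal case).
def termination_prob(p, iters=10_000):
    x = 0.0
    for _ in range(iters):
        x = p * x * x + (1 - p)
    return x

print(termination_prob(0.4))  # ~1.0: derivations terminate almost surely (consistent)
print(termination_prob(0.6))  # ~0.6667: mass leaks to infinite derivations (inconsistent)
```

Intuitively, once the expected number of S children per S exceeds one (here 2p > 1), derivations grow without bound with positive probability, which is exactly the failure of consistency.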
3

Afrin, Taniza. "Extraction of Basic Noun Phrases from Natural Language Using Statistical Context-Free Grammar." Thesis, Virginia Tech, 2001. http://hdl.handle.net/10919/33353.

Full text
Abstract:
The objective of this research was to extract simple noun phrases from natural language texts using two different grammars: stochastic context-free grammar (SCFG) and non-statistical context-free grammar (CFG). Precision and recall were calculated to determine how many precise and correct noun phrases were extracted using these two grammars. Several text files containing sentences from English natural language specifications were analyzed manually to obtain the test-set of simple noun-phrases. To obtain precision and recall, this test-set of manually extracted noun phrases was compared with th
4

Hsu, Hsin-jen. "A neurophysiological study on probabilistic grammatical learning and sentence processing." Diss., University of Iowa, 2009. https://ir.uiowa.edu/etd/243.

Full text
Abstract:
Syntactic anomalies reliably elicit P600 effects in natural language processing. A survey of previous work converged on a conclusion that the mean amplitude of the P600 seems to be associated with the goodness of fit of a target word with expectation generated based on already unfolded materials. Based on this characteristic of the P600 effects, the current study aimed to look for evidence indicating the influence of input statistics in shaping grammatical knowledge/representations, and as a result leading to probabilistically-based competition/expectation generation processes of online senten
5

Brookes, James William Rowe. "Probabilistic and multivariate modelling in Latin grammar : the participle-auxiliary alternation as a case study." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/probabilistic-and-multivariate-modelling-in-latin-grammar-the-participleauxiliary-alternation-as-a-case-study(4ff5b912-c410-41f2-94f2-859eb1ce5b21).html.

Full text
Abstract:
Recent research has shown that language is sensitive to probabilities and a whole host of multivariate conditioning factors. However, most of the research in this arena centres on the grammar of English, and, as yet, there is no statistical modelling on the grammar of Latin, studies of which have to date been largely philological. The rise in advanced statistical methodologies allows us to capture the underlying structure of the rich datasets which this corpus only language can potentially offer. This thesis intends to remedy this deficit by applying probabilistic and multivariate models to a
6

Buys, Jan Moolman. "Probabilistic tree transducers for grammatical error correction." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/85592.

Full text
Abstract:
Thesis (MSc), Stellenbosch University, 2013. We investigate the application of weighted tree transducers to correcting grammatical errors in natural language. Weighted finite-state transducers (FST) have been used successfully in a wide range of natural language processing (NLP) tasks, even though the expressiveness of the linguistic transformations they perform is limited. Recently, there has been an increase in the use of weighted tree transducers and related formalisms that can express syntax-based natural language transformations in a probabilistic setting. The
7

Shan, Yin. "Program distribution estimation with grammar models." Thesis, University of New South Wales - Australian Defence Force Academy, School of Information Technology and Electrical Engineering, 2005. http://handle.unsw.edu.au/1959.4/38737.

Full text
Abstract:
This thesis studies grammar-based approaches in the application of Estimation of Distribution Algorithms (EDA) to the tree representation widely used in Genetic Programming (GP). Although EDA is becoming one of the most active fields in Evolutionary Computation (EC), the solution representation in most EDAs is a Genetic Algorithms (GA)-style linear representation. The more complex tree representations, resembling GP, have received only limited exploration. This is unfortunate, because tree representations provide a natural and expressive way of representing solutions for many problems. This th
8

Pinnow, Eleni. "The role of probabilistic phonotactics in the recognition of reduced pseudowords." Diss., online access via UMI, 2009.

Find full text
9

Mora, Randall P., and Jerry L. Hill. "Service-Based Approach for Intelligent Agent Frameworks." International Foundation for Telemetering, 2011. http://hdl.handle.net/10150/595661.

Full text
Abstract:
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada. This paper describes a service-based Intelligent Agent (IA) approach for machine learning and data mining of distributed heterogeneous data streams. We focus on an open architecture framework that enables the programmer/analyst to build an IA suite for mining, examining and evaluating heterogeneous data for semantic representations, while iteratively building the probabilistic model in real-time to improve
10

Torres, Parra Jimena Cecilia. "A Perception Based Question-Answering Architecture Derived from Computing with Words." Available to subscribers only, 2009. http://proquest.umi.com/pqdweb?did=1967797581&sid=1&Fmt=2&clientId=1509&RQT=309&VName=PQD.

Full text
More sources

Books on the topic "Probabilistic grammar"

1

Bunt, Harry C., and Anton Nijholt, eds. Advances in probabilistic and other parsing technologies. Kluwer Academic Publishers, 2000.

Find full text
2

Bunt, Harry. Advances in Probabilistic and Other Parsing Technologies. Springer Netherlands, 2000.

Find full text
3

Liang, Percy, Michael Jordan, and Dan Klein. Probabilistic grammars and hierarchical Dirichlet processes. Edited by Anthony O'Hagan and Mike West. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198703174.013.27.

Full text
Abstract:
This article focuses on the use of probabilistic context-free grammars (PCFGs) in natural language processing involving a large-scale natural language parsing task. It describes detailed, highly-structured Bayesian modelling in which model dimension and complexity responds naturally to observed data. The framework, termed hierarchical Dirichlet process probabilistic context-free grammar (HDP-PCFG), involves structured hierarchical Dirichlet process modelling and customized model fitting via variational methods to address the problem of syntactic parsing and the underlying problems of grammar i
4

Bunt, H., and Anton Nijholt, eds. Advances in Probabilistic and Other Parsing Technologies (Text, Speech and Language Technology, Volume 16). Springer, 2000.

Find full text
5

Dresher, B. Elan, and Harry van der Hulst, eds. The Oxford History of Phonology. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780198796800.001.0001.

Full text
Abstract:
This volume is an up-to-date history of phonology from the earliest known examples of phonological thinking through the rise of phonology as a field in the 20th century and up to the present time. The volume is divided into five parts. Part I, Early insights in phonology, begins with writing systems and has chapters devoted to the great ancient and medieval intellectual traditions of phonological thought that form the foundation of later thinking and continue to enrich phonological theory. Part II, The founders of phonology, describes the important schools and individuals of the late nineteent

Book chapters on the topic "Probabilistic grammar"

1

Kanchan Devi, K., and S. Arumugam. "Probabilistic Conjunctive Grammar." In Theoretical Computer Science and Discrete Mathematics. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-64419-6_16.

Full text
2

Wong, Pak-Kan, Man-Leung Wong, and Kwong-Sak Leung. "Learning Grammar Rules in Probabilistic Grammar-Based Genetic Programming." In Theory and Practice of Natural Computing. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-49001-4_17.

Full text
3

Eshghi, Arash, Matthew Purver, Julian Hough, and Yo Sato. "Probabilistic Grammar Induction in an Incremental Semantic Framework." In Constraint Solving and Language Processing. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41578-4_6.

Full text
4

Araujo, L. "Evolutionary Parsing for a Probabilistic Context Free Grammar." In Rough Sets and Current Trends in Computing. Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-45554-x_74.

Full text
5

Kim, Hyun-Tae, and Chang Wook Ahn. "A New Grammatical Evolution Based on Probabilistic Context-free Grammar." In Proceedings in Adaptation, Learning and Optimization. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-13356-0_1.

Full text
6

Houshmand, Shiva, and Sudhir Aggarwal. "Using Personal Information in Targeted Grammar-Based Probabilistic Password Attacks." In Advances in Digital Forensics XIII. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-67208-3_16.

Full text
7

Csuhaj-Varjú, Erzsébet, and Jürgen Dassow. "On the Size of Components of Probabilistic Cooperating Distributed Grammar Systems." In Theory Is Forever. Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-27812-2_5.

Full text
8

Goodman, Joshua. "Probabilistic Feature Grammars." In Text, Speech and Language Technology. Springer Netherlands, 2000. http://dx.doi.org/10.1007/978-94-015-9470-7_4.

Full text
9

Mosbah, Mohamed. "Probabilistic graph grammars." In Graph-Theoretic Concepts in Computer Science. Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/3-540-56402-0_51.

Full text
10

Saranyadevi, S., R. Murugeswari, S. Bathrinath, and M. S. Sabitha. "Hybrid Association Rule Miner Using Probabilistic Context-Free Grammar and Ant Colony Optimization for Rainfall Prediction." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-16657-1_64.

Full text

Conference papers on the topic "Probabilistic grammar"

1

Kim, Yoon, Chris Dyer, and Alexander Rush. "Compound Probabilistic Context-Free Grammars for Grammar Induction." In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/p19-1228.

Full text
2

Naganuma, Hiroaki, Diptarama Hendrian, Ryo Yoshinaka, Ayumi Shinohara, and Naoki Kobayashi. "Grammar Compression with Probabilistic Context-Free Grammar." In 2020 Data Compression Conference (DCC). IEEE, 2020. http://dx.doi.org/10.1109/dcc47342.2020.00093.

Full text
3

Pu, Xiaoying, and Matthew Kay. "A Probabilistic Grammar of Graphics." In CHI '20: CHI Conference on Human Factors in Computing Systems. ACM, 2020. http://dx.doi.org/10.1145/3313831.3376466.

Full text
4

Wong, Pak-Kan, Man-Leung Wong, and Kwong-Sak Leung. "Probabilistic grammar-based deep neuroevolution." In GECCO '19: Genetic and Evolutionary Computation Conference. ACM, 2019. http://dx.doi.org/10.1145/3319619.3326778.

Full text
5

Xiong, Hanwei, Jun Xu, Chenxi Xu, and Ming Pan. "Automating 3D reconstruction using a probabilistic grammar." In Applied Optics and Photonics China (AOPC2015), edited by Chunhua Shen, Weiping Yang, and Honghai Liu. SPIE, 2015. http://dx.doi.org/10.1117/12.2202966.

Full text
6

Saparov, Abulhair, Vijay Saraswat, and Tom Mitchell. "A Probabilistic Generative Grammar for Semantic Parsing." In Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017). Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/k17-1026.

Full text
7

Cekan, Ondrej, Jakub Podivinsky, and Zdenek Kotasek. "Program Generation Through a Probabilistic Constrained Grammar." In 2018 21st Euromicro Conference on Digital System Design (DSD). IEEE, 2018. http://dx.doi.org/10.1109/dsd.2018.00049.

Full text
8

Kawabata, Takeshi. "Dynamic probabilistic grammar for spoken language disambiguation." In 3rd International Conference on Spoken Language Processing (ICSLP 1994). ISCA, 1994. http://dx.doi.org/10.21437/icslp.1994-211.

Full text
9

Devi, K. Kanchan, and S. Arumugam. "Password Cracking Algorithm using Probabilistic Conjunctive Grammar." In 2019 IEEE International Conference on Intelligent Techniques in Control, Optimization and Signal Processing (INCOS). IEEE, 2019. http://dx.doi.org/10.1109/incos45849.2019.8951390.

Full text
10

"Probabilistic Regular Grammar Inference Algorithm Using Incremental Technique." In 2018 the 8th International Workshop on Computer Science and Engineering. WCSE, 2018. http://dx.doi.org/10.18178/wcse.2018.06.129.

Full text

Reports on the topic "Probabilistic grammar"

1

Lafferty, John, Daniel Sleator, and Davy Temperley. Grammatical Trigrams: A Probabilistic Model of Link Grammar. Defense Technical Information Center, 1992. http://dx.doi.org/10.21236/ada256365.

Full text