Academic literature on the topic "N-gram language models"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "N-gram language models".
Next to every source in the list of references you will find an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
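If you prefer to script this step rather than click through the page, the sketch below shows roughly what such an automatic formatter does. It is a minimal Python illustration under stated assumptions, not this site's actual generator: the format_apa helper and the field names in record are invented for the example, and the sample data is taken from the first journal article listed below.

# Minimal sketch of an automatic reference formatter (an assumed helper for
# illustration, not this site's real generator). The sample record is the
# Llorens, Vilar, and Casacuberta article from the journal list below.
record = {
    "authors": ["Llorens, David", "Vilar, Juan Miguel", "Casacuberta, Francisco"],
    "year": 2002,
    "title": "Finite state language models smoothed using n-grams",
    "journal": "International Journal of Pattern Recognition and Artificial Intelligence",
    "volume": 16,
    "issue": 3,
    "pages": "275-289",
    "doi": "10.1142/s0218001402001666",
}

def format_apa(rec: dict) -> str:
    """Render a journal-article record as a rough, plain-text APA-style reference."""
    names = []
    for author in rec["authors"]:
        last, first = (part.strip() for part in author.split(",", 1))
        initials = " ".join(f"{p[0]}." for p in first.split())
        names.append(f"{last}, {initials}")
    authors = names[0] if len(names) == 1 else ", ".join(names[:-1]) + ", & " + names[-1]
    return (f"{authors} ({rec['year']}). {rec['title']}. "
            f"{rec['journal']}, {rec['volume']}({rec['issue']}), {rec['pages']}. "
            f"https://doi.org/{rec['doi']}")

print(format_apa(record))

Running the sketch prints a plain-text APA-style line beginning "Llorens, D., Vilar, J. M., & Casacuberta, F. (2002). ..."; italics and other typographic details of the style are omitted.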
Journal articles on the topic "N-gram language models"
LLORENS, DAVID, JUAN MIGUEL VILAR, and FRANCISCO CASACUBERTA. "FINITE STATE LANGUAGE MODELS SMOOTHED USING n-GRAMS". International Journal of Pattern Recognition and Artificial Intelligence 16, no. 03 (May 2002): 275–89. http://dx.doi.org/10.1142/s0218001402001666.
MEMUSHAJ, ALKET, and TAREK M. SOBH. "USING GRAPHEME n-GRAMS IN SPELLING CORRECTION AND AUGMENTATIVE TYPING SYSTEMS". New Mathematics and Natural Computation 04, no. 01 (March 2008): 87–106. http://dx.doi.org/10.1142/s1793005708000970.
Mezzoudj, Freha, and Abdelkader Benyettou. "An empirical study of statistical language models: n-gram language models vs. neural network language models". International Journal of Innovative Computing and Applications 9, no. 4 (2018): 189. http://dx.doi.org/10.1504/ijica.2018.095762.
Mezzoudj, Freha, and Abdelkader Benyettou. "An empirical study of statistical language models: n-gram language models vs. neural network language models". International Journal of Innovative Computing and Applications 9, no. 4 (2018): 189. http://dx.doi.org/10.1504/ijica.2018.10016827.
Takase, Sho, Jun Suzuki, and Masaaki Nagata. "Character n-Gram Embeddings to Improve RNN Language Models". Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5074–82. http://dx.doi.org/10.1609/aaai.v33i01.33015074.
Santos, André L., Gonçalo Prendi, Hugo Sousa, and Ricardo Ribeiro. "Stepwise API usage assistance using n-gram language models". Journal of Systems and Software 131 (September 2017): 461–74. http://dx.doi.org/10.1016/j.jss.2016.06.063.
Nederhof, Mark-Jan. "A General Technique to Train Language Models on Language Models". Computational Linguistics 31, no. 2 (June 2005): 173–85. http://dx.doi.org/10.1162/0891201054223986.
Crego, Josep M., and François Yvon. "Factored bilingual n-gram language models for statistical machine translation". Machine Translation 24, no. 2 (June 2010): 159–75. http://dx.doi.org/10.1007/s10590-010-9082-5.
Lin, Jimmy, and W. John Wilbur. "Modeling actions of PubMed users with n-gram language models". Information Retrieval 12, no. 4 (September 12, 2008): 487–503. http://dx.doi.org/10.1007/s10791-008-9067-7.
GUO, YUQING, HAIFENG WANG, and JOSEF VAN GENABITH. "Dependency-based n-gram models for general purpose sentence realisation". Natural Language Engineering 17, no. 4 (November 29, 2010): 455–83. http://dx.doi.org/10.1017/s1351324910000288.
Theses on the topic "N-gram language models"
Kulhanek, Raymond Daniel. "A Latent Dirichlet Allocation/N-gram Composite Language Model". Wright State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=wright1379520876.
Zhou, Hanqing. "DBpedia Type and Entity Detection Using Word Embeddings and N-gram Models". Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/37324.
Mehay, Dennis Nolan. "Bean Soup Translation: Flexible, Linguistically-motivated Syntax for Machine Translation". The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1345433807.
Ebadat, Ali-Reza. "Toward Robust Information Extraction Models for Multimedia Documents". PhD thesis, INSA de Rennes, 2012. http://tel.archives-ouvertes.fr/tel-00760383.
Jiang, Yuandong. "Large Scale Distributed Semantic N-gram Language Model". Wright State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=wright1316200173.
Hannemann, Mirko. "Rozpoznávácí sítě založené na konečných stavových převodnících pro dopředné a zpětné dekódování v rozpoznávání řeči" [Recognition networks based on finite state transducers for forward and backward decoding in speech recognition]. Doctoral thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2016. http://www.nusl.cz/ntk/nusl-412550.
O'Boyle, Peter L. "A study of an N-gram language model for speech recognition". Thesis, Queen's University Belfast, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.333827.
Civera Saiz, Jorge. "Novel statistical approaches to text classification, machine translation and computer-assisted translation". Doctoral thesis, Universitat Politècnica de València, 2008. http://hdl.handle.net/10251/2502.
Texto completoCivera Saiz, J. (2008). Novel statistical approaches to text classification, machine translation and computer-assisted translation [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/2502
Palancia
Randák, Richard. "N-Grams as a Measure of Naturalness and Complexity". Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-90006.
Gangireddy, Siva Reddy. "Recurrent neural network language models for automatic speech recognition". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28990.
Books on the topic "N-gram language models"
Voutilainen, Atro. Part-of-Speech Tagging. Edited by Ruslan Mitkov. Oxford University Press, 2012. http://dx.doi.org/10.1093/oxfordhb/9780199276349.013.0011.
Book chapters on the topic "N-gram language models"
Hecht, Robert, Jürgen Riedler, and Gerhard Backfried. "Fitting German into N-Gram Language Models". In Text, Speech and Dialogue, 341–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-46154-x_49.
Popel, Martin, and David Mareček. "Perplexity of n-Gram and Dependency Language Models". In Text, Speech and Dialogue, 173–80. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15760-8_23.
Abdallah, Tarek Amr, and Beatriz de La Iglesia. "URL-Based Web Page Classification: With n-Gram Language Models". In Communications in Computer and Information Science, 19–33. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25840-9_2.
Peng, Fuchun, and Dale Schuurmans. "Combining Naive Bayes and n-Gram Language Models for Text Classification". In Lecture Notes in Computer Science, 335–50. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/3-540-36618-0_24.
Varjokallio, Matti, Mikko Kurimo, and Sami Virpioja. "Class n-Gram Models for Very Large Vocabulary Speech Recognition of Finnish and Estonian". In Statistical Language and Speech Processing, 133–44. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-45925-7_11.
Huang, Xiangji, Fuchun Peng, Aijun An, Dale Schuurmans, and Nick Cercone. "Session Boundary Detection for Association Rule Learning Using n-Gram Language Models". In Advances in Artificial Intelligence, 237–51. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/3-540-44886-1_19.
Chang, Harry. "Enriching Domain-Specific Language Models Using Domain Independent WWW N-Gram Corpus". In Artificial Intelligence and Soft Computing, 38–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29350-4_5.
Aceves-Pérez, Rita M., Luis Villaseñor-Pineda, and Manuel Montes-y-Gómez. "Using N-Gram Models to Combine Query Translations in Cross-Language Question Answering". In Computational Linguistics and Intelligent Text Processing, 453–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11671299_47.
Pakoci, Edvin, and Branislav Popović. "Methods for Using Class Based N-gram Language Models in the Kaldi Toolkit". In Speech and Computer, 492–503. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-87802-3_45.
Hamed, Injy, Mohamed Elmahdy, and Slim Abdennadher. "Expanding N-grams for Code-Switch Language Models". In Advances in Intelligent Systems and Computing, 221–29. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99010-1_20.
Conference proceedings on the topic "N-gram language models"
Hirsimaki, Teemu. "On Compressing N-Gram Language Models". In 2007 IEEE International Conference on Acoustics, Speech, and Signal Processing. IEEE, 2007. http://dx.doi.org/10.1109/icassp.2007.367228.
Sak, Hasim, Cyril Allauzen, Kaisuke Nakajima, and Francoise Beaufays. "Mixture of mixture n-gram language models". In 2013 IEEE Workshop on Automatic Speech Recognition & Understanding (ASRU). IEEE, 2013. http://dx.doi.org/10.1109/asru.2013.6707701.
Chen, Mingqing, Ananda Theertha Suresh, Rajiv Mathews, Adeline Wong, Cyril Allauzen, Françoise Beaufays, and Michael Riley. "Federated Learning of N-Gram Language Models". In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/k19-1012.
Bickel, Steffen, Peter Haider, and Tobias Scheffer. "Predicting sentences using N-gram language models". In Proceedings of the Conference on Human Language Technology and Empirical Methods in Natural Language Processing (HLT/EMNLP 2005). Morristown, NJ, USA: Association for Computational Linguistics, 2005. http://dx.doi.org/10.3115/1220575.1220600.
Rastrow, Ariya, Abhinav Sethy, and Bhuvana Ramabhadran. "Constrained discriminative training of N-gram language models". In 2009 IEEE Workshop on Automatic Speech Recognition & Understanding (ASRU). IEEE, 2009. http://dx.doi.org/10.1109/asru.2009.5373338.
Huang, Ruizhe, Ke Li, Ashish Arora, Daniel Povey, and Sanjeev Khudanpur. "Efficient MDI Adaptation for n-Gram Language Models". In Interspeech 2020. ISCA: ISCA, 2020. http://dx.doi.org/10.21437/interspeech.2020-2909.
Kuznetsov, Vitaly, Hank Liao, Mehryar Mohri, Michael Riley, and Brian Roark. "Learning N-Gram Language Models from Uncertain Data". In Interspeech 2016. ISCA, 2016. http://dx.doi.org/10.21437/interspeech.2016-1093.
Huang, Songfang, and Steve Renals. "Power law discounting for n-gram language models". In 2010 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2010. http://dx.doi.org/10.1109/icassp.2010.5495007.
Bogoychev, Nikolay, and Adam Lopez. "N-gram language models for massively parallel devices". In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2016. http://dx.doi.org/10.18653/v1/p16-1183.
Wang, Song, Devin Chollak, Dana Movshovitz-Attias, and Lin Tan. "Bugram: bug detection with n-gram language models". In ASE'16: ACM/IEEE International Conference on Automated Software Engineering. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2970276.2970341.