Journal articles on the topic "Neural language models"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic "Neural language models".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Explore journal articles from a wide variety of disciplines and organize your bibliography correctly.
Buckman, Jacob, and Graham Neubig. "Neural Lattice Language Models." Transactions of the Association for Computational Linguistics 6 (December 2018): 529–41. http://dx.doi.org/10.1162/tacl_a_00036.
Bengio, Yoshua. "Neural net language models." Scholarpedia 3, no. 1 (2008): 3881. http://dx.doi.org/10.4249/scholarpedia.3881.
Dong, Li. "Learning natural language interfaces with neural models." AI Matters 7, no. 2 (2021): 14–17. http://dx.doi.org/10.1145/3478369.3478375.
De Coster, Mathieu, and Joni Dambre. "Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation." Information 13, no. 5 (2022): 220. http://dx.doi.org/10.3390/info13050220.
Lau, Mandy. "Artificial intelligence language models and the false fantasy of participatory language policies." Working papers in Applied Linguistics and Linguistics at York 1 (September 13, 2021): 4–15. http://dx.doi.org/10.25071/2564-2855.5.
Chang, Tyler A., and Benjamin K. Bergen. "Word Acquisition in Neural Language Models." Transactions of the Association for Computational Linguistics 10 (2022): 1–16. http://dx.doi.org/10.1162/tacl_a_00444.
Mezzoudj, Freha, and Abdelkader Benyettou. "An empirical study of statistical language models: n-gram language models vs. neural network language models." International Journal of Innovative Computing and Applications 9, no. 4 (2018): 189. http://dx.doi.org/10.1504/ijica.2018.095762.
Mezzoudj, Freha, and Abdelkader Benyettou. "An empirical study of statistical language models: n-gram language models vs. neural network language models." International Journal of Innovative Computing and Applications 9, no. 4 (2018): 189. http://dx.doi.org/10.1504/ijica.2018.10016827.
Qi, Kunxun, and Jianfeng Du. "Translation-Based Matching Adversarial Network for Cross-Lingual Natural Language Inference." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 8632–39. http://dx.doi.org/10.1609/aaai.v34i05.6387.
Angius, Nicola, Pietro Perconti, Alessio Plebe, and Alessandro Acciai. "The Simulative Role of Neural Language Models in Brain Language Processing." Philosophies 9, no. 5 (2024): 137. http://dx.doi.org/10.3390/philosophies9050137.
Hale, John T., Luca Campanelli, Jixing Li, Shohini Bhattasali, Christophe Pallier, and Jonathan R. Brennan. "Neurocomputational Models of Language Processing." Annual Review of Linguistics 8, no. 1 (2022): 427–46. http://dx.doi.org/10.1146/annurev-linguistics-051421-020803.
Klemen, Matej, and Slavko Zitnik. "Neural coreference resolution for Slovene language." Computer Science and Information Systems, no. 00 (2021): 60. http://dx.doi.org/10.2298/csis201120060k.
Lytvynov, A., P. Andreicheva, V. Bredikhin, and V. Verbytska. "Development Tendencies of Generation Models of the Ukrainian Language." Municipal economy of cities 3, no. 184 (2024): 10–15. http://dx.doi.org/10.33042/2522-1809-2024-3-184-10-15.
Park, Myung-Kwan, Keonwoo Koo, Jaemin Lee, and Wonil Chung. "Investigating Syntactic Transfer from English to Korean in Neural L2 Language Models." Studies in Modern Grammar 121 (March 30, 2024): 177–201. http://dx.doi.org/10.14342/smog.2024.121.177.
Kunchukuttan, Anoop, Mitesh Khapra, Gurneet Singh, and Pushpak Bhattacharyya. "Leveraging Orthographic Similarity for Multilingual Neural Transliteration." Transactions of the Association for Computational Linguistics 6 (December 2018): 303–16. http://dx.doi.org/10.1162/tacl_a_00022.
Bayer, Ali Orkan, and Giuseppe Riccardi. "Semantic language models with deep neural networks." Computer Speech & Language 40 (November 2016): 1–22. http://dx.doi.org/10.1016/j.csl.2016.04.001.
Chuchupal, V. Y. "Neural language models for automatic speech recognition." Речевые технологии, no. 1-2 (2020): 27–47. http://dx.doi.org/10.58633/2305-8129_2020_1-2_27.
Tian, Yijun, Huan Song, Zichen Wang, et al. "Graph Neural Prompting with Large Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 17 (2024): 19080–88. http://dx.doi.org/10.1609/aaai.v38i17.29875.
Schomacker, Thorben, and Marina Tropmann-Frick. "Language Representation Models: An Overview." Entropy 23, no. 11 (2021): 1422. http://dx.doi.org/10.3390/e23111422.
Takahashi, Shuntaro, and Kumiko Tanaka-Ishii. "Evaluating Computational Language Models with Scaling Properties of Natural Language." Computational Linguistics 45, no. 3 (2019): 481–513. http://dx.doi.org/10.1162/coli_a_00355.
Oba, Miyu. "Research Background on Second Language Acquisition in Neural Language Models." Journal of Natural Language Processing 32, no. 2 (2025): 684–90. https://doi.org/10.5715/jnlp.32.684.
Martin, Andrea E. "A Compositional Neural Architecture for Language." Journal of Cognitive Neuroscience 32, no. 8 (2020): 1407–27. http://dx.doi.org/10.1162/jocn_a_01552.
Mukhamadiyev, Abdinabi, Mukhriddin Mukhiddinov, Ilyos Khujayarov, Mannon Ochilov, and Jinsoo Cho. "Development of Language Models for Continuous Uzbek Speech Recognition System." Sensors 23, no. 3 (2023): 1145. http://dx.doi.org/10.3390/s23031145.
Naveenkumar, T. Rudrappa, V. Reddy Mallamma, and Hanumanthappa M. "KHiTE: Multilingual Speech Acquisition to Monolingual Text Translation." Indian Journal of Science and Technology 16, no. 21 (2023): 1572–79. https://doi.org/10.17485/IJST/v16i21.727.
Penner, Regina V. "Large Language Models: A Socio-Philosophical Essay." Galactica Media: Journal of Media Studies 6, no. 3 (2024): 83–100. http://dx.doi.org/10.46539/gmd.v6i3.502.
Hafeez, Rabab, Muhammad Waqas Anwar, Muhammad Hasan Jamal, et al. "Contextual Urdu Lemmatization Using Recurrent Neural Network Models." Mathematics 11, no. 2 (2023): 435. http://dx.doi.org/10.3390/math11020435.
Oralbekova, Dina, Orken Mamyrbayev, Mohamed Othman, Dinara Kassymova, and Kuralai Mukhsina. "Contemporary Approaches in Evolving Language Models." Applied Sciences 13, no. 23 (2023): 12901. http://dx.doi.org/10.3390/app132312901.
Yogatama, Dani, Cyprien de Masson d’Autume, and Lingpeng Kong. "Adaptive Semiparametric Language Models." Transactions of the Association for Computational Linguistics 9 (2021): 362–73. http://dx.doi.org/10.1162/tacl_a_00371.
Constantinescu, Ionut, Tiago Pimentel, Ryan Cotterell, and Alex Warstadt. "Investigating Critical Period Effects in Language Acquisition through Neural Language Models." Transactions of the Association for Computational Linguistics 13 (January 24, 2024): 96–120. https://doi.org/10.1162/tacl_a_00725.
Tinn, Robert, Hao Cheng, Yu Gu, et al. "Fine-tuning large neural language models for biomedical natural language processing." Patterns 4, no. 4 (2023): 100729. http://dx.doi.org/10.1016/j.patter.2023.100729.
Choi, Sunjoo, Myung-Kwan Park, and Euhee Kim. "How are Korean Neural Language Models ‘surprised’ Layerwisely?" Journal of Language Sciences 28, no. 4 (2021): 301–17. http://dx.doi.org/10.14384/kals.2021.28.4.301.
Zhang, Peng, Wenjie Hui, Benyou Wang, et al. "Complex-valued Neural Network-based Quantum Language Models." ACM Transactions on Information Systems 40, no. 4 (2022): 1–31. http://dx.doi.org/10.1145/3505138.
Lee, Jaemin, and Jeong-Ah Shin. "Evaluating L2 Training Methods in Neural Language Models." Language Research 60, no. 3 (2024): 323–45. https://doi.org/10.30961/lr.2024.60.3.323.
Tanaka, Tomohiro, Ryo Masumura, and Takanobu Oba. "Neural candidate-aware language models for speech recognition." Computer Speech & Language 66 (March 2021): 101157. http://dx.doi.org/10.1016/j.csl.2020.101157.
Kong, Weirui, Hyeju Jang, Giuseppe Carenini, and Thalia S. Field. "Exploring neural models for predicting dementia from language." Computer Speech & Language 68 (July 2021): 101181. http://dx.doi.org/10.1016/j.csl.2020.101181.
Phan, Tien D., and Nur Zincir‐Heywood. "User identification via neural network based language models." International Journal of Network Management 29, no. 3 (2018): e2049. http://dx.doi.org/10.1002/nem.2049.
Karyukin, Vladislav, Diana Rakhimova, Aidana Karibayeva, Aliya Turganbayeva, and Asem Turarbek. "The neural machine translation models for the low-resource Kazakh–English language pair." PeerJ Computer Science 9 (February 8, 2023): e1224. http://dx.doi.org/10.7717/peerj-cs.1224.
Lai, Yihan. "Enhancing Linguistic Bridges: Seq2seq Models and the Future of Machine Translation." Highlights in Science, Engineering and Technology 111 (August 19, 2024): 410–14. https://doi.org/10.54097/pf2xsr76.
Budaya, I. Gede Bintang Arya, Made Windu Antara Kesiman, and I. Made Gede Sunarya. "The Influence of Word Vectorization for Kawi Language to Indonesian Language Neural Machine Translation." Journal of Information Technology and Computer Science 7, no. 1 (2022): 81–93. http://dx.doi.org/10.25126/jitecs.202271387.
Studenikina, Kseniia Andreevna. "Evaluation of neural models’ linguistic competence: evidence from Russian predicate agreement." Proceedings of the Institute for System Programming of the RAS 34, no. 6 (2022): 178–84. http://dx.doi.org/10.15514/ispras-2022-34(6)-14.
Meijer, Erik. "Virtual Machinations: Using Large Language Models as Neural Computers." Queue 22, no. 3 (2024): 25–52. http://dx.doi.org/10.1145/3676287.
Rane, Kirti, Tanaya Bagwe, Shruti Chaudhari, Ankita Kale, and Gayatri Deore. "Enhancing En-X Translation: A Chrome Extension-Based Approach to Indic Language Models." International Journal of Scientific Research in Engineering and Management 09, no. 03 (2025): 1–9. https://doi.org/10.55041/ijsrem42782.
Goldberg, Yoav. "A Primer on Neural Network Models for Natural Language Processing." Journal of Artificial Intelligence Research 57 (November 20, 2016): 345–420. http://dx.doi.org/10.1613/jair.4992.
Zhao, Xiaodong, Rouyi Fan, and Wanyue Liu. "Research on Transformer-Based Multilingual Machine Translation Methods." Journal of Intelligence and Knowledge Engineering 3, no. 1 (2025): 57–67. https://doi.org/10.62517/jike.202504108.
Mane, Deepak. "Transformer based Neural Network Architecture for Regional Language Translation." Advances in Nonlinear Variational Inequalities 28, no. 3s (2024): 211–25. https://doi.org/10.52783/anvi.v28.2925.
Jabar, H. Yousif. "Neural Computing based Part of Speech Tagger for Arabic Language: A review study." International Journal of Computation and Applied Sciences IJOCAAS 1, no. 5 (2020): 361–65. https://doi.org/10.5281/zenodo.4002418.
Wu, Yi-Chao, Fei Yin, and Cheng-Lin Liu. "Improving handwritten Chinese text recognition using neural network language models and convolutional neural network shape models." Pattern Recognition 65 (May 2017): 251–64. http://dx.doi.org/10.1016/j.patcog.2016.12.026.
Babić, Karlo, Sanda Martinčić-Ipšić, and Ana Meštrović. "Survey of Neural Text Representation Models." Information 11, no. 11 (2020): 511. http://dx.doi.org/10.3390/info11110511.
Muhammad, Murad, Shahzad Muhammad, and Fareed Naheeda. "Research Comparative Analysis of OCR Models for Urdu Language Characters Recognition." LC International Journal of STEM 5, no. 3 (2024): 55–63. https://doi.org/10.5281/zenodo.14028816.
Hahn, Michael. "Theoretical Limitations of Self-Attention in Neural Sequence Models." Transactions of the Association for Computational Linguistics 8 (July 2020): 156–71. http://dx.doi.org/10.1162/tacl_a_00306.