Academic literature on the topic 'Neural Language Model'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Neural Language Model.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Neural Language Model"
Emami, Ahmad, and Frederick Jelinek. "A Neural Syntactic Language Model." Machine Learning 60, no. 1-3 (June 2, 2005): 195–227. http://dx.doi.org/10.1007/s10994-005-0916-y.
Buckman, Jacob, and Graham Neubig. "Neural Lattice Language Models." Transactions of the Association for Computational Linguistics 6 (December 2018): 529–41. http://dx.doi.org/10.1162/tacl_a_00036.
Zhang, Yike, Pengyuan Zhang, and Yonghong Yan. "Tailoring an Interpretable Neural Language Model." IEEE/ACM Transactions on Audio, Speech, and Language Processing 27, no. 7 (July 2019): 1164–78. http://dx.doi.org/10.1109/taslp.2019.2913087.
Kunchukuttan, Anoop, Mitesh Khapra, Gurneet Singh, and Pushpak Bhattacharyya. "Leveraging Orthographic Similarity for Multilingual Neural Transliteration." Transactions of the Association for Computational Linguistics 6 (December 2018): 303–16. http://dx.doi.org/10.1162/tacl_a_00022.
Tang, Zhiyuan, Dong Wang, Yixiang Chen, Lantian Li, and Andrew Abel. "Phonetic Temporal Neural Model for Language Identification." IEEE/ACM Transactions on Audio, Speech, and Language Processing 26, no. 1 (January 2018): 134–44. http://dx.doi.org/10.1109/taslp.2017.2764271.
Souri, Adnan, Mohammed Al Achhab, Badr Eddine Elmohajir, and Abdelali Zbakh. "Neural network dealing with Arabic language." International Journal of Informatics and Communication Technology (IJ-ICT) 9, no. 2 (August 1, 2020): 73. http://dx.doi.org/10.11591/ijict.v9i2.pp73-82.
Qi, Kunxun, and Jianfeng Du. "Translation-Based Matching Adversarial Network for Cross-Lingual Natural Language Inference." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8632–39. http://dx.doi.org/10.1609/aaai.v34i05.6387.
Ferreira, Pedro M., Diogo Pernes, Ana Rebelo, and Jaime S. Cardoso. "Signer-Independent Sign Language Recognition with Adversarial Neural Networks." International Journal of Machine Learning and Computing 11, no. 2 (March 2021): 121–29. http://dx.doi.org/10.18178/ijmlc.2021.11.2.1024.
Takahashi, Shuntaro, and Kumiko Tanaka-Ishii. "Evaluating Computational Language Models with Scaling Properties of Natural Language." Computational Linguistics 45, no. 3 (September 2019): 481–513. http://dx.doi.org/10.1162/coli_a_00355.
Karrupusamy, P. "Analysis of Neural Network Based Language Modeling." Journal of Artificial Intelligence and Capsule Networks 2, no. 1 (March 30, 2020): 53–63. http://dx.doi.org/10.36548/jaicn.2020.1.006.
Dissertations / Theses on the topic "Neural Language Model"
Rolnic, Sergiu Gabriel. "Anonimizzazione di documenti mediante Named Entity Recognition e Neural Language Model." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2022.
Le, Hai Son. "Continuous Space Models with Neural Networks in Natural Language Processing." PhD thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00776704.
Keisala, Simon. "Using a Character-Based Language Model for Caption Generation." Thesis, Linköpings universitet, Interaktiva och kognitiva system, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-163001.
Mijatović, Gorana. "Dekompozicija neuralne aktivnosti: model za empirijsku karakterizaciju inter-spajk intervala" [Decomposition of neural activity: a model for empirical characterization of inter-spike intervals]. PhD thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2018. https://www.cris.uns.ac.rs/record.jsf?recordId=107498&source=NDLTD&language=en.
The advances in extracellular neural recording techniques result in big data volumes that necessitate fast, reliable, and automatic identification of statistically similar units. This study proposes a single framework yielding a compact set of probabilistic descriptors that characterise the firing patterns of a single unit. Probabilistic features are estimated from an inter-spike-interval time series, without assumptions about the firing distribution or the stationarity. The first level of the proposed firing-pattern decomposition divides the inter-spike intervals into bursting, moderate, and idle firing modes, yielding a coarse feature set. The second level identifies the successive bursting spikes, or the spiking acceleration/deceleration in the moderate firing mode, yielding a refined feature set. The features are estimated from simulated data and from experimental recordings from the lateral prefrontal cortex in awake, behaving rhesus monkeys. An efficient and stable partitioning of neural units is provided by ensemble evidence accumulation clustering. The possibility of selecting the number of clusters and choosing among the coarse and refined feature sets provides an opportunity to explore and compare different data partitions. The estimation of features, if applied to a single unit, can serve as a tool for firing analysis, observing either the overall spiking activity or the periods of interest in trial-to-trial recordings. If applied to massively parallel recordings, it additionally serves as an input to the clustering procedure, with the potential to compare the functional properties of various brain structures and to link the types of neural cells to particular behavioural states.
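The first-level decomposition described in this abstract, splitting inter-spike intervals into bursting, moderate, and idle firing modes, can be sketched with simple thresholding. The threshold values and the `decompose_isi` helper below are hypothetical illustrations; the thesis itself estimates probabilistic descriptors rather than using fixed cut-offs:

```python
import numpy as np

def decompose_isi(isi, burst_max=0.01, idle_min=0.5):
    """Label each inter-spike interval (in seconds) as belonging to a
    bursting, moderate, or idle firing mode, using fixed thresholds:
    intervals shorter than burst_max are bursting, longer than
    idle_min are idle, and everything in between is moderate."""
    return np.where(isi < burst_max, "bursting",
                    np.where(isi > idle_min, "idle", "moderate"))

# Example: a short spike train with a burst, moderate firing, and a pause.
isi = np.array([0.004, 0.006, 0.08, 0.12, 0.9, 0.05])
print(decompose_isi(isi))
# → ['bursting' 'bursting' 'moderate' 'moderate' 'idle' 'moderate']
```

Counting how many intervals fall into each mode would then give the coarse feature set that the clustering step consumes.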
Garagnani, Max. "Understanding language and attention : brain-based model and neurophysiological experiments." Thesis, University of Cambridge, 2009. https://www.repository.cam.ac.uk/handle/1810/243852.
Miao, Yishu. "Deep generative models for natural language processing." Thesis, University of Oxford, 2017. http://ora.ox.ac.uk/objects/uuid:e4e1f1f9-e507-4754-a0ab-0246f1e1e258.
Al-Kadhimi, Staffan, and Paul Löwenström. "Identification of machine-generated reviews: 1D CNN applied on the GPT-2 neural language model." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280335.
Full textI och med de senaste framstegen inom maskininlärning kan datorer skapa mer och mer övertygande text, vilket skapar en oro för ökad falsk information på internet. Samtidigt vägs detta upp genom att forskare skapar verktyg för att identifiera datorgenererad text. Forskare har kunnat utnyttja svagheter i neurala språkmodeller och använda dessa mot dem. Till exempel tillhandahåller GLTR användare en visuell representation av texter, som hjälp för att klassificera dessa som människo- skrivna eller maskingenererade. Genom att träna ett faltningsnätverk (convolutional neural network, eller CNN) på utdata från GLTR-analys av maskingenererade och människoskrivna filmrecensioner, tar vi GLTR ett steg längre och använder det för att genomföra klassifikationen automatiskt. Emellertid tycks det ej vara tillräckligt att använda en CNN med GLTR som huvuddatakälla för att klassificera på en nivå som är jämförbar med de bästa existerande metoderna.
Roos, Magnus. "Speech Comprehension : Theoretical approaches and neural correlates." Thesis, Högskolan i Skövde, Institutionen för biovetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-11240.
Full textCavallucci, Martina. "Speech Recognition per l'italiano: Sviluppo e Sperimentazione di Soluzioni Neurali con Language Model." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2022.
Find full textRossi, Alex. "Self-supervised information retrieval: a novel approach based on Deep Metric Learning and Neural Language Models." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.
Find full textBooks on the topic "Neural Language Model"
Miikkulainen, Risto. Subsymbolic natural language processing: An integrated model of scripts, lexicon, and memory. Cambridge, Mass: MIT Press, 1993.
Find full textRatcliff, Roger, and Philip Smith. Modeling Simple Decisions and Applications Using a Diffusion Model. Edited by Jerome R. Busemeyer, Zheng Wang, James T. Townsend, and Ami Eidels. Oxford University Press, 2015. http://dx.doi.org/10.1093/oxfordhb/9780199957996.013.3.
Full textCairns, Paul, Joseph P. Levy, Dimitrios Bairaktaris, and John A. Bullinaria. Connectionist Models of Memory and Language. Taylor & Francis Group, 2015.
Find full textBergen, Benjamin, and Nancy Chang. Embodied Construction Grammar. Edited by Thomas Hoffmann and Graeme Trousdale. Oxford University Press, 2013. http://dx.doi.org/10.1093/oxfordhb/9780195396683.013.0010.
Full textStrevens, Michael. The Whole Story. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199685509.003.0005.
Houghton, George, ed. Connectionist Models in Cognitive Psychology. Hove: Psychology Press, 2004.
Papanicolaou, Andrew C., and Marina Kilintari. Imaging the Networks of Language. Edited by Andrew C. Papanicolaou. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199764228.013.15.
Gomez-Perez, Jose Manuel, Ronald Denaux, and Andres Garcia-Silva. A Practical Guide to Hybrid Natural Language Processing: Combining Neural Models and Knowledge Graphs for NLP. Springer, 2020.
McNamara, Patrick, and Magda Giordano. Cognitive Neuroscience and Religious Language. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190636647.003.0005.
Book chapters on the topic "Neural Language Model"
Soutner, Daniel, Zdeněk Loose, Luděk Müller, and Aleš Pražák. "Neural Network Language Model with Cache." In Text, Speech and Dialogue, 528–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-32790-2_64.
Kurimo, Mikko, and Krista Lagus. "An Efficiently Focusing Large Vocabulary Language Model." In Artificial Neural Networks — ICANN 2002, 1068–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-46084-5_173.
Ebisu, Takuma, and Ryutaro Ichise. "Representation of Relations by Planes in Neural Network Language Model." In Neural Information Processing, 300–307. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46687-3_33.
Ma, Dehong, Sujian Li, and Houfeng Wang. "Target Extraction via Feature-Enriched Neural Networks Model." In Natural Language Processing and Chinese Computing, 353–64. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99495-6_30.
Zhou, Long, Jiajun Zhang, Yang Zhao, and Chengqing Zong. "Non-autoregressive Neural Machine Translation with Distortion Model." In Natural Language Processing and Chinese Computing, 403–15. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60450-9_32.
Hao, Bin, Min Zhang, Weizhi Ma, Shaoyun Shi, Xinxing Yu, Houzhi Shan, Yiqun Liu, and Shaoping Ma. "Negative Feedback Aware Hybrid Sequential Neural Recommendation Model." In Natural Language Processing and Chinese Computing, 279–91. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60457-8_23.
Christie, William M. "2. A Neural Network Model of Language Production." In Functional Approaches to Language, Culture and Cognition, 23. Amsterdam: John Benjamins Publishing Company, 2000. http://dx.doi.org/10.1075/cilt.163.07chr.
Lin, Li, Jin Liu, Zhenkai Gu, Zelun Zhang, and Haoliang Ren. "Build Chinese Language Model with Recurrent Neural Network." In Advances in Computer Science and Ubiquitous Computing, 920–25. Singapore: Springer Singapore, 2017. http://dx.doi.org/10.1007/978-981-10-7605-3_146.
Athanaselis, Theologos, Stelios Bakamidis, and Ioannis Dologlou. "A Fast Algorithm for Words Reordering Based on Language Model." In Artificial Neural Networks – ICANN 2006, 943–51. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11840930_98.
Panchev, Christo. "A Spiking Neural Network Model of Multi-modal Language Processing of Robot Instructions." In Biomimetic Neural Learning for Intelligent Robots, 182–210. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11521082_11.
Conference papers on the topic "Neural Language Model"
Yu, Seunghak, Nilesh Kulkarni, Haejun Lee, and Jihie Kim. "Syllable-level Neural Language Model for Agglutinative Language." In Proceedings of the First Workshop on Subword and Character Level Models in NLP. Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/w17-4113.
Lau, Jey Han, Timothy Baldwin, and Trevor Cohn. "Topically Driven Neural Language Model." In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/p17-1033.
Huang, Jiaji, Yi Li, Wei Ping, and Liang Huang. "Large Margin Neural Language Model." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1150.
Wu, Liwei, Youhua Wu, Fei Li, and Tao Zheng. "An Improved Recurrent Neural Network Language Model for Programming Language." In 2019 International Joint Conference on Neural Networks (IJCNN). IEEE, 2019. http://dx.doi.org/10.1109/ijcnn.2019.8852433.
Chowdhury, Hemayet Ahmed, Md Azizul Haque Imon, Anisur Rahman, Aisha Khatun, and Md Saiful Islam. "A Continuous Space Neural Language Model for Bengali Language." In 2019 22nd International Conference on Computer and Information Technology (ICCIT). IEEE, 2019. http://dx.doi.org/10.1109/iccit48885.2019.9038568.
Chien, Jen-Tzung, and Yuan-Chu Ku. "Bayesian recurrent neural network language model." In 2014 IEEE Spoken Language Technology Workshop (SLT). IEEE, 2014. http://dx.doi.org/10.1109/slt.2014.7078575.
Chien, Jen-Tzung, and Che-Yu Kuo. "Markov Recurrent Neural Network Language Model." In 2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU). IEEE, 2019. http://dx.doi.org/10.1109/asru46091.2019.9003850.
Shi, YongZhe, Wei-Qiang Zhang, Meng Cai, and Jia Liu. "Temporal kernel neural network language model." In ICASSP 2013 - 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2013. http://dx.doi.org/10.1109/icassp.2013.6639273.
Kawamae, Noriaki. "Topic Structure-Aware Neural Language Model." In The World Wide Web Conference. New York, New York, USA: ACM Press, 2019. http://dx.doi.org/10.1145/3308558.3313757.
Alumäe, Tanel. "Multi-domain neural network language model." In Interspeech 2013. ISCA: ISCA, 2013. http://dx.doi.org/10.21437/interspeech.2013-515.
Reports on the topic "Neural Language Model"
Althoff, J. L., M. L. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 5. Neutral Data Definition Language (NDDL) Development Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada252450.
Althoff, J., and M. Apicella. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 9. Neutral Data Manipulation Language (NDML) Precompiler Development Specification. Section 2. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada252526.
Apicella, M. L., J. Slaton, and B. Levi. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 10. Neutral Data Manipulation Language (NDML) Precompiler Control Module Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250451.
Apicella, M. L., J. Slaton, and B. Levi. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 13. Neutral Data Manipulation Language (NDML) Precompiler Parse NDML Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250453.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 3 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada251997.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 4 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada251998.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 5 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada251999.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 6 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada252053.
Apicella, M. L., J. Slaton, and B. Levi. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 12. Neutral Data Manipulation Language (NDML) Precompiler Parse Procedure Division Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250452.
Apicella, M. L., J. Slaton, B. Levi, and A. Pashak. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 23. Neutral Data Manipulation Language (NDML) Precompiler Build Source Code Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250460.