Selected scientific literature on the topic "Neural Language Model"
Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Neural Language Model".
Next to each source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract (summary) of the work online, if it is present in the metadata.
Journal articles on the topic "Neural Language Model"
Emami, Ahmad, and Frederick Jelinek. "A Neural Syntactic Language Model". Machine Learning 60, no. 1-3 (June 2, 2005): 195–227. http://dx.doi.org/10.1007/s10994-005-0916-y.
Buckman, Jacob, and Graham Neubig. "Neural Lattice Language Models". Transactions of the Association for Computational Linguistics 6 (December 2018): 529–41. http://dx.doi.org/10.1162/tacl_a_00036.
Zhang, Yike, Pengyuan Zhang, and Yonghong Yan. "Tailoring an Interpretable Neural Language Model". IEEE/ACM Transactions on Audio, Speech, and Language Processing 27, no. 7 (July 2019): 1164–78. http://dx.doi.org/10.1109/taslp.2019.2913087.
Kunchukuttan, Anoop, Mitesh Khapra, Gurneet Singh, and Pushpak Bhattacharyya. "Leveraging Orthographic Similarity for Multilingual Neural Transliteration". Transactions of the Association for Computational Linguistics 6 (December 2018): 303–16. http://dx.doi.org/10.1162/tacl_a_00022.
Tang, Zhiyuan, Dong Wang, Yixiang Chen, Lantian Li, and Andrew Abel. "Phonetic Temporal Neural Model for Language Identification". IEEE/ACM Transactions on Audio, Speech, and Language Processing 26, no. 1 (January 2018): 134–44. http://dx.doi.org/10.1109/taslp.2017.2764271.
Souri, Adnan, Mohammed Al Achhab, Badr Eddine Elmohajir, and Abdelali Zbakh. "Neural network dealing with Arabic language". International Journal of Informatics and Communication Technology (IJ-ICT) 9, no. 2 (August 1, 2020): 73. http://dx.doi.org/10.11591/ijict.v9i2.pp73-82.
Qi, Kunxun, and Jianfeng Du. "Translation-Based Matching Adversarial Network for Cross-Lingual Natural Language Inference". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8632–39. http://dx.doi.org/10.1609/aaai.v34i05.6387.
Ferreira, Pedro M., Diogo Pernes, Ana Rebelo, and Jaime S. Cardoso. "Signer-Independent Sign Language Recognition with Adversarial Neural Networks". International Journal of Machine Learning and Computing 11, no. 2 (March 2021): 121–29. http://dx.doi.org/10.18178/ijmlc.2021.11.2.1024.
Takahashi, Shuntaro, and Kumiko Tanaka-Ishii. "Evaluating Computational Language Models with Scaling Properties of Natural Language". Computational Linguistics 45, no. 3 (September 2019): 481–513. http://dx.doi.org/10.1162/coli_a_00355.
Karrupusamy, P. "Analysis of Neural Network Based Language Modeling". Journal of Artificial Intelligence and Capsule Networks 2, no. 1 (March 30, 2020): 53–63. http://dx.doi.org/10.36548/jaicn.2020.1.006.
Theses on the topic "Neural Language Model"
Rolnic, Sergiu Gabriel. "Anonimizzazione di documenti mediante Named Entity Recognition e Neural Language Model". Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2022.
Le, Hai Son. "Continuous space models with neural networks in natural language processing". PhD thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00776704.
Keisala, Simon. "Using a Character-Based Language Model for Caption Generation". Thesis, Linköpings universitet, Interaktiva och kognitiva system, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-163001.
Mijatović, Gorana. "Dekompozicija neuralne aktivnosti: model za empirijsku karakterizaciju inter-spajk intervala". PhD thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2018. https://www.cris.uns.ac.rs/record.jsf?recordId=107498&source=NDLTD&language=en.
The advances in extracellular neural recording techniques result in big data volumes that necessitate fast, reliable, and automatic identification of statistically similar units. This study proposes a single framework yielding a compact set of probabilistic descriptors that characterise the firing patterns of a single unit. Probabilistic features are estimated from an inter-spike-interval time series, without assumptions about the firing distribution or the stationarity. The first level of the proposed firing-pattern decomposition divides the inter-spike intervals into bursting, moderate and idle firing modes, yielding a coarse feature set. The second level identifies the successive bursting spikes, or the spiking acceleration/deceleration in the moderate firing mode, yielding a refined feature set. The features are estimated from simulated data and from experimental recordings from the lateral prefrontal cortex in awake, behaving rhesus monkeys. An efficient and stable partitioning of neural units is provided by the ensemble evidence accumulation clustering. The possibility of selecting the number of clusters and choosing among coarse and refined feature sets provides an opportunity to explore and compare different data partitions. The estimation of features, if applied to a single unit, can serve as a tool for the firing analysis, observing either overall spiking activity or the periods of interest in trial-to-trial recordings. If applied to massively parallel recordings, it additionally serves as an input to the clustering procedure, with the potential to compare the functional properties of various brain structures and to link the types of neural cells to the particular behavioural states.
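The first level of the decomposition described in this abstract can be illustrated with a few lines of Python. This is only a minimal sketch under assumed fixed thresholds: the 10 ms and 500 ms cutoffs and the function name are hypothetical, and the thesis itself estimates probabilistic descriptors rather than applying hard cutoffs.

```python
import numpy as np

# Illustrative only: assign each inter-spike interval (ISI) to a bursting,
# moderate, or idle firing mode. The cutoff values are assumptions made for
# this sketch, not values taken from the thesis.
BURST_MAX_S = 0.010  # ISIs below 10 ms treated as bursting (assumed)
IDLE_MIN_S = 0.500   # ISIs above 500 ms treated as idle (assumed)

def decompose_isis(spike_times_s):
    """Return the ISIs of a spike train and a coarse mode label for each."""
    isis = np.diff(np.sort(np.asarray(spike_times_s, dtype=float)))
    labels = np.where(isis < BURST_MAX_S, "burst",
                      np.where(isis > IDLE_MIN_S, "idle", "moderate"))
    return isis, labels

# A short spike train (in seconds): a burst, moderate firing, then a pause.
isis, labels = decompose_isis([0.000, 0.004, 0.007, 0.120, 0.300, 1.100])
for isi, mode in zip(isis, labels):
    print(f"{isi * 1000:7.1f} ms -> {mode}")
```

The refined, second-level feature set (successive bursting spikes, acceleration/deceleration within the moderate mode) would further split these coarse labels and is beyond this sketch.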
Garagnani, Max. "Understanding language and attention : brain-based model and neurophysiological experiments". Thesis, University of Cambridge, 2009. https://www.repository.cam.ac.uk/handle/1810/243852.
Miao, Yishu. "Deep generative models for natural language processing". Thesis, University of Oxford, 2017. http://ora.ox.ac.uk/objects/uuid:e4e1f1f9-e507-4754-a0ab-0246f1e1e258.
Al-Kadhimi, Staffan, and Paul Löwenström. "Identification of machine-generated reviews : 1D CNN applied on the GPT-2 neural language model". Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280335.
Testo completoI och med de senaste framstegen inom maskininlärning kan datorer skapa mer och mer övertygande text, vilket skapar en oro för ökad falsk information på internet. Samtidigt vägs detta upp genom att forskare skapar verktyg för att identifiera datorgenererad text. Forskare har kunnat utnyttja svagheter i neurala språkmodeller och använda dessa mot dem. Till exempel tillhandahåller GLTR användare en visuell representation av texter, som hjälp för att klassificera dessa som människo- skrivna eller maskingenererade. Genom att träna ett faltningsnätverk (convolutional neural network, eller CNN) på utdata från GLTR-analys av maskingenererade och människoskrivna filmrecensioner, tar vi GLTR ett steg längre och använder det för att genomföra klassifikationen automatiskt. Emellertid tycks det ej vara tillräckligt att använda en CNN med GLTR som huvuddatakälla för att klassificera på en nivå som är jämförbar med de bästa existerande metoderna.
Roos, Magnus. "Speech Comprehension : Theoretical approaches and neural correlates". Thesis, Högskolan i Skövde, Institutionen för biovetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-11240.
Cavallucci, Martina. "Speech Recognition per l'italiano: Sviluppo e Sperimentazione di Soluzioni Neurali con Language Model". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2022.
Rossi, Alex. "Self-supervised information retrieval: a novel approach based on Deep Metric Learning and Neural Language Models". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.
Books on the topic "Neural Language Model"
Miikkulainen, Risto. Subsymbolic natural language processing: An integrated model of scripts, lexicon, and memory. Cambridge, Mass: MIT Press, 1993.
Ratcliff, Roger, and Philip Smith. Modeling Simple Decisions and Applications Using a Diffusion Model. Edited by Jerome R. Busemeyer, Zheng Wang, James T. Townsend, and Ami Eidels. Oxford University Press, 2015. http://dx.doi.org/10.1093/oxfordhb/9780199957996.013.3.
Cairns, Paul, Joseph P. Levy, Dimitrios Bairaktaris, and John A. Bullinaria. Connectionist Models of Memory and Language. Taylor & Francis Group, 2015.
Bergen, Benjamin, and Nancy Chang. Embodied Construction Grammar. Edited by Thomas Hoffmann and Graeme Trousdale. Oxford University Press, 2013. http://dx.doi.org/10.1093/oxfordhb/9780195396683.013.0010.
Strevens, Michael. The Whole Story. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199685509.003.0005.
Houghton, George, ed. Connectionist models in cognitive psychology. Hove: Psychology Press, 2004.
Connectionist Models in Cognitive Psychology. Taylor & Francis Group, 2014.
Papanicolaou, Andrew C., and Marina Kilintari. Imaging the Networks of Language. Edited by Andrew C. Papanicolaou. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199764228.013.15.
Gomez-Perez, Jose Manuel, Ronald Denaux, and Andres Garcia-Silva. A Practical Guide to Hybrid Natural Language Processing: Combining Neural Models and Knowledge Graphs for NLP. Springer, 2020.
McNamara, Patrick, and Magda Giordano. Cognitive Neuroscience and Religious Language. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190636647.003.0005.
Book chapters on the topic "Neural Language Model"
Soutner, Daniel, Zdeněk Loose, Luděk Müller, and Aleš Pražák. "Neural Network Language Model with Cache". In Text, Speech and Dialogue, 528–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-32790-2_64.
Kurimo, Mikko, and Krista Lagus. "An Efficiently Focusing Large Vocabulary Language Model". In Artificial Neural Networks — ICANN 2002, 1068–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-46084-5_173.
Ebisu, Takuma, and Ryutaro Ichise. "Representation of Relations by Planes in Neural Network Language Model". In Neural Information Processing, 300–307. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46687-3_33.
Ma, Dehong, Sujian Li, and Houfeng Wang. "Target Extraction via Feature-Enriched Neural Networks Model". In Natural Language Processing and Chinese Computing, 353–64. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99495-6_30.
Zhou, Long, Jiajun Zhang, Yang Zhao, and Chengqing Zong. "Non-autoregressive Neural Machine Translation with Distortion Model". In Natural Language Processing and Chinese Computing, 403–15. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60450-9_32.
Hao, Bin, Min Zhang, Weizhi Ma, Shaoyun Shi, Xinxing Yu, Houzhi Shan, Yiqun Liu, and Shaoping Ma. "Negative Feedback Aware Hybrid Sequential Neural Recommendation Model". In Natural Language Processing and Chinese Computing, 279–91. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60457-8_23.
Christie, William M. "2. A Neural Network Model of Language Production". In Functional Approaches to Language, Culture and Cognition, 23. Amsterdam: John Benjamins Publishing Company, 2000. http://dx.doi.org/10.1075/cilt.163.07chr.
Lin, Li, Jin Liu, Zhenkai Gu, Zelun Zhang, and Haoliang Ren. "Build Chinese Language Model with Recurrent Neural Network". In Advances in Computer Science and Ubiquitous Computing, 920–25. Singapore: Springer Singapore, 2017. http://dx.doi.org/10.1007/978-981-10-7605-3_146.
Athanaselis, Theologos, Stelios Bakamidis, and Ioannis Dologlou. "A Fast Algorithm for Words Reordering Based on Language Model". In Artificial Neural Networks – ICANN 2006, 943–51. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11840930_98.
Panchev, Christo. "A Spiking Neural Network Model of Multi-modal Language Processing of Robot Instructions". In Biomimetic Neural Learning for Intelligent Robots, 182–210. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11521082_11.
Conference papers on the topic "Neural Language Model"
Yu, Seunghak, Nilesh Kulkarni, Haejun Lee, and Jihie Kim. "Syllable-level Neural Language Model for Agglutinative Language". In Proceedings of the First Workshop on Subword and Character Level Models in NLP. Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/w17-4113.
Lau, Jey Han, Timothy Baldwin, and Trevor Cohn. "Topically Driven Neural Language Model". In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/p17-1033.
Huang, Jiaji, Yi Li, Wei Ping, and Liang Huang. "Large Margin Neural Language Model". In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1150.
Wu, Liwei, Youhua Wu, Fei Li, and Tao Zheng. "An Improved Recurrent Neural Network Language Model for Programming Language". In 2019 International Joint Conference on Neural Networks (IJCNN). IEEE, 2019. http://dx.doi.org/10.1109/ijcnn.2019.8852433.
Chowdhury, Hemayet Ahmed, Md Azizul Haque Imon, Anisur Rahman, Aisha Khatun, and Md Saiful Islam. "A Continuous Space Neural Language Model for Bengali Language". In 2019 22nd International Conference on Computer and Information Technology (ICCIT). IEEE, 2019. http://dx.doi.org/10.1109/iccit48885.2019.9038568.
Chien, Jen-Tzung, and Yuan-Chu Ku. "Bayesian recurrent neural network language model". In 2014 IEEE Spoken Language Technology Workshop (SLT). IEEE, 2014. http://dx.doi.org/10.1109/slt.2014.7078575.
Chien, Jen-Tzung, and Che-Yu Kuo. "Markov Recurrent Neural Network Language Model". In 2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU). IEEE, 2019. http://dx.doi.org/10.1109/asru46091.2019.9003850.
Shi, YongZhe, Wei-Qiang Zhang, Meng Cai, and Jia Liu. "Temporal kernel neural network language model". In ICASSP 2013 - 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2013. http://dx.doi.org/10.1109/icassp.2013.6639273.
Kawamae, Noriaki. "Topic Structure-Aware Neural Language Model". In The World Wide Web Conference. New York, New York, USA: ACM Press, 2019. http://dx.doi.org/10.1145/3308558.3313757.
Alumäe, Tanel. "Multi-domain neural network language model". In Interspeech 2013. ISCA: ISCA, 2013. http://dx.doi.org/10.21437/interspeech.2013-515.
Reports of organizations on the topic "Neural Language Model"
Althoff, J. L., M. L. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 5. Neutral Data Definition Language (NDDL) Development Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada252450.
Althoff, J., and M. Apicella. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 9. Neutral Data Manipulation Language (NDML) Precompiler Development Specification. Section 2. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada252526.
Apicella, M. L., J. Slaton, and B. Levi. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 10. Neutral Data Manipulation Language (NDML) Precompiler Control Module Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250451.
Apicella, M. L., J. Slaton, and B. Levi. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 13. Neutral Data Manipulation Language (NDML) Precompiler Parse NDML Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250453.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 3 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada251997.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 4 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada251998.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 5 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada251999.
Althoff, J., M. Apicella, and S. Singh. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 6. Neutral Data Definition Language (NDDL) Product Specification. Section 6 of 6. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada252053.
Apicella, M. L., J. Slaton, and B. Levi. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 12. Neutral Data Manipulation Language (NDML) Precompiler Parse Procedure Division Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250452.
Apicella, M. L., J. Slaton, B. Levi, and A. Pashak. Integrated Information Support System (IISS). Volume 5. Common Data Model Subsystem. Part 23. Neutral Data Manipulation Language (NDML) Precompiler Build Source Code Product Specification. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada250460.