Journal articles on the topic 'Word Vector Models'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic 'Word Vector Models.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.
Budenkov, S. S. "SEMANTIC WORD VECTOR MODELS FOR SENTIMENT ANALYSIS." Scientific and Technical Volga Region Bulletin 7, no. 2 (2017): 75–78. http://dx.doi.org/10.24153/2079-5920-2017-7-2-75-78.
Haroon, Muhammad, Junaid Baber, Ihsan Ullah, Sher Muhammad Daudpota, Maheen Bakhtyar, and Varsha Devi. "Video Scene Detection Using Compact Bag of Visual Word Models." Advances in Multimedia 2018 (November 8, 2018): 1–9. http://dx.doi.org/10.1155/2018/2564963.
Ma, Zhiyang, Wenfeng Zheng, Xiaobing Chen, and Lirong Yin. "Joint embedding VQA model based on dynamic word vector." PeerJ Computer Science 7 (March 3, 2021): e353. http://dx.doi.org/10.7717/peerj-cs.353.
Nishida, Satoshi, Antoine Blanc, Naoya Maeda, Masataka Kado, and Shinji Nishimoto. "Behavioral correlates of cortical semantic representations modeled by word vectors." PLOS Computational Biology 17, no. 6 (2021): e1009138. http://dx.doi.org/10.1371/journal.pcbi.1009138.
Tissier, Julien, Christophe Gravier, and Amaury Habrard. "Near-Lossless Binarization of Word Embeddings." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 7104–11. http://dx.doi.org/10.1609/aaai.v33i01.33017104.
Sassenhagen, Jona, and Christian J. Fiebach. "Traces of Meaning Itself: Encoding Distributional Word Vectors in Brain Activity." Neurobiology of Language 1, no. 1 (2020): 54–76. http://dx.doi.org/10.1162/nol_a_00003.
Bojanowski, Piotr, Edouard Grave, Armand Joulin, and Tomas Mikolov. "Enriching Word Vectors with Subword Information." Transactions of the Association for Computational Linguistics 5 (December 2017): 135–46. http://dx.doi.org/10.1162/tacl_a_00051.
Xu, Beibei, Zhiying Tan, Kenli Li, Taijiao Jiang, and Yousong Peng. "Predicting the host of influenza viruses based on the word vector." PeerJ 5 (July 18, 2017): e3579. http://dx.doi.org/10.7717/peerj.3579.
Nguyen, Dat Quoc, Richard Billingsley, Lan Du, and Mark Johnson. "Improving Topic Models with Latent Feature Word Representations." Transactions of the Association for Computational Linguistics 3 (December 2015): 299–313. http://dx.doi.org/10.1162/tacl_a_00140.
Li, Zhen, Dan Qu, Yanxia Li, Chaojie Xie, and Qi Chen. "A Position Weighted Information Based Word Embedding Model for Machine Translation." International Journal on Artificial Intelligence Tools 29, no. 07n08 (2020): 2040005. http://dx.doi.org/10.1142/s0218213020400059.
Losieva, Y. "Representation of Words in Natural Language Processing: A Survey." Bulletin of Taras Shevchenko National University of Kyiv. Series: Physics and Mathematics, no. 2 (2019): 82–87. http://dx.doi.org/10.17721/1812-5409.2019/2.10.
Bhatta, Janardan, Dipesh Shrestha, Santosh Nepal, Saurav Pandey, and Shekhar Koirala. "Efficient Estimation of Nepali Word Representations in Vector Space." Journal of Innovations in Engineering Education 3, no. 1 (2020): 71–77. http://dx.doi.org/10.3126/jiee.v3i1.34327.
Yeh, Hsiang-Yuan, Yu-Ching Yeh, and Da-Bai Shen. "Word Vector Models Approach to Text Regression of Financial Risk Prediction." Symmetry 12, no. 1 (2020): 89. http://dx.doi.org/10.3390/sym12010089.
Paperno, Denis, and Marco Baroni. "When the Whole Is Less Than the Sum of Its Parts: How Composition Affects PMI Values in Distributional Semantic Vectors." Computational Linguistics 42, no. 2 (2016): 345–50. http://dx.doi.org/10.1162/coli_a_00250.
Lu, Wei, Kailun Shi, Yuanyuan Cai, and Xiaoping Che. "Semantic Similarity Measurement Using Knowledge-Augmented Multiple-prototype Distributed Word Vector." International Journal of Interdisciplinary Telecommunications and Networking 8, no. 2 (2016): 45–57. http://dx.doi.org/10.4018/ijitn.2016040105.
Robnik-Šikonja, Marko, Kristjan Reba, and Igor Mozetič. "Cross-lingual transfer of sentiment classifiers." Slovenščina 2.0: empirical, applied and interdisciplinary research 9, no. 1 (2021): 1–25. http://dx.doi.org/10.4312/slo2.0.2021.1.1-25.
Mao, Xingliang, Shuai Chang, Jinjing Shi, Fangfang Li, and Ronghua Shi. "Sentiment-Aware Word Embedding for Emotion Classification." Applied Sciences 9, no. 7 (2019): 1334. http://dx.doi.org/10.3390/app9071334.
Chatterjee, Soma, and Kamal Sarkar. "Combining IR Models for Bengali Information Retrieval." International Journal of Information Retrieval Research 8, no. 3 (2018): 68–83. http://dx.doi.org/10.4018/ijirr.2018070105.
Mrkšić, Nikola, Ivan Vulić, Diarmuid Ó. Séaghdha, et al. "Semantic Specialization of Distributional Word Vector Spaces using Monolingual and Cross-Lingual Constraints." Transactions of the Association for Computational Linguistics 5 (December 2017): 309–24. http://dx.doi.org/10.1162/tacl_a_00063.
Yang, Hejung, Young-In Lee, Hyun-jung Lee, Sook Whan Cho, and Myoung-Wan Koo. "A Study on Word Vector Models for Representing Korean Semantic Information." Phonetics and Speech Sciences 7, no. 4 (2015): 41–47. http://dx.doi.org/10.13064/ksss.2015.7.4.041.
Erk, Katrin. "Vector Space Models of Word Meaning and Phrase Meaning: A Survey." Language and Linguistics Compass 6, no. 10 (2012): 635–53. http://dx.doi.org/10.1002/lnco.362.
Ye, Na, Xin Qin, Lili Dong, Xiang Zhang, and Kangkang Sun. "Chinese Named Entity Recognition Based on Character-Word Vector Fusion." Wireless Communications and Mobile Computing 2020 (July 4, 2020): 1–7. http://dx.doi.org/10.1155/2020/8866540.
Da’u, Aminu, and Naomie Salim. "Aspect extraction on user textual reviews using multi-channel convolutional neural network." PeerJ Computer Science 5 (May 6, 2019): e191. http://dx.doi.org/10.7717/peerj-cs.191.
Zhao, Fuqiang, Zhengyu Zhu, and Ping Han. "A novel model for semantic similarity measurement based on wordnet and word embedding." Journal of Intelligent & Fuzzy Systems 40, no. 5 (2021): 9831–42. http://dx.doi.org/10.3233/jifs-202337.
Abdelkader, Mostefai, and Mekour Mansour. "A Method Based on a New Word Embedding Approach for Process Model Matching." International Journal of Artificial Intelligence and Machine Learning 11, no. 1 (2021): 1–14. http://dx.doi.org/10.4018/ijaiml.2021010101.
Naseem, Usman, Imran Razzak, Shah Khalid Khan, and Mukesh Prasad. "A Comprehensive Survey on Word Representation Models: From Classical to State-of-the-Art Word Representation Language Models." ACM Transactions on Asian and Low-Resource Language Information Processing 20, no. 5 (2021): 1–35. http://dx.doi.org/10.1145/3434237.
Santos, Flávio Arthur O., Thiago Dias Bispo, Hendrik Teixeira Macedo, and Cleber Zanchettin. "Morphological Skip-Gram: Replacing FastText characters n-gram with morphological knowledge." Inteligencia Artificial 24, no. 67 (2021): 1–17. http://dx.doi.org/10.4114/intartif.vol24iss67pp1-17.
Cotterell, Ryan, and Hinrich Schütze. "Joint Semantic Synthesis and Morphological Analysis of the Derived Word." Transactions of the Association for Computational Linguistics 6 (December 2018): 33–48. http://dx.doi.org/10.1162/tacl_a_00003.
Zhu, Lixing, Yulan He, and Deyu Zhou. "A Neural Generative Model for Joint Learning Topics and Topic-Specific Word Embeddings." Transactions of the Association for Computational Linguistics 8 (August 2020): 471–85. http://dx.doi.org/10.1162/tacl_a_00326.
Krishnamurthy, Balaji, Nikaash Puri, and Raghavender Goel. "Learning Vector-space Representations of Items for Recommendations Using Word Embedding Models." Procedia Computer Science 80 (2016): 2205–10. http://dx.doi.org/10.1016/j.procs.2016.05.380.
Lauscher, Anne, Goran Glavaš, Simone Paolo Ponzetto, and Ivan Vulić. "A General Framework for Implicit and Explicit Debiasing of Distributional Word Vector Spaces." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 8131–38. http://dx.doi.org/10.1609/aaai.v34i05.6325.
El-Alami, Fatima-Zahra, Said Ouatik El Alaoui, and Noureddine En-Nahnahi. "Deep Neural Models and Retrofitting for Arabic Text Categorization." International Journal of Intelligent Information Technologies 16, no. 2 (2020): 74–86. http://dx.doi.org/10.4018/ijiit.2020040104.
Al Mahmud, Nahyan, and Shahfida Amjad Munni. "Qualitative Analysis of PLP in LSTM for Bangla Speech Recognition." International Journal of Multimedia & Its Applications 12, no. 5 (2020): 1–8. http://dx.doi.org/10.5121/ijma.2020.12501.
Jiang, Shengchen, Yantuan Xian, Hongbin Wang, Zhiju Zhang, and Huaqin Li. "Representation Learning with LDA Models for Entity Disambiguation in Specific Domains." Journal of Advanced Computational Intelligence and Intelligent Informatics 25, no. 3 (2021): 326–34. http://dx.doi.org/10.20965/jaciii.2021.p0326.
Virpioja, Sami, Mari-Sanna Paukkeri, Abhishek Tripathi, Tiina Lindh-Knuutila, and Krista Lagus. "Evaluating vector space models with canonical correlation analysis." Natural Language Engineering 18, no. 3 (2011): 399–436. http://dx.doi.org/10.1017/s1351324911000271.
Garabík, Radovan. "Word Embedding Based on Large-Scale Web Corpora as a Powerful Lexicographic Tool." Rasprave Instituta za hrvatski jezik i jezikoslovlje 46, no. 2 (2020): 603–18. http://dx.doi.org/10.31724/rihjj.46.2.8.
Socher, Richard, Andrej Karpathy, Quoc V. Le, Christopher D. Manning, and Andrew Y. Ng. "Grounded Compositional Semantics for Finding and Describing Images with Sentences." Transactions of the Association for Computational Linguistics 2 (December 2014): 207–18. http://dx.doi.org/10.1162/tacl_a_00177.
Dehghan, M., K. Faez, M. Ahmadi, and M. Shridhar. "Unconstrained Farsi handwritten word recognition using fuzzy vector quantization and hidden Markov models." Pattern Recognition Letters 22, no. 2 (2001): 209–14. http://dx.doi.org/10.1016/s0167-8655(00)00090-8.
Turney, P. D., and S. M. Mohammad. "Experiments with three approaches to recognizing lexical entailment." Natural Language Engineering 21, no. 3 (2014): 437–76. http://dx.doi.org/10.1017/s1351324913000387.
Kumar, Vaibhav, Tenzin Singhay Bhotia, Vaibhav Kumar, and Tanmoy Chakraborty. "Nurse is Closer to Woman than Surgeon? Mitigating Gender-Biased Proximities in Word Embeddings." Transactions of the Association for Computational Linguistics 8 (August 2020): 486–503. http://dx.doi.org/10.1162/tacl_a_00327.
Fiok, Krzysztof, Waldemar Karwowski, Edgar Gutierrez, and Mohammad Reza-Davahli. "Comparing the Quality and Speed of Sentence Classification with Modern Language Models." Applied Sciences 10, no. 10 (2020): 3386. http://dx.doi.org/10.3390/app10103386.
Padó, Sebastian, and Mirella Lapata. "Dependency-Based Construction of Semantic Space Models." Computational Linguistics 33, no. 2 (2007): 161–99. http://dx.doi.org/10.1162/coli.2007.33.2.161.
Arora, Sanjeev, Yuanzhi Li, Yingyu Liang, Tengyu Ma, and Andrej Risteski. "A Latent Variable Model Approach to PMI-based Word Embeddings." Transactions of the Association for Computational Linguistics 4 (December 2016): 385–99. http://dx.doi.org/10.1162/tacl_a_00106.
Turney, P. D., and P. Pantel. "From Frequency to Meaning: Vector Space Models of Semantics." Journal of Artificial Intelligence Research 37 (February 27, 2010): 141–88. http://dx.doi.org/10.1613/jair.2934.
Di Carlo, Valerio, Federico Bianchi, and Matteo Palmonari. "Training Temporal Word Embeddings with a Compass." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 6326–34. http://dx.doi.org/10.1609/aaai.v33i01.33016326.
Liu, Chang, Pengyuan Zhang, Ta Li, and Yonghong Yan. "Semantic Features Based N-Best Rescoring Methods for Automatic Speech Recognition." Applied Sciences 9, no. 23 (2019): 5053. http://dx.doi.org/10.3390/app9235053.
Chen, Xiaojun. "Synthetic Network and Search Filter Algorithm in English Oral Duplicate Correction Map." Complexity 2021 (April 13, 2021): 1–12. http://dx.doi.org/10.1155/2021/9960101.
Sun, Yanfeng, Minglei Zhang, Si Chen, and Xiaohu Shi. "A Financial Embedded Vector Model and Its Applications to Time Series Forecasting." International Journal of Computers Communications & Control 13, no. 5 (2018): 881–94. http://dx.doi.org/10.15837/ijccc.2018.5.3286.
Camacho-Collados, Jose, and Mohammad Taher Pilehvar. "From Word To Sense Embeddings: A Survey on Vector Representations of Meaning." Journal of Artificial Intelligence Research 63 (December 6, 2018): 743–88. http://dx.doi.org/10.1613/jair.1.11259.
Zhou, Wang, Sun, and Sun. "A Method of Short Text Representation Based on the Feature Probability Embedded Vector." Sensors 19, no. 17 (2019): 3728. http://dx.doi.org/10.3390/s19173728.