Journal articles on the topic "Low resource language"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 journal articles for research on the topic "Low resource language".
Next to each work in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic citation for the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its online abstract, provided that this information is available in the metadata.
Browse journal articles across different disciplines and compile your bibliography correctly.
Pakray, Partha, Alexander Gelbukh, and Sivaji Bandyopadhyay. "Natural language processing applications for low-resource languages." Natural Language Processing 31, no. 2 (2025): 183–97. https://doi.org/10.1017/nlp.2024.33.
Lin, Donghui, Yohei Murakami, and Toru Ishida. "Towards Language Service Creation and Customization for Low-Resource Languages." Information 11, no. 2 (2020): 67. http://dx.doi.org/10.3390/info11020067.
Ranasinghe, Tharindu, and Marcos Zampieri. "Multilingual Offensive Language Identification for Low-resource Languages." ACM Transactions on Asian and Low-Resource Language Information Processing 21, no. 1 (2022): 1–13. http://dx.doi.org/10.1145/3457610.
Cassano, Federico, John Gouwar, Francesca Lucchetti, et al. "Knowledge Transfer from High-Resource to Low-Resource Programming Languages for Code LLMs." Proceedings of the ACM on Programming Languages 8, OOPSLA2 (2024): 677–708. http://dx.doi.org/10.1145/3689735.
Rai, Abigail. "Part-of-Speech (POS) Tagging of Low-Resource Language (Limbu) with Deep Learning." Panamerican Mathematical Journal 35, no. 1s (2024): 149–57. http://dx.doi.org/10.52783/pmj.v35.i1s.2297.
Nitu, Melania, and Mihai Dascalu. "Natural Language Processing Tools for Romanian – Going Beyond a Low-Resource Language." Interaction Design and Architecture(s), no. 60 (March 15, 2024): 7–26. http://dx.doi.org/10.55612/s-5002-060-001sp.
Zhou, Shuyan, Shruti Rijhwani, John Wieting, Jaime Carbonell, and Graham Neubig. "Improving Candidate Generation for Low-resource Cross-lingual Entity Linking." Transactions of the Association for Computational Linguistics 8 (July 2020): 109–24. http://dx.doi.org/10.1162/tacl_a_00303.
Vargas, Francielle, Wolfgang Schmeisser-Nieto, Zohar Rabinovich, Thiago A. S. Pardo, and Fabrício Benevenuto. "Discourse annotation guideline for low-resource languages." Natural Language Processing 31, no. 2 (2025): 700–743. https://doi.org/10.1017/nlp.2024.19.
Li, Zihao, Yucheng Shi, Zirui Liu, et al. "Language Ranker: A Metric for Quantifying LLM Performance Across High and Low-Resource Languages." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 27 (2025): 28186–94. https://doi.org/10.1609/aaai.v39i27.35038.
Yusup, Azragul, Degang Chen, Yifei Ge, Hongliang Mao, and Nujian Wang. "Resource Construction and Ensemble Learning based Sentiment Analysis for the Low-resource Language Uyghur." Journal of Internet Technology 24, no. 4 (2023): 1009–16. http://dx.doi.org/10.53106/160792642023072404018.
Mati, Diellza Nagavci, Mentor Hamiti, Arsim Susuri, Besnik Selimi, and Jaumin Ajdari. "Building Dictionaries for Low Resource Languages: Challenges of Unsupervised Learning." Annals of Emerging Technologies in Computing 5, no. 3 (2021): 52–58. http://dx.doi.org/10.33166/aetic.2021.03.005.
Kashyap, Gaurav. "Multilingual NLP: Techniques for Creating Models that Understand and Generate Multiple Languages with Minimal Resources." International Journal of Scientific Research in Engineering and Management 08, no. 12 (2024): 1–5. https://doi.org/10.55041/ijsrem7648.
Rijhwani, Shruti, Jiateng Xie, Graham Neubig, and Jaime Carbonell. "Zero-Shot Neural Transfer for Cross-Lingual Entity Linking." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 6924–31. http://dx.doi.org/10.1609/aaai.v33i01.33016924.
Qarah, Faisal, and Tawfeeq Alsanoosy. "Evaluation of Arabic Large Language Models on Moroccan Dialect." Engineering, Technology & Applied Science Research 15, no. 3 (2025): 22478–85. https://doi.org/10.48084/etasr.10331.
Lee, Chanhee, Kisu Yang, Taesun Whang, Chanjun Park, Andrew Matteson, and Heuiseok Lim. "Exploring the Data Efficiency of Cross-Lingual Post-Training in Pretrained Language Models." Applied Sciences 11, no. 5 (2021): 1974. http://dx.doi.org/10.3390/app11051974.
Lee, Jaeseong, Dohyeon Lee, and Seung-won Hwang. "Script, Language, and Labels: Overcoming Three Discrepancies for Low-Resource Language Specialization." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (2023): 13004–13. http://dx.doi.org/10.1609/aaai.v37i11.26528.
Mozafari, Marzieh, Khouloud Mnassri, Reza Farahbakhsh, and Noel Crespi. "Offensive language detection in low resource languages: A use case of Persian language." PLOS ONE 19, no. 6 (2024): e0304166. http://dx.doi.org/10.1371/journal.pone.0304166.
Laskar, Sahinur Rahman, Abdullah Faiz Ur Rahman Khilji, Partha Pakray, and Sivaji Bandyopadhyay. "Improved neural machine translation for low-resource English–Assamese pair." Journal of Intelligent & Fuzzy Systems 42, no. 5 (2022): 4727–38. http://dx.doi.org/10.3233/jifs-219260.
Baldha, Nirav A. "Question Answering for Low Resource Languages Using Natural Language Processing." International Journal of Scientific Research and Engineering Trends 8, no. 2 (2022): 1122–26. http://dx.doi.org/10.61137/ijsret.vol.8.issue2.207.
Shikali, Casper S., and Refuoe Mokhosi. "Enhancing African low-resource languages: Swahili data for language modelling." Data in Brief 31 (August 2020): 105951. http://dx.doi.org/10.1016/j.dib.2020.105951.
Xiao, Yubei, Ke Gong, Pan Zhou, Guolin Zheng, Xiaodan Liang, and Liang Lin. "Adversarial Meta Sampling for Multilingual Low-Resource Speech Recognition." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 16 (2021): 14112–20. http://dx.doi.org/10.1609/aaai.v35i16.17661.
Chen, Siqi, Yijie Pei, Zunwang Ke, and Wushour Silamu. "Low-Resource Named Entity Recognition via the Pre-Training Model." Symmetry 13, no. 5 (2021): 786. http://dx.doi.org/10.3390/sym13050786.
Gunnam, Vinodh. "Tackling Low-Resource Languages: Efficient Transfer Learning Techniques for Multilingual NLP." International Journal for Research Publication and Seminar 13, no. 4 (2022): 354–59. http://dx.doi.org/10.36676/jrps.v13.i4.1601.
Thakkar, Gaurish, Nives Mikelić Preradović, and Marko Tadić. "Transferring Sentiment Cross-Lingually within and across Same-Family Languages." Applied Sciences 14, no. 13 (2024): 5652. http://dx.doi.org/10.3390/app14135652.
Bajpai, Ashutosh, and Tanmoy Chakraborty. "Multilingual LLMs Inherently Reward In-Language Time-Sensitive Semantic Alignment for Low-Resource Languages." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 22 (2025): 23469–77. https://doi.org/10.1609/aaai.v39i22.34515.
Andrabi, Syed Abdul Basit, et al. "A Review of Machine Translation for South Asian Low Resource Languages." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 5 (2021): 1134–47. http://dx.doi.org/10.17762/turcomat.v12i5.1777.
Kalluri, Kartheek. "Adapting LLMs for Low Resource Languages: Techniques and Ethical Considerations." International Journal of Scientific Research in Engineering and Management 08, no. 12 (2024): 1–6. https://doi.org/10.55041/isjem00140.
Rakhimova, Diana, Aidana Karibayeva, and Assem Turarbek. "The Task of Post-Editing Machine Translation for the Low-Resource Language." Applied Sciences 14, no. 2 (2024): 486. http://dx.doi.org/10.3390/app14020486.
Kim, Bosung, Juae Kim, Youngjoong Ko, and Jungyun Seo. "Commonsense Knowledge Augmentation for Low-Resource Languages via Adversarial Learning." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 7 (2021): 6393–401. http://dx.doi.org/10.1609/aaai.v35i7.16793.
Zhang, Mozhi, Yoshinari Fujinuma, and Jordan Boyd-Graber. "Exploiting Cross-Lingual Subword Similarities in Low-Resource Document Classification." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 9547–54. http://dx.doi.org/10.1609/aaai.v34i05.6500.
Zennaki, O., N. Semmar, and L. Besacier. "A neural approach for inducing multilingual resources and natural language processing tools for low-resource languages." Natural Language Engineering 25, no. 1 (2018): 43–67. http://dx.doi.org/10.1017/s1351324918000293.
Meeus, Quentin, Marie-Francine Moens, and Hugo Van hamme. "Bidirectional Representations for Low-Resource Spoken Language Understanding." Applied Sciences 13, no. 20 (2023): 11291. http://dx.doi.org/10.3390/app132011291.
Berthelier, Benoit. "Division and the Digital Language Divide: A Critical Perspective on Natural Language Processing Resources for the South and North Korean Languages." Korean Studies 47, no. 1 (2023): 243–73. http://dx.doi.org/10.1353/ks.2023.a908624.
Mi, Chenggang, Shaolin Zhu, and Rui Nie. "Improving Loanword Identification in Low-Resource Language with Data Augmentation and Multiple Feature Fusion." Computational Intelligence and Neuroscience 2021 (April 8, 2021): 1–9. http://dx.doi.org/10.1155/2021/9975078.
Shi, Xiayang, Xinyi Liu, Zhenqiang Yu, Pei Cheng, and Chun Xu. "Extracting Parallel Sentences from Low-Resource Language Pairs with Minimal Supervision." Journal of Physics: Conference Series 2171, no. 1 (2022): 012044. http://dx.doi.org/10.1088/1742-6596/2171/1/012044.
Sabouri, Sadra, Elnaz Rahmati, Soroush Gooran, and Hossein Sameti. "naab: A ready-to-use plug-and-play corpus for Farsi." Journal of Artificial Intelligence, Applications, and Innovations 1, no. 2 (2024): 1–8. https://doi.org/10.61838/jaiai.1.2.1.
Adjeisah, Michael, Guohua Liu, Douglas Omwenga Nyabuga, Richard Nuetey Nortey, and Jinling Song. "Pseudotext Injection and Advance Filtering of Low-Resource Corpus for Neural Machine Translation." Computational Intelligence and Neuroscience 2021 (April 11, 2021): 1–10. http://dx.doi.org/10.1155/2021/6682385.
Visser, Ruan, Trieko Grobler, and Marcel Dunaiski. "Insights into Low-Resource Language Modelling: Improving Model Performances for South African Languages." JUCS - Journal of Universal Computer Science 30, no. 13 (2024): 1849–71. https://doi.org/10.3897/jucs.118889.
Xiao, Jingxuan, and Jiawei Wu. "Transfer Learning for Cross-Language Natural Language Processing Models." Journal of Computer Technology and Applied Mathematics 1, no. 3 (2024): 30–38. https://doi.org/10.5281/zenodo.13366733.
Supriya, Musica, U. Dinesh Acharya, and Ashalatha Nayak. "Enhancing Neural Machine Translation Quality for Kannada–Tulu Language Pairs through Transformer Architecture: A Linguistic Feature Integration." Designs 8, no. 5 (2024): 100. http://dx.doi.org/10.3390/designs8050100.
Kadam, Ashlesha V. "Natural Language Understanding of Low-Resource Languages in Voice Assistants: Advancements, Challenges and Mitigation Strategies." International Journal of Language, Literature and Culture 3, no. 5 (2023): 20–23. http://dx.doi.org/10.22161/ijllc.3.5.3.
Zhu, ShaoLin, Xiao Li, YaTing Yang, Lei Wang, and ChengGang Mi. "A Novel Deep Learning Method for Obtaining Bilingual Corpus from Multilingual Website." Mathematical Problems in Engineering 2019 (January 10, 2019): 1–7. http://dx.doi.org/10.1155/2019/7495436.
Tela, Abrhalei, Abraham Woubie, and Ville Hautamäki. "Transferring monolingual model to low-resource language: the case of Tigrinya." Applied Computing and Intelligence 4, no. 2 (2024): 184–94. http://dx.doi.org/10.3934/aci.2024011.
Wu, Yike, Shiwan Zhao, Ying Zhang, Xiaojie Yuan, and Zhong Su. "When Pairs Meet Triplets: Improving Low-Resource Captioning via Multi-Objective Optimization." ACM Transactions on Multimedia Computing, Communications, and Applications 18, no. 3 (2022): 1–20. http://dx.doi.org/10.1145/3492325.
Grönroos, Stig-Arne, Kristiina Jokinen, Katri Hiovain, Mikko Kurimo, and Sami Virpioja. "Low-Resource Active Learning of North Sámi Morphological Segmentation." Septentrio Conference Series, no. 2 (June 17, 2015): 20. http://dx.doi.org/10.7557/5.3465.
Chaka, Chaka. "Currently Available GenAI-Powered Large Language Models and Low-Resource Languages: Any Offerings? Wait Until You See." International Journal of Learning, Teaching and Educational Research 23, no. 12 (2024): 148–73. https://doi.org/10.26803/ijlter.23.12.9.
Murakami, Yohei. "Indonesia Language Sphere: an ecosystem for dictionary development for low-resource languages." Journal of Physics: Conference Series 1192 (March 2019): 012001. http://dx.doi.org/10.1088/1742-6596/1192/1/012001.
Pakray, Partha, Alexander Gelbukh, and Sivaji Bandyopadhyay. "Preface: Special issue on Natural Language Processing applications for low-resource languages." Natural Language Processing 31, no. 2 (2025): 181–82. https://doi.org/10.1017/nlp.2024.34.
Chen, Xilun, Yu Sun, Ben Athiwaratkun, Claire Cardie, and Kilian Weinberger. "Adversarial Deep Averaging Networks for Cross-Lingual Sentiment Classification." Transactions of the Association for Computational Linguistics 6 (December 2018): 557–70. http://dx.doi.org/10.1162/tacl_a_00039.