Academic literature on the topic 'Pretrained language model'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Pretrained language model.'
Next to every source in the list of references is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Pretrained language model"
Lee, Chanhee, Kisu Yang, Taesun Whang, Chanjun Park, Andrew Matteson, and Heuiseok Lim. "Exploring the Data Efficiency of Cross-Lingual Post-Training in Pretrained Language Models." Applied Sciences 11, no. 5 (2021): 1974. http://dx.doi.org/10.3390/app11051974.
De Coster, Mathieu, and Joni Dambre. "Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation." Information 13, no. 5 (2022): 220. http://dx.doi.org/10.3390/info13050220.
Kuwana, Ayato, Atsushi Oba, Ranto Sawai, and Incheon Paik. "Automatic Taxonomy Classification by Pretrained Language Model." Electronics 10, no. 21 (2021): 2656. http://dx.doi.org/10.3390/electronics10212656.
Lee, Eunchan, Changhyeon Lee, and Sangtae Ahn. "Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models." Applied Sciences 12, no. 9 (2022): 4522. http://dx.doi.org/10.3390/app12094522.
Wang, Canjun, Zhao Li, Tong Chen, Ruishuang Wang, and Zhengyu Ju. "Research on the Application of Prompt Learning Pretrained Language Model in Machine Translation Task with Reinforcement Learning." Electronics 12, no. 16 (2023): 3391. http://dx.doi.org/10.3390/electronics12163391.
Chen, Zhi, Yuncong Liu, Lu Chen, Su Zhu, Mengyue Wu, and Kai Yu. "OPAL: Ontology-Aware Pretrained Language Model for End-to-End Task-Oriented Dialogue." Transactions of the Association for Computational Linguistics 11 (2023): 68–84. http://dx.doi.org/10.1162/tacl_a_00534.
Xu, Canwen, and Julian McAuley. "A Survey on Model Compression and Acceleration for Pretrained Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (2023): 10566–75. http://dx.doi.org/10.1609/aaai.v37i9.26255.
Gu, Yang, and Yanke Hu. "Extractive Summarization with Very Deep Pretrained Language Model." International Journal of Artificial Intelligence & Applications 10, no. 2 (2019): 27–32. http://dx.doi.org/10.5121/ijaia.2019.10203.
Qi, Xianglong, Yang Gao, Ruibin Wang, Minghua Zhao, Shengjia Cui, and Mohsen Mortazavi. "Learning High-Order Semantic Representation for Intent Classification and Slot Filling on Low-Resource Language via Hypergraph." Mathematical Problems in Engineering 2022 (September 16, 2022): 1–16. http://dx.doi.org/10.1155/2022/8407713.
Won, Hyun-Sik, Min-Ji Kim, Dohyun Kim, Hee-Soo Kim, and Kang-Min Kim. "University Student Dropout Prediction Using Pretrained Language Models." Applied Sciences 13, no. 12 (2023): 7073. http://dx.doi.org/10.3390/app13127073.
Dissertations / Theses on the topic "Pretrained language model"
Pelloin, Valentin. "La compréhension de la parole dans les systèmes de dialogues humain-machine à l'heure des modèles pré-entraînés" [Spoken Language Understanding in Human-Machine Dialogue Systems in the Era of Pretrained Models]. PhD diss., Le Mans, 2024. http://www.theses.fr/2024LEMA1002.
Kulhánek, Jonáš. "End-to-end dialogové systémy s předtrénovanými jazykovými modely" [End-to-End Dialogue Systems with Pretrained Language Models]. Master's thesis, 2021. http://www.nusl.cz/ntk/nusl-448383.
Books on the topic "Pretrained language model"
Olgiati, Andrea. Pretrain Vision and Large Language Models in Python: End-to-End Techniques for Building and Deploying Foundation Models on AWS. Walter de Gruyter GmbH, 2023.
Olgiati, Andrea. Pretrain Vision and Large Language Models in Python: End-to-End Techniques for Building and Deploying Foundation Models on AWS. Packt Publishing Limited, 2023.
Book chapters on the topic "Pretrained language model"
Oba, Atsushi, Incheon Paik, and Ayato Kuwana. "Automatic Classification for Ontology Generation by Pretrained Language Model." In Advances and Trends in Artificial Intelligence. Artificial Intelligence Practices. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-79457-6_18.
Yan, Hao, and Yuhong Guo. "Lightweight Unsupervised Federated Learning with Pretrained Vision Language Model." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-82240-7_11.
Kong, Jun, Jin Wang, and Xuejie Zhang. "Accelerating Pretrained Language Model Inference Using Weighted Ensemble Self-distillation." In Natural Language Processing and Chinese Computing. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88480-2_18.
Luo, Xudong, Zhiqi Deng, Kaili Sun, and Pingping Lin. "An Emotion-Aware Human-Computer Negotiation Model Powered by Pretrained Language Model." In Knowledge Science, Engineering and Management. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-5501-1_19.
Ozgul, Gizem, Şeyma Derdiyok, and Fatma Patlar Akbulut. "Turkish Sign Language Recognition Using a Fine-Tuned Pretrained Model." In Communications in Computer and Information Science. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-50920-9_6.
Liu, Chenxu, Hongjie Fan, and Junfei Liu. "Span-Based Nested Named Entity Recognition with Pretrained Language Model." In Database Systems for Advanced Applications. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-73197-7_42.
Kucharavy, Andrei. "Adapting LLMs to Downstream Applications." In Large Language Models in Cybersecurity. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-54827-7_2.
Wei, Siwen, Chi Yuan, Zixuan Li, and Huaiyu Wang. "An Unsupervised Clinical Acronym Disambiguation Method Based on Pretrained Language Model." In Communications in Computer and Information Science. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-9864-7_18.
Li, Sheng, and Jiyi Li. "Correction while Recognition: Combining Pretrained Language Model for Taiwan-Accented Speech Recognition." In Artificial Neural Networks and Machine Learning – ICANN 2023. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-44195-0_32.
Silveira, Raquel, Caio Ponte, Vitor Almeida, Vládia Pinheiro, and Vasco Furtado. "LegalBert-pt: A Pretrained Language Model for the Brazilian Portuguese Legal Domain." In Intelligent Systems. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-45392-2_18.
Conference papers on the topic "Pretrained language model"
Zakizadeh, Mahdi, and Mohammad Taher Pilehvar. "Gender Encoding Patterns in Pretrained Language Model Representations." In Proceedings of the 5th Workshop on Trustworthy NLP (TrustNLP 2025). Association for Computational Linguistics, 2025. https://doi.org/10.18653/v1/2025.trustnlp-main.31.
Haijie, Shen, and Madhavi Devaraj. "Enhancing Sentiment Analysis with Language Model Adaptation: Leveraging Large-Scale Pretrained Models." In Ninth International Symposium on Advances in Electrical, Electronics, and Computer Engineering (ISAEECE 2024), edited by Pierluigi Siano and Wenbing Zhao. SPIE, 2024. http://dx.doi.org/10.1117/12.3033425.
Ding, Shuai, Yifei Xu, Zhuang Lu, Fan Tang, Tong Li, and Jingguo Ge. "Power Microservices Troubleshooting by Pretrained Language Model with Multi-source Data." In 2024 IEEE International Symposium on Parallel and Distributed Processing with Applications (ISPA). IEEE, 2024. https://doi.org/10.1109/ispa63168.2024.00241.
Zeng, Zhuo, and Yu Wang. "Construction of Pretrained Language Model Based on Big Data of Power Equipment." In Fourth International Conference on Electronics Technology and Artificial Intelligence (ETAI 2025), edited by Shaohua Luo and Akash Saxena. SPIE, 2025. https://doi.org/10.1117/12.3068739.
Ginn, Michael, Lindia Tjuatja, Taiqi He, et al. "GlossLM: A Massively Multilingual Corpus and Pretrained Model for Interlinear Glossed Text." In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.emnlp-main.683.
Hou, Lijuan, Hanyan Qin, Xiankun Zhang, and Yiying Zhang. "Protein Function Prediction Based on the Pretrained Language Model ESM2 and Graph Convolutional Networks." In 2024 IEEE International Symposium on Parallel and Distributed Processing with Applications (ISPA). IEEE, 2024. https://doi.org/10.1109/ispa63168.2024.00242.
Hindi, Iman I., and Gheith A. Abandah. "Improving Arabic Dialect Text Classification by Finetuning a Pretrained Token-Free Large Language Model." In 2025 1st International Conference on Computational Intelligence Approaches and Applications (ICCIAA). IEEE, 2025. https://doi.org/10.1109/icciaa65327.2025.11013550.
Katsumata, Satoru, and Mamoru Komachi. "Stronger Baselines for Grammatical Error Correction Using a Pretrained Encoder-Decoder Model." In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing. Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.aacl-main.83.
Safikhani, Parisa, and David Broneske. "AutoML Meets Hugging Face: Domain-Aware Pretrained Model Selection for Text Classification." In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 4: Student Research Workshop). Association for Computational Linguistics, 2025. https://doi.org/10.18653/v1/2025.naacl-srw.45.
Zhang, Kaiwen, Feiyu Su, Yixiang Huang, Yanming Li, Fengqi Wu, and Yuhan Mao. "The Application of Fine-Tuning on Pretrained Language Model in Information Extraction for Fault Knowledge Graphs." In 2024 9th International Conference on Intelligent Computing and Signal Processing (ICSP). IEEE, 2024. http://dx.doi.org/10.1109/icsp62122.2024.10743881.