Academic literature on the topic 'Quantum natural language processing'

Consult the lists of relevant articles, books, theses, conference papers, book chapters, and reports on the topic 'Quantum natural language processing.'

For every source listed below, a bibliographic reference can be generated in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, and others. Where the metadata allows, the full text of a publication can be downloaded as a PDF and its abstract read online.

Journal articles on the topic "Quantum natural language processing"

1

Guarasci, Raffaele, Giuseppe De Pietro, and Massimo Esposito. "Quantum Natural Language Processing: Challenges and Opportunities." Applied Sciences 12, no. 11 (2022): 5651. http://dx.doi.org/10.3390/app12115651.

Abstract:
The meeting between Natural Language Processing (NLP) and Quantum Computing has been very successful in recent years, leading to the development of several approaches to the so-called Quantum Natural Language Processing (QNLP). This is a hybrid field in which the potential of quantum mechanics is exploited and applied to critical aspects of language processing, involving different NLP tasks. Approaches developed so far span from those that demonstrate the quantum advantage only at the theoretical level to the ones implementing algorithms on quantum hardware. This paper aims to list the approaches developed so far, categorizing them by type, i.e., theoretical work and those implemented on classical or quantum hardware; by task, i.e., general purpose such as syntax-semantic representation or specific NLP tasks, like sentiment analysis or question answering; and by the resource used in the evaluation phase, i.e., whether a benchmark dataset or a custom one has been used. The advantages offered by QNLP are discussed, both in terms of performance and methodology, and some considerations about the possible usage of QNLP approaches in place of state-of-the-art deep learning-based ones are given.
2

Rai, Anshuman. "A Review Article on Quantum Natural Language Processing." International Journal for Research in Applied Science and Engineering Technology 10, no. 1 (2022): 1588–94. http://dx.doi.org/10.22214/ijraset.2022.40103.

Abstract: Quantum Natural Language Processing is the implementation of NLP algorithms on quantum hardware or alternatively on hybrid quantum-classical hardware. NLP has been a heavily researched and implemented topic of the past few decades and the most recent developments using new techniques and the power of deep learning have made huge strides in the field. But for all this new development, there is a looming possibility of greater achievements in the form of the rising field of quantum computing which is yet to see its potential come to fruition. A gaping hole in the implementation process of NLP systems is the computing power required to train deep learning and Natural Language Processing models which makes the development of such models time consuming and power hungry. The huge leap in parallel computing power that quantum computers provide gives us immense opportunities to accelerate the training of deep and complex models. Such techniques will help organizations with access to quantum hardware to be able to use quantum circuits to either train a complete model or use a classical system like the norm but outsource all of the most computationally heavy part of the process to quantum hardware which will provide exponential speed up to the development of conversational AI models. Keywords: Quantum computing, Natural Language Processing, Quantum Machine Learning, Quantum Natural Language Processing, Noisy Intermediate-Scale Quantum systems, Lambeq, hybrid classical-quantum systems, DisCoCat
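The hybrid classical-quantum training loop sketched in the abstract above can be illustrated with a toy example: a classical optimizer tunes the rotation angles of classically simulated one-qubit circuits whose measurement probabilities serve as the model's predictions. Everything here (the labels, the single-qubit circuit, the finite-difference optimizer) is invented for illustration; real systems such as the lambeq/DisCoCat pipelines named in the keywords derive the circuit structure from the sentence's grammar and run it on quantum hardware or simulators.

```python
import numpy as np

# Hypothetical sentence labels: 1.0 = positive, 0.0 = negative.
labels = np.array([1.0, 0.0, 1.0])
# One trainable rotation angle per "sentence"; started slightly off zero to
# avoid a flat gradient at theta = 0.
thetas = np.full(3, 0.1)

def predict(thetas):
    # Simulate RY(theta)|0> followed by a Z-basis measurement:
    # P(outcome 1) = sin^2(theta / 2) acts as the model output.
    return np.sin(thetas / 2.0) ** 2

def loss(thetas):
    return np.mean((predict(thetas) - labels) ** 2)

# Classical outer loop (finite-difference gradient descent) driving the
# simulated quantum inner part, i.e. the hybrid scheme the abstract alludes to.
lr, eps = 1.0, 1e-4
for _ in range(300):
    grad = np.array([(loss(thetas + eps * np.eye(3)[i])
                      - loss(thetas - eps * np.eye(3)[i])) / (2 * eps)
                     for i in range(3)])
    thetas -= lr * grad

print("final loss:", round(float(loss(thetas)), 4))
print("predictions:", predict(thetas).round(2), "labels:", labels)
```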
3

Meichanetzidis, Konstantinos, Stefano Gogioso, Giovanni de Felice, Nicolò Chiappori, Alexis Toumi, and Bob Coecke. "Quantum Natural Language Processing on Near-Term Quantum Computers." Electronic Proceedings in Theoretical Computer Science 340 (September 6, 2021): 213–29. http://dx.doi.org/10.4204/eptcs.340.11.

4

Zeng, William, and Bob Coecke. "Quantum Algorithms for Compositional Natural Language Processing." Electronic Proceedings in Theoretical Computer Science 221 (August 2, 2016): 67–75. http://dx.doi.org/10.4204/eptcs.221.8.

5

Kavya, J., K. Krishnaveni, K. Namratha, D. Lakshmi Gowri, and M. Madhavi. "Semantic Analysis of Auto-Generated Sentences Using Quantum Natural Language Processing." International Journal for Research in Applied Science and Engineering Technology 12, no. 11 (2024): 1469–72. http://dx.doi.org/10.22214/ijraset.2024.65380.

Abstract: Quantum Natural Language Processing (QNLP) represents a pioneering approach to understanding and analyzing natural language by leveraging the principles of quantum computing. This project aims to explore the semantic analysis of autogenerated sentences using QNLP techniques. Traditional Natural Language Processing (NLP) methods have achieved significant milestones in language understanding and generation; however, they often struggle with the intricacies of context, ambiguity, and the vast computational resources required for complex tasks. QNLP offers a novel paradigm by utilizing the superposition and entanglement properties of quantum states, potentially revolutionizing how semantic meaning is extracted and processed. The primary objective of this project is to develop a QNLP framework capable of performing semantic analysis on autogenerated sentences. We will investigate the application of quantum algorithms and quantum machine learning models to decode the semantic structure and context of sentences generated by language models. This involves encoding textual data into quantum states, applying quantum gates and circuits to manipulate these states, and measuring the outcomes to interpret semantic information. By integrating quantum computing with NLP, this project seeks to achieve higher accuracy and efficiency in understanding complex language patterns. The anticipated outcomes include a comprehensive evaluation of QNLP's performance in semantic analysis, a comparison with classical NLP methods, and potential advancements in language processing tasks such as sentiment analysis, language translation, and text summarization. This project will contribute to the growing body of research at the intersection of quantum computing and artificial intelligence, paving the way for innovative solutions in the field of natural language processing.
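The encode-process-measure pipeline described in the abstract above can be caricatured in a few lines of classical simulation: a bag-of-words vector is amplitude-encoded into a two-qubit state, a gate is applied, and measurement probabilities are read out. The vector and the choice of gate are invented for illustration and do not come from the cited project.

```python
import numpy as np

# Hypothetical 4-dimensional bag-of-words counts for a short sentence.
counts = np.array([3.0, 1.0, 0.0, 2.0])

# Amplitude encoding: normalise so the 4 entries form a valid 2-qubit state
# (squared amplitudes sum to 1).
state = counts / np.linalg.norm(counts)

# A simple "processing" step: Hadamard on the first qubit, identity on the second.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
U = np.kron(H, np.eye(2))
state = U @ state

# "Measurement": squared amplitudes give the probability of each 2-bit outcome;
# a QNLP model would interpret these statistics as semantic features.
probs = np.abs(state) ** 2
for bits, p in zip(["00", "01", "10", "11"], probs):
    print(bits, round(float(p), 3))
print("probabilities sum to", round(float(probs.sum()), 3))
```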
6

Wan, Meiyan. "Quantum mechanics and statistical physics: Novel frameworks for enhancing natural language processing." Applied and Computational Engineering 102, no. 1 (2024): 1–6. http://dx.doi.org/10.54254/2755-2721/102/20240912.

Abstract. This article explores the pioneering application of principles from quantum mechanics and statistical mechanics to the field of natural language processing (NLP). By drawing analogies between physical phenomena such as quantum entanglement, phase transitions, and statistical ensembles, and linguistic concepts like semantic relationships, language use dynamics, and lexical diversity, we offer a novel perspective on language analysis and processing. Quantum linguistic models, leveraging the intricacies of entanglement and quantum probability, provide a framework for understanding complex semantic networks and enhancing computational efficiency through quantum computing. Meanwhile, statistical mechanics inspires models for capturing lexical diversity and understanding the evolution of language patterns, akin to phase transitions in physical systems. This interdisciplinary approach not only deepens our understanding of linguistic phenomena but also introduces advanced mathematical and computational techniques to improve NLP tasks.
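One simple way to make the statistical-ensemble analogy for lexical diversity concrete is to treat word frequencies as occupation probabilities and compute their Shannon entropy. The sentence below is made up; the calculation only illustrates the kind of quantity the article discusses, not its actual models.

```python
import math
from collections import Counter

# Toy text: each word type plays the role of a "microstate" and its relative
# frequency the occupation probability of the ensemble.
text = "the cat sat on the mat and the cat slept"
counts = Counter(text.split())
total = sum(counts.values())

# Shannon entropy in bits: higher entropy means word usage is spread more
# evenly, i.e. greater lexical diversity.
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(f"types: {len(counts)}, tokens: {total}, entropy: {entropy:.3f} bits")
```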
7

O’Riordan, Lee J., Myles Doyle, Fabio Baruffa, and Venkatesh Kannan. "A hybrid classical-quantum workflow for natural language processing." Machine Learning: Science and Technology 2, no. 1 (2020): 015011. http://dx.doi.org/10.1088/2632-2153/abbd2e.

8

Yao, Ben, Prayag Tiwari, and Qiuchi Li. "Self-supervised pre-trained neural network for quantum natural language processing." Neural Networks 184 (April 2025): 107004. https://doi.org/10.1016/j.neunet.2024.107004.

9

Tangpanitanon, Jirawat, Chanatip Mangkang, Pradeep Bhadola, Yuichiro Minato, Dimitris G. Angelakis, and Thiparat Chotibut. "Explainable natural language processing with matrix product states." New Journal of Physics 24, no. 5 (2022): 053032. http://dx.doi.org/10.1088/1367-2630/ac6232.

Abstract:
Despite empirical successes of recurrent neural networks (RNNs) in natural language processing (NLP), theoretical understanding of RNNs is still limited due to intrinsically complex non-linear computations. We systematically analyze RNNs’ behaviors in a ubiquitous NLP task, the sentiment analysis of movie reviews, via the mapping between a class of RNNs called recurrent arithmetic circuits (RACs) and a matrix product state. Using the von Neumann entanglement entropy (EE) as a proxy for information propagation, we show that single-layer RACs possess a maximum information propagation capacity, reflected by the saturation of the EE. Enlarging the bond dimension beyond the EE saturation threshold does not increase model prediction accuracies, so a minimal model that best estimates the data statistics can be inferred. Although the saturated EE is smaller than the maximum EE allowed by the area law, our minimal model still achieves ∼99% training accuracies in realistic sentiment analysis data sets. Thus, low EE is not a warrant against the adoption of single-layer RACs for NLP. Contrary to a common belief that long-range information propagation is the main source of RNNs’ successes, we show that single-layer RACs harness high expressiveness from the subtle interplay between the information propagation and the word vector embeddings. Our work sheds light on the phenomenology of learning in RACs, and more generally on the explainability of RNNs for NLP, using tools from many-body quantum physics.
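The von Neumann entanglement entropy used in this paper as a proxy for information propagation is computed, for any pure state, from the Schmidt (singular value) decomposition across a bipartition. The sketch below evaluates it for a random six-qubit state; it only illustrates the quantity itself, not the paper's mapping from recurrent arithmetic circuits to matrix product states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random pure state of n sites with local dimension d (qubits here).
n, d = 6, 2
psi = rng.normal(size=d**n) + 1j * rng.normal(size=d**n)
psi /= np.linalg.norm(psi)

def entanglement_entropy(psi, cut):
    # Reshape the state into a (left block) x (right block) matrix; the squared
    # singular values are the Schmidt coefficients of the bipartition.
    mat = psi.reshape(d**cut, d**(n - cut))
    p = np.linalg.svd(mat, compute_uv=False) ** 2
    p = p[p > 1e-12]
    # Von Neumann entanglement entropy, in natural-log units (nats).
    return float(-np.sum(p * np.log(p)))

for cut in range(1, n):
    print(f"cut after site {cut}: EE = {entanglement_entropy(psi, cut):.3f}")
```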
10

Skipper, Katherine. "Ask me anything: Muhammad Hamza Waseem." Physics World 38, no. 5 (2025): 45ii. https://doi.org/10.1088/2058-7058/38/05/31.

Abstract:
Muhammad Hamza Waseem is a research scientist at Quantinuum, where he works on quantum natural language processing as well as quantum physics education and outreach. His other research interests include quantum foundations, applied category theory and mathematical linguistics.

Dissertations / Theses on the topic "Quantum natural language processing"

1

Chen, Joseph C. H. "Quantum computation and natural language processing." [S.l.] : [s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=965581020.

2

Pang, Bo. "Handwriting Chinese character recognition based on quantum particle swarm optimization support vector machine." Thesis, University of Macau, 2018. http://umaclib3.umac.mo/record=b3950620.

3

Kartsaklis, Dimitrios. "Compositional distributional semantics with compact closed categories and Frobenius algebras." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:1f6647ef-4606-4b85-8f3b-c501818780f2.

Abstract:
The provision of compositionality in distributional models of meaning, where a word is represented as a vector of co-occurrence counts with every other word in the vocabulary, offers a solution to the fact that no text corpus, regardless of its size, is capable of providing reliable co-occurrence statistics for anything but very short text constituents. The purpose of a compositional distributional model is to provide a function that composes the vectors for the words within a sentence, in order to create a vectorial representation that reflects its meaning. Using the abstract mathematical framework of category theory, Coecke, Sadrzadeh and Clark showed that this function can directly depend on the grammatical structure of the sentence, providing an elegant mathematical counterpart of the formal semantics view. The framework is general and compositional but stays abstract to a large extent. This thesis contributes to ongoing research related to the above categorical model in three ways: Firstly, I propose a concrete instantiation of the abstract framework based on Frobenius algebras (joint work with Sadrzadeh). The theory improves shortcomings of previous proposals, extends the coverage of the language, and is supported by experimental work that improves existing results. The proposed framework describes a new class of compositional models that find intuitive interpretations for a number of linguistic phenomena. Secondly, I propose and evaluate in practice a new compositional methodology which explicitly deals with the different levels of lexical ambiguity (joint work with Pulman). A concrete algorithm is presented, based on the separation of vector disambiguation from composition in an explicit prior step. Extensive experimental work shows that the proposed methodology indeed results in more accurate composite representations for the framework of Coecke et al. in particular and every other class of compositional models in general. As a last contribution, I formalize the explicit treatment of lexical ambiguity in the context of the categorical framework by resorting to categorical quantum mechanics (joint work with Coecke). In the proposed extension, the concept of a distributional vector is replaced with that of a density matrix, which compactly represents a probability distribution over the potential different meanings of the specific word. Composition takes the form of quantum measurements, leading to interesting analogies between quantum physics and linguistics.
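A minimal numerical caricature of two ideas from the abstract above: in the categorical model a transitive verb is a tensor contracted with its subject and object vectors (here with a one-dimensional sentence space, so the result is a scalar), and an ambiguous word can be represented as a density matrix mixing its sense vectors. All numbers are randomly generated for illustration and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 4  # toy noun-space dimension

subj = rng.normal(size=dim)           # subject noun vector
obj = rng.normal(size=dim)            # object noun vector
verb = rng.normal(size=(dim, dim))    # transitive verb as a noun-by-noun tensor
                                      # (sentence space taken to be 1-dimensional)

# Compositional meaning of "subj verb obj": contract the verb tensor with both
# noun vectors, in the spirit of the Coecke-Sadrzadeh-Clark framework.
sentence_score = subj @ verb @ obj
print("sentence score:", round(float(sentence_score), 3))

# Lexical ambiguity as a density matrix: a probability-weighted mixture of the
# word's unit-normalised sense vectors, e.g. two hypothetical senses of "bank".
senses = [v / np.linalg.norm(v) for v in rng.normal(size=(2, dim))]
weights = [0.7, 0.3]
rho = sum(w * np.outer(v, v) for w, v in zip(weights, senses))
print("density matrix trace (should be 1):", round(float(np.trace(rho)), 3))
```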
4

Matsubara, Shigeki. "Corpus-based Natural Language Processing." INTELLIGENT MEDIA INTEGRATION NAGOYA UNIVERSITY / COE, 2004. http://hdl.handle.net/2237/10355.

5

Smith, Sydney. "Approaches to Natural Language Processing." Scholarship @ Claremont, 2018. http://scholarship.claremont.edu/cmc_theses/1817.

Abstract:
This paper explores topic modeling through the example text of Alice in Wonderland. It explores both singular value decomposition as well as non-negative matrix factorization as methods for feature extraction. The paper goes on to explore methods for partially supervised implementation of topic modeling through introducing themes. A large portion of the paper also focuses on implementation of these techniques in python as well as visualizations of the results which use a combination of python, html and java script along with the d3 framework. The paper concludes by presenting a mixture of SVD, NMF and partially-supervised NMF as a possible way to improve topic modeling.
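Both feature-extraction methods the thesis compares, truncated SVD and non-negative matrix factorization, are available in scikit-learn. The sketch below runs them on a few invented sentences rather than on Alice in Wonderland and assumes scikit-learn is installed; it is meant only to show the mechanics, not to reproduce the thesis's results or visualizations.

```python
from sklearn.decomposition import NMF, TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

# Tiny stand-in corpus (the thesis itself uses Alice in Wonderland).
docs = [
    "the queen shouted off with her head",
    "alice followed the white rabbit down the hole",
    "the rabbit checked his pocket watch",
    "the queen of hearts played croquet",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)          # documents x terms matrix
terms = tfidf.get_feature_names_out()

for name, model in [("SVD", TruncatedSVD(n_components=2, random_state=0)),
                    ("NMF", NMF(n_components=2, random_state=0))]:
    model.fit(X)
    print(name, "topics:")
    for i, component in enumerate(model.components_):
        top = component.argsort()[-3:][::-1]   # three highest-weighted terms
        print(f"  topic {i}:", ", ".join(terms[j] for j in top))
```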
6

Strandberg, Aron, and Patrik Karlström. "Processing Natural Language for the Spotify API : Are sophisticated natural language processing algorithms necessary when processing language in a limited scope?" Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-186867.

Abstract:
Knowing whether you can implement something complex in a simple way in your application is always of interest. A natural language interface is something that could theoretically be implemented in a lot of applications, but the complexity of most natural language processing algorithms is a limiting factor. The problem explored in this paper is whether a simpler algorithm that doesn’t make use of convoluted statistical models and machine learning can be good enough. We implemented two algorithms, one utilizing Spotify’s own search and one with a more accurate, offline search. With the best precision we could muster being 81%, at an average of 2.28 seconds per query, this is not a viable solution for a complete and satisfactory user experience. Further work could push the performance into an acceptable range.
7

Knight, Sylvia Frances. "Natural language processing for aerospace documentation." Thesis, University of Cambridge, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.621395.

8

Naphtal, Rachael (Rachael M. ). "Natural language processing based nutritional application." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/100640.

Abstract:
The ability to accurately and efficiently track nutritional intake is a powerful tool in combating obesity and other food related diseases. Currently, many methods used for this task are time consuming or easily abandoned; however, a natural language based application that converts spoken text to nutritional information could be a convenient and effective solution. This thesis describes the creation of an application that translates spoken food diaries into nutritional database entries. It explores different methods for solving the problem of converting brands, descriptions and food item names into entries in nutritional databases. Specifically, we constructed a cache of over 4,000 food items, and also created a variety of methods to allow refinement of database mappings. We also explored methods of dealing with ambiguous quantity descriptions and the mapping of spoken quantity values to numerical units. When assessed by 500 users entering their daily meals on Amazon Mechanical Turk, the system was able to map 83.8% of the correctly interpreted spoken food items to relevant nutritional database entries. It was also able to find a logical quantity for 92.2% of the correct food entries. Overall, this system shows a significant step towards the intelligent conversion of spoken food diaries to actual nutritional feedback.
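The core mapping step described above, turning a freely phrased food item into a canonical database entry, can be prototyped with plain fuzzy string matching from the Python standard library. The mini-database and threshold below are invented; the thesis's actual system uses a much larger cache and additional refinement methods.

```python
import difflib

# Hypothetical nutritional database keyed by canonical food names.
food_db = {
    "peanut butter": {"kcal_per_100g": 588},
    "whole wheat bread": {"kcal_per_100g": 247},
    "banana": {"kcal_per_100g": 89},
    "greek yogurt": {"kcal_per_100g": 59},
}

def map_spoken_item(spoken):
    # Pick the closest canonical name; a full system would also resolve brands,
    # synonyms and quantities, as the thesis discusses.
    matches = difflib.get_close_matches(spoken.lower(), list(food_db),
                                        n=1, cutoff=0.5)
    return (matches[0], food_db[matches[0]]) if matches else (None, None)

for phrase in ["peanutbutter", "a slice of whole wheat bred", "greek yoghurt"]:
    name, info = map_spoken_item(phrase)
    print(f"{phrase!r} -> {name!r} {info}")
```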
9

Eriksson, Simon. "COMPARING NATURAL LANGUAGE PROCESSING TO STRUCTURED QUERY LANGUAGE ALGORITHMS." Thesis, Umeå universitet, Institutionen för datavetenskap, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-163310.

Abstract:
Using natural language processing to create Structured Query Language (SQL) queries has many benefits in theory. Even though SQL is an expressive and powerful language, it requires certain technical knowledge to use. An interface effectively utilizing natural language processing would instead allow the user to communicate with the SQL database as if they were communicating with another human being. In this paper I compare how two of the currently most advanced open source algorithms (TypeSQL and SyntaxSQL) in this field can understand advanced SQL. I show that SyntaxSQL is significantly more accurate but makes some sacrifices in execution time compared to TypeSQL.
10

Kesarwani, Vaibhav. "Automatic Poetry Classification Using Natural Language Processing." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/37309.

Abstract:
Poetry, as a special form of literature, is crucial for computational linguistics. It has a high density of emotions, figures of speech, vividness, creativity, and ambiguity. Poetry poses a much greater challenge for the application of Natural Language Processing algorithms than any other literary genre. Our system establishes a computational model that classifies poems based on similarity features like rhyme, diction, and metaphor. For rhyme analysis, we investigate the methods used to classify poems based on rhyme patterns. First, the overview of different types of rhymes is given along with the detailed description of detecting rhyme type and sub-types by the application of a pronunciation dictionary on our poetry dataset. We achieve an accuracy of 96.51% in identifying rhymes in poetry by applying a phonetic similarity model. Then we achieve a rhyme quantification metric RhymeScore based on the matching phonetic transcription of each poem. We also develop an application for the visualization of this quantified RhymeScore as a scatter plot in 2 or 3 dimensions. For diction analysis, we investigate the methods used to classify poems based on diction. First the linguistic quantitative and semantic features that constitute diction are enumerated. Then we investigate the methodology used to compute these features from our poetry dataset. We also build a word embeddings model on our poetry dataset with 1.5 million words in 100 dimensions and do a comparative analysis with GloVe embeddings. Metaphor is a part of diction, but as it is a very complex topic in its own right, we address it as a stand-alone issue and develop several methods for it. Previous work on metaphor detection relies on either rule-based or statistical models, none of them applied to poetry. Our methods focus on metaphor detection in a poetry corpus, but we test on non-poetry data as well. We combine rule-based and statistical models (word embeddings) to develop a new classification system. Our first metaphor detection method achieves a precision of 0.759 and a recall of 0.804 in identifying one type of metaphor in poetry, by using a Support Vector Machine classifier with various types of features. Furthermore, our deep learning model based on a Convolutional Neural Network achieves a precision of 0.831 and a recall of 0.836 for the same task. We also develop an application for generic metaphor detection in any type of natural text.
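The rhyme-detection idea above, treating two words as rhyming when their pronunciations agree from the last stressed vowel onward, can be prototyped with NLTK's CMU Pronouncing Dictionary. This is a simplified heuristic sketch, not the thesis's RhymeScore metric, and it assumes nltk and its cmudict corpus are available.

```python
import nltk
from nltk.corpus import cmudict

nltk.download("cmudict", quiet=True)   # one-time download of the pronunciation data
pron = cmudict.dict()

def rhyme_part(word):
    # Use the first listed pronunciation and keep everything from the last
    # stressed vowel (stress marker 1 or 2) to the end of the word.
    entries = pron.get(word.lower())
    if not entries:
        return None
    phones = entries[0]
    stressed = [i for i, p in enumerate(phones) if p[-1] in "12"]
    return tuple(phones[stressed[-1]:]) if stressed else tuple(phones)

def rhymes(a, b):
    ra, rb = rhyme_part(a), rhyme_part(b)
    return ra is not None and ra == rb

for pair in [("night", "light"), ("mind", "find"), ("night", "table")]:
    print(pair, "->", rhymes(*pair))
```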

Books on the topic "Quantum natural language processing"

1

Filgueiras, M., L. Damas, N. Moreira, and A. P. Tomás, eds. Natural Language Processing. Springer Berlin Heidelberg, 1991. http://dx.doi.org/10.1007/3-540-53678-7.

2

Lee, Raymond S. T. Natural Language Processing. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-1999-4.

3

Lee, Raymond. Natural Language Processing. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-3208-4.

4

Kulkarni, Akshay, and Adarsha Shivananda. Natural Language Processing Recipes. Apress, 2021. http://dx.doi.org/10.1007/978-1-4842-7351-7.

5

Kulkarni, Akshay, Adarsha Shivananda, and Anoosh Kulkarni. Natural Language Processing Projects. Apress, 2022. http://dx.doi.org/10.1007/978-1-4842-7386-9.

6

Søgaard, Anders. Explainable Natural Language Processing. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-031-02180-0.

7

Tapsai, Chalermpol, Herwig Unger, and Phayung Meesad. Thai Natural Language Processing. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-56235-9.

8

Oflazer, Kemal, and Murat Saraçlar, eds. Turkish Natural Language Processing. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-90165-7.

9

Sharkey, Noel, ed. Connectionist Natural Language Processing. Springer Netherlands, 1992. http://dx.doi.org/10.1007/978-94-011-2624-3.

10

Kulkarni, Akshay, and Adarsha Shivananda. Natural Language Processing Recipes. Apress, 2019. http://dx.doi.org/10.1007/978-1-4842-4267-4.


Book chapters on the topic "Quantum natural language processing"

1

Pimpalshende, Anjusha, Madhu Bala Myneni, and Sarat Chandra Nayak. "Quantum Natural Language Processing: Revolutionizing Language Processing." In Intelligent Systems Reference Library. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-89905-8_4.

2

Miranda, Eduardo Reck, Richie Yeung, Anna Pearson, Konstantinos Meichanetzidis, and Bob Coecke. "A Quantum Natural Language Processing Approach to Musical Intelligence." In Quantum Computer Music. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-13909-3_13.

3

Ganguly, Srinjoy, Sai Nandan Morapakula, and Luis Gerardo Ayala Bertel. "An Introduction to Quantum Natural Language Processing (QNLP)." In Coded Leadership. CRC Press, 2022. http://dx.doi.org/10.1201/9781003244660-1.

4

Kayal, Mrinmoy, Mohinikanta Sahoo, Jayadeep Pati, and Ranjan Kumar Behera. "Quantum-Inspired Aspect-Based Sentiment Analysis Using Natural Language Processing." In Intelligent Systems Reference Library. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-89905-8_8.

5

Díaz-Ortiz, J. Ismael, Axel Villanueva, and Francisco Delgado. "Strongly Entangling Neural Network: Quantum-Classical Hybrid Model for Quantum Natural Language Processing." In Springer Proceedings in Mathematics & Statistics. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-52965-8_40.

6

Akter, Mst Shapna, Hossain Shahriar, and Zakirul Alam Bhuiya. "Automated Vulnerability Detection in Source Code Using Quantum Natural Language Processing." In Communications in Computer and Information Science. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-0272-9_6.

7

Rajashekharaiah, K. M. M., Satyadhyan Chickerur, Goutam Hegde, Subrahmanya L. Bhat, and Shubham Annappa Sali. "Sentence Classification Using Quantum Natural Language Processing and Comparison of Optimization Methods." In Communications in Computer and Information Science. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-35644-5_7.

8

Bouakba, Yousra, and Hacene Belhadef. "Quantum Natural Language Processing: A New and Promising Way to Solve NLP Problems." In Communications in Computer and Information Science. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-28540-0_17.

9

Alphin Ezhil Manuel, M. L., and K. Raja. "Experimental Analysis on Quantum Machine Learning Models for Part-of-Speech Tagging in Natural Language Processing." In Lecture Notes in Networks and Systems. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-3352-4_19.

10

Peral-García, David, Juan Cruz-Benito, and Francisco José García-Peñalvo. "Development of Algorithms and Methods for the Simulation and Improvement in the Quantum Natural Language Processing Area." In Proceedings TEEM 2022: Tenth International Conference on Technological Ecosystems for Enhancing Multiculturality. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-0942-1_130.


Conference papers on the topic "Quantum natural language processing"

1

Silver, Daniel, Aditya Ranjan, Rakesh Achutha, Tirthak Patel, and Devesh Tiwari. "LEXIQL: Quantum Natural Language Processing on NISQ-era Machines." In SC24: International Conference for High Performance Computing, Networking, Storage and Analysis. IEEE, 2024. https://doi.org/10.1109/sc41406.2024.00073.

2

Uotila, Valter. "Quantum Natural Language Processing Application for Estimating SQL Query Metrics." In 2024 IEEE International Conference on Quantum Computing and Engineering (QCE). IEEE, 2024. https://doi.org/10.1109/qce60285.2024.10321.

3

Kumar, Keerthipati, Mohammad Haider Syed, Saumya Bhargava, Lalit Lalitav Mohakud, Mohammad Serajuddin, and Melanie Lourens. "Natural Language Processing: Bridging the Gap between Human Language and Machine Understanding." In 2024 International Conference on Trends in Quantum Computing and Emerging Business Technologies (TQCEBT). IEEE, 2024. http://dx.doi.org/10.1109/tqcebt59414.2024.10545250.

4

Yan, Kehuan, and Wenjun Wang. "Quantum-inspired Neural Network with Lindblad Master Equation for Sentiment Analysis." In 2024 6th International Conference on Natural Language Processing (ICNLP). IEEE, 2024. http://dx.doi.org/10.1109/icnlp60986.2024.10692739.

5

Xu, Wenduan, Stephen Clark, Douglas Brown, Gabriel Matos, and Konstantinos Meichanetzidis. "Quantum Recurrent Architectures for Text Classification." In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.emnlp-main.1000.

6

Liu, Tingyu, Yingying Wei, and Jingtao Wang. "Research on Distributional Compositional Categorical Model in Both Classical and Quantum Natural Language Processing." In 2024 IEEE/ACIS 27th International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD). IEEE, 2024. http://dx.doi.org/10.1109/snpd61259.2024.10673943.

7

Thota, Srirajarajeswari, and Sandeep Kumar Dash. "A Comparison of Traditional Natural Language Processing Methods for Emotion Recognition Using Quantum Computing." In 2025 International Conference on Machine Learning and Autonomous Systems (ICMLAS). IEEE, 2025. https://doi.org/10.1109/icmlas64557.2025.10968522.

8

Sato, Yudai, and Takayuki Kawahara. "Quantum Circuit Learning Enhanced by Dynamic Circuit." In 2024 19th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP). IEEE, 2024. https://doi.org/10.1109/isai-nlp64410.2024.10799377.

9

Bharti, Suman, Dan Chia-Tien Lo, and Yong Shi. "Enhancing Contextual Understanding in Knowledge Graphs: Integration of Quantum Natural Language Processing with Neo4j LLM Knowledge Graph." In 2024 IEEE International Conference on Big Data (BigData). IEEE, 2024. https://doi.org/10.1109/bigdata62323.2024.10826002.

10

Philip, Shynu, Christine Ann Thomas, and Nitish Bhimrao Kumbhar. "Advancing the Evaluation of Oral Fluency in English for Specific Classrooms: Harnessing Natural Language Processing Tools for Enhanced Assessment." In 2024 International Conference on Trends in Quantum Computing and Emerging Business Technologies (TQCEBT). IEEE, 2024. http://dx.doi.org/10.1109/tqcebt59414.2024.10545199.


Reports on the topic "Quantum natural language processing"

1

Pasupuleti, Murali Krishna. AI-Driven Automation: Transforming Industry 5.0 withMachine Learning and Advanced Technologies. National Education Services, 2025. https://doi.org/10.62311/nesx/rr225.

Abstract:
This article delves into the transformative role of artificial intelligence (AI) and machine learning (ML) in shaping Industry 5.0, a paradigm centered on human-machine collaboration, sustainability, and resilient industrial ecosystems. Beginning with the evolution from Industry 4.0 to Industry 5.0, it examines core AI technologies, including predictive analytics, natural language processing, and computer vision, which drive advancements in manufacturing, quality control, and adaptive logistics. Key discussions include the integration of collaborative robots (cobots) that enhance human productivity, AI-driven sustainability practices for energy and resource efficiency, and predictive maintenance models that reduce downtime. Addressing ethical challenges, the article highlights the importance of data privacy, unbiased algorithms, and the environmental responsibility of intelligent automation. Through case studies across manufacturing, healthcare, and energy sectors, readers gain insights into real-world applications of AI and ML, showcasing their impact on efficiency, quality, and safety. The article concludes with future directions, emphasizing emerging technologies like quantum computing, human-machine synergy, and the sustainable vision for Industry 5.0, where intelligent automation not only drives innovation but also aligns with ethical and social values for a resilient industrial future. Keywords: Industry 5.0, intelligent automation, AI, machine learning, sustainability, human-machine collaboration, cobots, predictive maintenance, quality control, ethical AI, data privacy, Industry 4.0, computer vision, natural language processing, energy efficiency, adaptive logistics, environmental responsibility, industrial ecosystems, quantum computing.
2

Steedman, Mark. Natural Language Processing. Defense Technical Information Center, 1994. http://dx.doi.org/10.21236/ada290396.

3

Bergeaud, Antonin, Adam Jaffe, and Dimitris Papanikolaou. Natural Language Processing and Innovation Research. National Bureau of Economic Research, 2025. https://doi.org/10.3386/w33821.

4

Tratz, Stephen C. Arabic Natural Language Processing System Code Library. Defense Technical Information Center, 2014. http://dx.doi.org/10.21236/ada603814.

5

Wilks, Yorick, Michael Coombs, Roger T. Hartley, and Dihong Qiu. Active Knowledge Structures for Natural Language Processing. Defense Technical Information Center, 1991. http://dx.doi.org/10.21236/ada245893.

6

Firpo, M. Natural Language Processing as a Discipline at LLNL. Office of Scientific and Technical Information (OSTI), 2005. http://dx.doi.org/10.2172/15015192.

7

Anderson, Thomas. State of the Art of Natural Language Processing. Defense Technical Information Center, 1987. http://dx.doi.org/10.21236/ada188112.

8

Hobbs, Jerry R., Douglas E. Appelt, John Bear, Mabry Tyson, and David Magerman. Robust Processing of Real-World Natural-Language Texts. Defense Technical Information Center, 1991. http://dx.doi.org/10.21236/ada258837.

9

Neal, Jeannette G., Elissa L. Feit, Douglas J. Funke, and Christine A. Montgomery. An Evaluation Methodology for Natural Language Processing Systems. Defense Technical Information Center, 1992. http://dx.doi.org/10.21236/ada263301.

10

Lehnert, Wendy G. Using Case-Based Reasoning in Natural Language Processing. Defense Technical Information Center, 1993. http://dx.doi.org/10.21236/ada273538.
