To see the other types of publications on this topic, follow the link: Neural Network Embeddings.

Journal articles on the topic 'Neural Network Embeddings'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Neural Network Embeddings.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Che, Feihu, Dawei Zhang, Jianhua Tao, Mingyue Niu, and Bocheng Zhao. "ParamE: Regarding Neural Network Parameters as Relation Embeddings for Knowledge Graph Completion." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 03 (2020): 2774–81. http://dx.doi.org/10.1609/aaai.v34i03.5665.

Full text
Abstract:
We study the task of learning entity and relation embeddings in knowledge graphs for predicting missing links. Previous translational models on link prediction make use of translational properties but lack enough expressiveness, while the convolution neural network based model (ConvE) takes advantage of the great nonlinearity fitting ability of neural networks but overlooks translational properties. In this paper, we propose a new knowledge graph embedding model called ParamE which can utilize the two advantages together. In ParamE, head entity embeddings, relation embeddings and tail entity e
APA, Harvard, Vancouver, ISO, and other styles
2

Huang, Junjie, Huawei Shen, Liang Hou, and Xueqi Cheng. "SDGNN: Learning Node Representation for Signed Directed Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 1 (2021): 196–203. http://dx.doi.org/10.1609/aaai.v35i1.16093.

Full text
Abstract:
Network embedding is aimed at mapping nodes in a network into low-dimensional vector representations. Graph Neural Networks (GNNs) have received widespread attention and lead to state-of-the-art performance in learning node representations. However, most GNNs only work in unsigned networks, where only positive links exist. It is not trivial to transfer these models to signed directed networks, which are widely observed in the real world yet less studied. In this paper, we first review two fundamental sociological theories (i.e., status theory and balance theory) and conduct empirical studies o
APA, Harvard, Vancouver, ISO, and other styles
3

Armandpour, Mohammadreza, Patrick Ding, Jianhua Huang, and Xia Hu. "Robust Negative Sampling for Network Embedding." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3191–98. http://dx.doi.org/10.1609/aaai.v33i01.33013191.

Full text
Abstract:
Many recent network embedding algorithms use negative sampling (NS) to approximate a variant of the computationally expensive Skip-Gram neural network architecture (SGA) objective. In this paper, we provide theoretical arguments that reveal how NS can fail to properly estimate the SGA objective, and why it is not a suitable candidate for the network embedding problem as a distinct objective. We show NS can learn undesirable embeddings, as the result of the “Popular Neighbor Problem.” We use the theory to develop a new method “R-NS” that alleviates the problems of NS by using a more intelligent
APA, Harvard, Vancouver, ISO, and other styles
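The negative-sampling (NS) approximation discussed in the abstract above replaces the full Skip-Gram softmax with a binary discrimination between an observed neighbour and a handful of randomly drawn negatives. Below is a minimal PyTorch sketch of that standard NS loss, for orientation only; the tensor names and the assumption that negatives are pre-sampled are illustrative choices, and this is not the paper's proposed R-NS method.

```python
import torch
import torch.nn.functional as F

def negative_sampling_loss(center_vec, context_vec, negative_vecs):
    """Standard skip-gram negative-sampling loss for one (center, context) pair.

    center_vec:    (d,)   embedding of the center node/word
    context_vec:   (d,)   embedding of an observed neighbour
    negative_vecs: (k, d) embeddings of k randomly sampled negatives
    """
    pos = F.logsigmoid(torch.dot(center_vec, context_vec))    # pull the neighbour closer
    neg = F.logsigmoid(-(negative_vecs @ center_vec)).sum()   # push the negatives away
    return -(pos + neg)

# toy usage with random vectors
d, k = 16, 5
loss = negative_sampling_loss(torch.randn(d), torch.randn(d), torch.randn(k, d))
```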
4

Srinidhi, K., T. L. S. Tejaswi, CH Rama Rupesh Kumar, and I. Sai Siva Charan. "An Advanced Sentiment Embeddings with Applications to Sentiment Based Result Analysis." International Journal of Engineering & Technology 7, no. 2.32 (2018): 393. http://dx.doi.org/10.14419/ijet.v7i2.32.15721.

Full text
Abstract:
We propose an advanced, well-trained sentiment analysis based adaptive analysis with word-specific embeddings, dubbed sentiment embeddings. Available word and phrase embedding learning and training algorithms mainly make use of the contexts of terms but ignore the sentiment of texts when analyzing the process of word and text classification. Sentiment analysis on unlike words conveying the same meaning is matched to the corresponding word vector. This problem is bridged by combining encoding of opinion-carrying text with sentiment embeddings of words. But performing sentiment analysis on e-commerce, social n
APA, Harvard, Vancouver, ISO, and other styles
5

Kamath, S., K. G. Karibasappa, Anvitha Reddy, Arati M. Kallur, B. B. Priyanka, and B. P. Bhagya. "Improving the Relation Classification Using Convolutional Neural Network." IOP Conference Series: Materials Science and Engineering 1187, no. 1 (2021): 012004. http://dx.doi.org/10.1088/1757-899x/1187/1/012004.

Full text
Abstract:
Relation extraction has been an emerging research topic in the field of Natural Language Processing. The proposed work classifies the relations among the data considering the semantic relevance of words using word2vec embeddings towards training the convolutional neural network. We intended to use the semantic relevance of the words in the document to enrich the learning of the embeddings for improved classification. We designed a framework to automatically extract the relations between the entities using deep learning techniques. The framework includes pre-processing, extracting the
APA, Harvard, Vancouver, ISO, and other styles
6

Liu, Ruoyu. "Exploring the Impact of Word2Vec Embeddings Across Neural Network Architectures for Sentiment Analysis." Applied and Computational Engineering 97, no. 1 (2024): 93–98. http://dx.doi.org/10.54254/2755-2721/97/2024melb0085.

Full text
Abstract:
Sentiment analysis is crucial for understanding public opinion, gauging customer satisfaction, and making informed business decisions based on the emotional tone of textual data. This study investigates the performance of different Word2Vec-based embedding strategies (static, non-static, and multichannel) for sentiment analysis across various neural network architectures, including Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRUs). Despite the rise of advanced contextual embedding methods such as Bidirectional Encoder Representations fr
APA, Harvard, Vancouver, ISO, and other styles
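The "static" and "non-static" strategies contrasted in the abstract above differ only in whether the pretrained Word2Vec matrix is frozen or fine-tuned during training. The following is a minimal PyTorch sketch of that distinction, assuming a pretrained weight matrix `w2v_matrix` (vocabulary size x embedding dimension) is already available; the tiny text CNN around it is an illustrative stand-in, not the architecture used in the study.

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    def __init__(self, w2v_matrix, num_classes=2, static=True):
        super().__init__()
        # static: freeze the pretrained vectors; non-static: fine-tune them during training
        self.embed = nn.Embedding.from_pretrained(w2v_matrix, freeze=static)
        dim = w2v_matrix.size(1)
        self.conv = nn.Conv1d(dim, 100, kernel_size=3, padding=1)
        self.fc = nn.Linear(100, num_classes)

    def forward(self, token_ids):                        # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)        # (batch, dim, seq_len)
        x = torch.relu(self.conv(x)).max(dim=2).values   # global max pooling over time
        return self.fc(x)

w2v_matrix = torch.randn(10_000, 300)          # stand-in for real Word2Vec weights
model = TextCNN(w2v_matrix, static=False)      # the "non-static" variant
logits = model(torch.randint(0, 10_000, (4, 20)))
```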
7

Liu, Ruoyu. "Exploring the Impact of Word2Vec Embeddings Across Neural Network Architectures for Sentiment Analysis." Applied and Computational Engineering 94, no. 1 (2024): 106–11. http://dx.doi.org/10.54254/2755-2721/94/2024melb0085.

Full text
Abstract:
Sentiment analysis is crucial for understanding public opinion, gauging customer satisfaction, and making informed business decisions based on the emotional tone of textual data. This study investigates the performance of different Word2Vec-based embedding strategies (static, non-static, and multichannel) for sentiment analysis across various neural network architectures, including Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRUs). Despite the rise of advanced contextual embedding methods such as Bidirectional Encoder Representations fr
APA, Harvard, Vancouver, ISO, and other styles
8

Gu, Haishuo, Jinguang Sui, and Peng Chen. "Graph Representation Learning for Street-Level Crime Prediction." ISPRS International Journal of Geo-Information 13, no. 7 (2024): 229. http://dx.doi.org/10.3390/ijgi13070229.

Full text
Abstract:
In contemporary research, the street network emerges as a prominent and recurring theme in crime prediction studies. Meanwhile, graph representation learning shows considerable success, which motivates us to apply the methodology to crime prediction research. In this article, a graph representation learning approach is utilized to derive topological structure embeddings within the street network. Subsequently, a heterogeneous information network that incorporates both the street network and urban facilities is constructed, and embeddings through link prediction tasks are obtained. Finally, the
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Lei, Feng Qian, Jie Chen, and Shu Zhao. "An Unsupervised Rapid Network Alignment Framework via Network Coarsening." Mathematics 11, no. 3 (2023): 573. http://dx.doi.org/10.3390/math11030573.

Full text
Abstract:
Network alignment aims to identify the correspondence of nodes between two or more networks. It is the cornerstone of many network mining tasks, such as cross-platform recommendation and cross-network data aggregation. Recently, with the development of network representation learning techniques, researchers have proposed many embedding-based network alignment methods. The effect is better than traditional methods. However, several issues and challenges remain for network alignment tasks, such as lack of labeled data, mapping across network embedding spaces, and computational efficiency. Based
APA, Harvard, Vancouver, ISO, and other styles
10

Truică, Ciprian-Octavian, Elena-Simona Apostol, Maria-Luiza Șerban, and Adrian Paschke. "Topic-Based Document-Level Sentiment Analysis Using Contextual Cues." Mathematics 9, no. 21 (2021): 2722. http://dx.doi.org/10.3390/math9212722.

Full text
Abstract:
Document-level Sentiment Analysis is a complex task that implies the analysis of large textual content that can incorporate multiple contradictory polarities at the phrase and word levels. Most of the current approaches either represent textual data using pre-trained word embeddings without considering the local context that can be extracted from the dataset, or they detect the overall topic polarity without considering both the local and global context. In this paper, we propose a novel document-topic embedding model, DocTopic2Vec, for document-level polarity detection in large texts by emplo
APA, Harvard, Vancouver, ISO, and other styles
11

Jang, Youngjin, and Harksoo Kim. "Reliable Classification of FAQs with Spelling Errors Using an Encoder-Decoder Neural Network in Korean." Applied Sciences 9, no. 22 (2019): 4758. http://dx.doi.org/10.3390/app9224758.

Full text
Abstract:
To resolve lexical disagreement problems between queries and frequently asked questions (FAQs), we propose a reliable sentence classification model based on an encoder-decoder neural network. The proposed model uses three types of word embeddings: fixed word embeddings for representing domain-independent meanings of words, fine-tuned word embeddings for representing domain-specific meanings of words, and character-level word embeddings for bridging lexical gaps caused by spelling errors. It also uses class embeddings to represent domain knowledge associated with each category. In the experime
APA, Harvard, Vancouver, ISO, and other styles
12

Jin, Zhesheng, and Yunhua Zhang. "A Graph Neural Network-Based Context-Aware Framework for Sentiment Analysis Classification in Chinese Microblogs." Mathematics 13, no. 6 (2025): 997. https://doi.org/10.3390/math13060997.

Full text
Abstract:
Sentiment analysis in Chinese microblogs is challenged by complex syntactic structures and fine-grained sentiment shifts. To address these challenges, a Contextually Enriched Graph Neural Network (CE-GNN) is proposed, integrating self-supervised learning, context-aware sentiment embeddings, and Graph Neural Networks (GNNs) to enhance sentiment classification. First, CE-GNN is pre-trained on a large corpus of unlabeled text through self-supervised learning, where Masked Language Modeling (MLM) and Next Sentence Prediction (NSP) are leveraged to obtain contextualized embeddings. These embeddings
APA, Harvard, Vancouver, ISO, and other styles
13

Guo, Lei, Haoran Jiang, Xiyu Liu, and Changming Xing. "Network Embedding-Aware Point-of-Interest Recommendation in Location-Based Social Networks." Complexity 2019 (November 4, 2019): 1–18. http://dx.doi.org/10.1155/2019/3574194.

Full text
Abstract:
As one of the important techniques to explore unknown places for users, the methods that are proposed for point-of-interest (POI) recommendation have been widely studied in recent years. Compared with traditional recommendation problems, POI recommendations are suffering from more challenges, such as the cold-start and one-class collaborative filtering problems. Many existing studies have focused on how to overcome these challenges by exploiting different types of contexts (e.g., social and geographical information). However, most of these methods only model these contexts as regularization te
APA, Harvard, Vancouver, ISO, and other styles
14

Nguyen, Van Quan, Tien Nguyen Anh, and Hyung-Jeong Yang. "Real-time event detection using recurrent neural network in social sensors." International Journal of Distributed Sensor Networks 15, no. 6 (2019): 155014771985649. http://dx.doi.org/10.1177/1550147719856492.

Full text
Abstract:
We proposed an approach for temporal event detection using deep learning and multi-embedding on a set of text data from social media. First, a convolutional neural network augmented with multiple word-embedding architectures is used as a text classifier for the pre-processing of the input textual data. Second, an event detection model using a recurrent neural network is employed to learn time series data features by extracting temporal information. Recently, convolutional neural networks have been used in natural language processing problems and have obtained excellent results as performing on
APA, Harvard, Vancouver, ISO, and other styles
15

Altuntas, Volkan. "NodeVector: A Novel Network Node Vectorization with Graph Analysis and Deep Learning." Applied Sciences 14, no. 2 (2024): 775. http://dx.doi.org/10.3390/app14020775.

Full text
Abstract:
Network node embedding captures structural and relational information of nodes in the network and allows us to use machine learning algorithms for various prediction tasks on network data that have an inherently complex and disordered structure. Network node embedding should preserve as much information as possible about important network properties where information is stored, such as network structure and node properties, while representing nodes as numerical vectors in a lower-dimensional space than the original higher-dimensional space. Superior node embedding algorithms are a powerful
APA, Harvard, Vancouver, ISO, and other styles
16

Jadon, Anil Kumar, and Suresh Kumar. "Enhancing emotion detection with synergistic combination of word embeddings and convolutional neural networks." Indonesian Journal of Electrical Engineering and Computer Science 35, no. 3 (2024): 1933. http://dx.doi.org/10.11591/ijeecs.v35.i3.pp1933-1941.

Full text
Abstract:
Recognizing emotions in textual data is crucial in a wide range of natural language processing (NLP) applications, from consumer sentiment research to mental health evaluation. The word embedding techniques play a pivotal role in text processing. In this paper, the performance of several well-known word embedding methods is evaluated in the context of emotion recognition. The classification of emotions is further enhanced using a convolutional neural network (CNN) model because of its propensity to capture local patterns and its recent triumphs in text-related tasks. The integration of CNN wit
APA, Harvard, Vancouver, ISO, and other styles
17

Jadon, Anil Kumar, and Suresh Kumar. "Enhancing emotion detection with synergistic combination of word embeddings and convolutional neural networks." Indonesian Journal of Electrical Engineering and Computer Science 35, no. 3 (2024): 1933–41. https://doi.org/10.11591/ijeecs.v35.i3.pp1933-1941.

Full text
Abstract:
Recognizing emotions in textual data is crucial in a wide range of natural language processing (NLP) applications, from consumer sentiment research to mental health evaluation. The word embedding techniques play a pivotal role in text processing. In this paper, the performance of several well-known word embedding methods is evaluated in the context of emotion recognition. The classification of emotions is further enhanced using a convolutional neural network (CNN) model because of its propensity to capture local patterns and its recent triumphs in text-related tasks. The integration of CNN wit
APA, Harvard, Vancouver, ISO, and other styles
18

Jbene, Mourad, Smail Tigani, Saadane Rachid, and Abdellah Chehri. "Deep Neural Network and Boosting Based Hybrid Quality Ranking for e-Commerce Product Search." Big Data and Cognitive Computing 5, no. 3 (2021): 35. http://dx.doi.org/10.3390/bdcc5030035.

Full text
Abstract:
In the age of information overload, customers are overwhelmed with the number of products available for sale. Search engines try to overcome this issue by filtering relevant items to the users’ queries. Traditional search engines rely on the exact match of terms in the query and product meta-data. Recently, deep learning-based approaches grabbed more attention by outperforming traditional methods in many circumstances. In this work, we involve the power of embeddings to solve the challenging task of optimizing product search engines in e-commerce. This work proposes an e-commerce product searc
APA, Harvard, Vancouver, ISO, and other styles
19

Popov, Alexander. "Neural Network Models for Word Sense Disambiguation: An Overview." Cybernetics and Information Technologies 18, no. 1 (2018): 139–51. http://dx.doi.org/10.2478/cait-2018-0012.

Full text
Abstract:
The following article presents an overview of the use of artificial neural networks for the task of Word Sense Disambiguation (WSD). More specifically, it surveys the advances in neural language models in recent years that have resulted in methods for the effective distributed representation of linguistic units. Such representations – word embeddings, context embeddings, sense embeddings – can be effectively applied for WSD purposes, as they encode rich semantic information, especially in conjunction with recurrent neural networks, which are able to capture long-distance relations enc
APA, Harvard, Vancouver, ISO, and other styles
20

Hu, Ganglin, and Jun Pang. "Relation-Aware Weighted Embedding for Heterogeneous Graphs." Information Technology and Control 52, no. 1 (2023): 199–214. http://dx.doi.org/10.5755/j01.itc.52.1.32390.

Full text
Abstract:
Heterogeneous graph embedding, aiming to learn the low-dimensional representations of nodes, is effective in many tasks, such as link prediction, node classification, and community detection. Most existing graph embedding methods conducted on heterogeneous graphs treat the heterogeneous neighbours equally. Although it is possible to get node weights through attention mechanisms, which are mainly developed using expensive recursive message-passing, such mechanisms have difficulty dealing with large-scale networks. In this paper, we propose R-WHGE, a relation-aware weighted embedding model for heterogeneous graphs, to
APA, Harvard, Vancouver, ISO, and other styles
21

Bui-Thi, Danh, Emmanuel Rivière, Pieter Meysman, and Kris Laukens. "Predicting compound-protein interaction using hierarchical graph convolutional networks." PLOS ONE 17, no. 7 (2022): e0258628. http://dx.doi.org/10.1371/journal.pone.0258628.

Full text
Abstract:
Motivation Convolutional neural networks have enabled unprecedented breakthroughs in a variety of computer vision tasks. They have also drawn much attention from other domains, including drug discovery and drug development. In this study, we develop a computational method based on convolutional neural networks to tackle a fundamental question in drug discovery and development, i.e. the prediction of compound-protein interactions based on compound structure and protein sequence. We propose a hierarchical graph convolutional network (HGCN) to encode small molecules. The HGCN aggregates a molecul
APA, Harvard, Vancouver, ISO, and other styles
22

Wang, Bin, Yu Chen, Jinfang Sheng, and Zhengkun He. "Attributed Graph Embedding Based on Attention with Cluster." Mathematics 10, no. 23 (2022): 4563. http://dx.doi.org/10.3390/math10234563.

Full text
Abstract:
Graph embedding is of great significance for the research and analysis of graphs. Graph embedding aims to map nodes in the network to low-dimensional vectors while preserving information in the original graph of nodes. In recent years, the appearance of graph neural networks has significantly improved the accuracy of graph embedding. However, the influence of clusters was not considered in existing graph neural network (GNN)-based methods, so this paper proposes a new method to incorporate the influence of clusters into the generation of graph embedding. We use the attention mechanism to pass
APA, Harvard, Vancouver, ISO, and other styles
23

Boldakov, V. "Emotional Speech Synthesis with Emotion Embeddings." Herald of the Siberian State University of Telecommunications and Informatics, no. 4 (December 18, 2021): 23–31. http://dx.doi.org/10.55648/1998-6920-2021-15-4-23-31.

Full text
Abstract:
Several neural network architectures provide high-quality speech synthesis. In this article, emotional speech synthesis with global style tokens is investigated. A novel method of emotional speech synthesis with emotional text embeddings is described.
APA, Harvard, Vancouver, ISO, and other styles
24

Ota, Kosuke, Keiichiro Shirai, Hidetoshi Miyao, and Minoru Maruyama. "Multimodal Analogy-Based Image Retrieval by Improving Semantic Embeddings." Journal of Advanced Computational Intelligence and Intelligent Informatics 26, no. 6 (2022): 995–1003. http://dx.doi.org/10.20965/jaciii.2022.p0995.

Full text
Abstract:
In this work, we study the application of multimodal analogical reasoning to image retrieval. Multimodal analogy questions are given in the form of tuples of words and images, e.g., “cat”:“dog”::[an image of a cat sitting on a bench]:?, to search for an image of a dog sitting on a bench. Retrieving desired images given these tuples can be seen as a task of finding images whose relation to the query image is close to that of the query words. One way to achieve the task is building a common vector space that exhibits analogical regularities. To learn such an embedding, we propose a quadruple neur
APA, Harvard, Vancouver, ISO, and other styles
25

Eyharabide, Victoria, Imad Eddine Ibrahim Bekkouch, and Nicolae Dragoș Constantin. "Knowledge Graph Embedding-Based Domain Adaptation for Musical Instrument Recognition." Computers 10, no. 8 (2021): 94. http://dx.doi.org/10.3390/computers10080094.

Full text
Abstract:
Convolutional neural networks raised the bar for machine learning and artificial intelligence applications, mainly due to the abundance of data and computations. However, there is not always enough data for training, especially when it comes to historical collections of cultural heritage where the original artworks have been destroyed or damaged over time. Transfer Learning and domain adaptation techniques are possible solutions to tackle the issue of data scarcity. This article presents a new method for domain adaptation based on Knowledge graph embeddings. Knowledge Graph embedding forms a p
APA, Harvard, Vancouver, ISO, and other styles
26

Takase, Sho, Jun Suzuki, and Masaaki Nagata. "Character n-Gram Embeddings to Improve RNN Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5074–82. http://dx.doi.org/10.1609/aaai.v33i01.33015074.

Full text
Abstract:
This paper proposes a novel Recurrent Neural Network (RNN) language model that takes advantage of character information. We focus on character n-grams based on research in the field of word embedding construction (Wieting et al. 2016). Our proposed method constructs word embeddings from character n-gram embeddings and combines them with ordinary word embeddings. We demonstrate that the proposed method achieves the best perplexities on the language modeling datasets: Penn Treebank, WikiText-2, and WikiText-103. Moreover, we conduct experiments on application tasks: machine translation and headli
APA, Harvard, Vancouver, ISO, and other styles
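The composition described in the abstract above (a word vector built from its character n-gram vectors and combined with an ordinary word embedding) can be sketched roughly as follows. Hashing n-grams into a fixed number of buckets and summing their vectors are common simplifications assumed here, not necessarily the exact construction used in the paper.

```python
import torch
import torch.nn as nn

class CharNgramWordEmbedding(nn.Module):
    def __init__(self, vocab_size, num_ngram_buckets, dim, n=3):
        super().__init__()
        self.n = n
        self.num_buckets = num_ngram_buckets
        self.word_embed = nn.Embedding(vocab_size, dim)           # ordinary word embeddings
        self.ngram_embed = nn.Embedding(num_ngram_buckets, dim)   # character n-gram embeddings

    def ngram_ids(self, word):
        padded = f"<{word}>"  # mark word boundaries before extracting n-grams
        grams = [padded[i:i + self.n] for i in range(len(padded) - self.n + 1)]
        return torch.tensor([hash(g) % self.num_buckets for g in grams])

    def forward(self, word, word_id):
        # word vector = ordinary embedding + sum of its character n-gram embeddings
        return self.word_embed(word_id) + self.ngram_embed(self.ngram_ids(word)).sum(dim=0)

emb = CharNgramWordEmbedding(vocab_size=50_000, num_ngram_buckets=100_000, dim=64)
vector = emb("network", torch.tensor(7))  # one composed word vector of size 64
```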
27

Ng, Michael K., Hanrui Wu, and Andy Yip. "Stability and Generalization of Hypergraph Collaborative Networks." Machine Intelligence Research 21, no. 1 (2024): 184–96. http://dx.doi.org/10.1007/s11633-022-1397-1.

Full text
Abstract:
Graph neural networks have been shown to be very effective in utilizing pairwise relationships across samples. Recently, there have been several successful proposals to generalize graph neural networks to hypergraph neural networks to exploit more complex relationships. In particular, the hypergraph collaborative networks yield superior results compared to other hypergraph neural networks for various semi-supervised learning tasks. The collaborative network can provide high quality vertex embeddings and hyperedge embeddings together by formulating them as a joint optimization problem a
APA, Harvard, Vancouver, ISO, and other styles
28

Nguyen, Andre T., Fred Lu, Gary Lopez Munoz, Edward Raff, Charles Nicholas, and James Holt. "Out of Distribution Data Detection Using Dropout Bayesian Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 7 (2022): 7877–85. http://dx.doi.org/10.1609/aaai.v36i7.20757.

Full text
Abstract:
We explore the utility of information contained within a dropout based Bayesian neural network (BNN) for the task of detecting out of distribution (OOD) data. We first show how previous attempts to leverage the randomized embeddings induced by the intermediate layers of a dropout BNN can fail due to the distance metric used. We introduce an alternative approach to measuring embedding uncertainty, and demonstrate how incorporating embedding uncertainty improves OOD data identification across three tasks: image classification, language classification, and malware detection.
APA, Harvard, Vancouver, ISO, and other styles
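The embedding-uncertainty idea in the abstract above can be approximated by keeping dropout active at inference time and measuring how much an intermediate embedding varies across stochastic forward passes. A hedged Monte-Carlo-dropout sketch follows; the variance-based score is one simple choice of uncertainty measure, not the alternative metric proposed in the paper.

```python
import torch
import torch.nn as nn

# toy encoder whose last hidden layer plays the role of the "embedding"
encoder = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(64, 32))

def embedding_uncertainty(model, x, passes=30):
    """Per-input spread of the embedding across stochastic forward passes."""
    model.train()  # keep dropout active even though we are not updating weights
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(passes)])  # (passes, batch, dim)
    return samples.std(dim=0).norm(dim=-1)  # one scalar score per input

scores = embedding_uncertainty(encoder, torch.randn(16, 20))
# higher scores: embeddings that move more under dropout, hence more likely out of distribution
```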
29

Bhopale, P., and Ashish Tiwari. "Leveraging Neural Network Phrase Embedding Model for Query Reformulation in Ad-Hoc Biomedical Information Retrieval." Malaysian Journal of Computer Science 34, no. 2 (2021): 151–70. http://dx.doi.org/10.22452/mjcs.vol34no2.2.

Full text
Abstract:
This study presents a spark enhanced neural network phrase embedding model to leverage query representation for relevant biomedical literature retrieval. Information retrieval for clinical decision support demands high precision. In recent years, word embeddings have been evolved as a solution to such requirements. It represents vocabulary words in low-dimensional vectors in the context of their similar words; however, it is inadequate to deal with semantic phrases or multi-word units. Learning vector embeddings for phrases by maintaining word meanings is a challenging task. This study propose
APA, Harvard, Vancouver, ISO, and other styles
30

Wu, Xueyi, Yuanyuan Xu, Wenjie Zhang, and Ying Zhang. "Billion-Scale Bipartite Graph Embedding: A Global-Local Induced Approach." Proceedings of the VLDB Endowment 17, no. 2 (2023): 175–83. http://dx.doi.org/10.14778/3626292.3626300.

Full text
Abstract:
Bipartite graph embedding (BGE), as the fundamental task in bipartite network analysis, is to map each node to compact low-dimensional vectors that preserve intrinsic properties. The existing solutions towards BGE fall into two groups: metric-based methods and graph neural network-based (GNN-based) methods. The latter typically generates higher-quality embeddings than the former due to the strong representation ability of deep learning. Nevertheless, none of the existing GNN-based methods can handle billion-scale bipartite graphs due to the expensive message passing or complex modelling choice
APA, Harvard, Vancouver, ISO, and other styles
31

Li, Xiaolong, Yang Dong, Yunfei Yi, Zhixun Liang, and Shuqi Yan. "Hypergraph Neural Network for Multimodal Depression Recognition." Electronics 13, no. 22 (2024): 4544. http://dx.doi.org/10.3390/electronics13224544.

Full text
Abstract:
Deep learning-based approaches for automatic depression recognition offer advantages of low cost and high efficiency. However, depression symptoms are challenging to detect and vary significantly between individuals. Traditional deep learning methods often struggle to capture and model these nuanced features effectively, leading to lower recognition accuracy. This paper introduces a novel multimodal depression recognition method, HYNMDR, which utilizes hypergraphs to represent the complex, high-order relationships among patients with depression. HYNMDR comprises two primary components: a tempo
APA, Harvard, Vancouver, ISO, and other styles
32

Alvi, Majdah, Adnan Akhter, Muhammad Bux Alvi, and Noor Fatima. "Emotion Detection and Analysis using Textual Data through Trainable and Pre-trained Word Embedding Methods." VFAST Transactions on Software Engineering 13, no. 2 (2025): 28–43. https://doi.org/10.21015/vtse.v13i2.2115.

Full text
Abstract:
Emotion expression modes play a significant role in human communication. Humans use emotions to convey their state of mind to each other on platforms such as X (formerly Twitter), Facebook, and other online social networks. People often express their emotions using free text, which triggers a vast research area of emotion detection and analysis. This work aims to detect and analyze emotions from unstructured text data. For this purpose, this research study proposes a solution to the problem by building a deep artificial neural network model using trainable and pre-trained word embedding method
APA, Harvard, Vancouver, ISO, and other styles
33

Gao, Yan, Yandong Wang, Patrick Wang, and Lei Gu. "Medical Named Entity Extraction from Chinese Resident Admit Notes Using Character and Word Attention-Enhanced Neural Network." International Journal of Environmental Research and Public Health 17, no. 5 (2020): 1614. http://dx.doi.org/10.3390/ijerph17051614.

Full text
Abstract:
The resident admit notes (RANs) in electronic medical records (EMRs) are first-hand information to study the patient’s condition. Medical entity extraction from RANs is an important task to get disease information for medical decision-making. For Chinese electronic medical records, each medical entity contains not only word information but also rich character information. Effective combination of words and characters is very important for medical entity extraction. We propose a medical entity recognition model based on a character and word attention-enhanced (CWAE) neural network for Chinese RANs
APA, Harvard, Vancouver, ISO, and other styles
34

Hagad, Juan Lorenzo, Tsukasa Kimura, Ken-ichi Fukui, and Masayuki Numao. "Learning Subject-Generalized Topographical EEG Embeddings Using Deep Variational Autoencoders and Domain-Adversarial Regularization." Sensors 21, no. 5 (2021): 1792. http://dx.doi.org/10.3390/s21051792.

Full text
Abstract:
Two of the biggest challenges in building models for detecting emotions from electroencephalography (EEG) devices are the relatively small amount of labeled samples and the strong variability of signal feature distributions between different subjects. In this study, we propose a context-generalized model that tackles the data constraints and subject variability simultaneously using a deep neural network architecture optimized for normally distributed subject-independent feature embeddings. Variational autoencoders (VAEs) at the input level allow the lower feature layers of the model to be trai
APA, Harvard, Vancouver, ISO, and other styles
35

Delianidi, Marina, and Konstantinos Diamantaras. "KT-Bi-GRU: Student Performance Prediction with a Bi-Directional Recurrent Knowledge Tracing Neural Network." Journal of Educational Data Mining 15, no. 2 (2023): 1–21. https://doi.org/10.5281/zenodo.7808087.

Full text
Abstract:
Student performance is affected by their knowledge which changes dynamically over time. Therefore, employing recurrent neural networks (RNN), which are known to be very good in dynamic time series prediction, can be a suitable approach for student performance prediction. We propose such a neural network architecture containing two modules: (i) a dynamic sub-network including a recurrent Bi-GRU layer used for knowledge state estimation, (ii) a non-dynamic, feed-forward sub-network for predicting answer correctness based on the current question and current student knowledge state. The model modi
APA, Harvard, Vancouver, ISO, and other styles
36

Li, Wenli, and Gang Wu. "One-shot Based Knowledge Graph Embedded Neural Architecture Search Algorithm." Frontiers in Computing and Intelligent Systems 3, no. 3 (2023): 1–5. http://dx.doi.org/10.54097/fcis.v3i3.7982.

Full text
Abstract:
The quality of embeddings is crucial for downstream tasks in knowledge graphs. Researchers usually introduce neural network architecture search into knowledge graph embedding for the automatic construction of appropriate neural networks for each dataset. An existing approach is to divide the search space into a macro search space and a micro search space. The search strategy for the micro space is based on a one-shot weight-sharing strategy, but it leads to all the information obtained from the previous supernet training being discarded, and the advantages of the one-shot algorithm are not fully utilize
APA, Harvard, Vancouver, ISO, and other styles
37

Hu, Jia Cheng, Roberto Cavicchioli, and Alessandro Capotondi. "Embeddings hidden layers learning for neural network compression." Neural Networks 191 (November 2025): 107794. https://doi.org/10.1016/j.neunet.2025.107794.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Peng, Hao, Qing Ke, Ceren Budak, Daniel M. Romero, and Yong-Yeol Ahn. "Neural embeddings of scholarly periodicals reveal complex disciplinary organizations." Science Advances 7, no. 17 (2021): eabb9004. http://dx.doi.org/10.1126/sciadv.abb9004.

Full text
Abstract:
Understanding the structure of knowledge domains is one of the foundational challenges in the science of science. Here, we propose a neural embedding technique that leverages the information contained in the citation network to obtain continuous vector representations of scientific periodicals. We demonstrate that our periodical embeddings encode nuanced relationships between periodicals and the complex disciplinary and interdisciplinary structure of science, allowing us to make cross-disciplinary analogies between periodicals. Furthermore, we show that the embeddings capture meaningful “axes”
APA, Harvard, Vancouver, ISO, and other styles
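The cross-disciplinary analogies mentioned in the abstract above follow the usual embedding-arithmetic recipe: answer a : b :: c : ? by finding the vector nearest (by cosine similarity) to v(b) - v(a) + v(c). A small NumPy sketch under that assumption; the journal names and random vectors below are placeholders, not the paper's data.

```python
import numpy as np

def analogy(embeddings, a, b, c, topk=1):
    """Return the key(s) whose vector is closest (cosine) to v(b) - v(a) + v(c)."""
    target = embeddings[b] - embeddings[a] + embeddings[c]
    target = target / np.linalg.norm(target)
    scores = {
        name: float(vec @ target / np.linalg.norm(vec))
        for name, vec in embeddings.items()
        if name not in (a, b, c)
    }
    return sorted(scores, key=scores.get, reverse=True)[:topk]

# toy example with random placeholder vectors for five periodicals
rng = np.random.default_rng(0)
journals = {name: rng.normal(size=32) for name in ["PRL", "PRB", "Cell", "Neuron", "JACS"]}
print(analogy(journals, "PRL", "PRB", "Cell"))
```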
39

Zhang, Kainan, Zhipeng Cai, and Daehee Seo. "Privacy-Preserving Federated Graph Neural Network Learning on Non-IID Graph Data." Wireless Communications and Mobile Computing 2023 (February 3, 2023): 1–13. http://dx.doi.org/10.1155/2023/8545101.

Full text
Abstract:
Since the concept of federated learning (FL) was proposed by Google in 2017, many applications have been combined with FL technology due to its outstanding performance in data integration, computing performance, privacy protection, etc. However, most traditional federated learning-based applications focus on image processing and natural language processing with few achievements in graph neural networks due to the graph’s nonindependent identically distributed (IID) nature. Representation learning on graph-structured data generates graph embedding, which helps machines understand graphs effecti
APA, Harvard, Vancouver, ISO, and other styles
40

Kim, Harang, and Hyun Min Song. "Lightweight IDS Framework Using Word Embeddings for In-Vehicle Network Security." Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications 15, no. 2 (2024): 1–13. http://dx.doi.org/10.58346/jowua.2024.i2.001.

Full text
Abstract:
As modern vehicle systems evolve into advanced cyber-physical systems, vehicle vulnerability to cyber threats has significantly increased. This paper discusses the need for advanced security in the Controller Area Network (CAN), which currently lacks security features. We propose a novel Intrusion Detection System (IDS) utilizing word embedding techniques from Natural Language Processing (NLP) for effective sequential pattern representations to improve intrusion detection in CAN traffic. This method transforms CAN identifiers into multi-dimensional vectors, enabling the model to capture comple
APA, Harvard, Vancouver, ISO, and other styles
41

Özkaya Eren, Ayşegül, and Mustafa Sert. "Audio Captioning with Composition of Acoustic and Semantic Information." International Journal of Semantic Computing 15, no. 02 (2021): 143–60. http://dx.doi.org/10.1142/s1793351x21400018.

Full text
Abstract:
Generating audio captions is a new research area that combines audio and natural language processing to create meaningful textual descriptions for audio clips. To address this problem, previous studies mostly use the encoder–decoder-based models without considering semantic information. To fill this gap, we present a novel encoder–decoder architecture using bi-directional Gated Recurrent Units (BiGRU) with audio and semantic embeddings. We extract semantic embedding by obtaining subjects and verbs from the audio clip captions and combine these embedding with audio embedding to feed the BiGRU-b
APA, Harvard, Vancouver, ISO, and other styles
42

Ye, Yutong, Xiang Lian, and Mingsong Chen. "Efficient Exact Subgraph Matching via GNN-Based Path Dominance Embedding." Proceedings of the VLDB Endowment 17, no. 7 (2024): 1628–41. http://dx.doi.org/10.14778/3654621.3654630.

Full text
Abstract:
The classic problem of exact subgraph matching returns those subgraphs in a large-scale data graph that are isomorphic to a given query graph, which has gained increasing importance in many real-world applications such as social network analysis, knowledge graph discovery in the Semantic Web, bibliographical network mining, and so on. In this paper, we propose a novel and effective graph neural network (GNN)-based path embedding framework (GNN-PE), which allows efficient exact subgraph matching without introducing false dismissals. Unlike traditional GNN-based graph embeddings that only produc
APA, Harvard, Vancouver, ISO, and other styles
43

Tzougas, George, and Konstantin Kutzkov. "Enhancing Logistic Regression Using Neural Networks for Classification in Actuarial Learning." Algorithms 16, no. 2 (2023): 99. http://dx.doi.org/10.3390/a16020099.

Full text
Abstract:
We developed a methodology for the neural network boosting of logistic regression aimed at learning an additional model structure from the data. In particular, we constructed two classes of neural network-based models: shallow–dense neural networks with one hidden layer and deep neural networks with multiple hidden layers. Furthermore, several advanced approaches were explored, including the combined actuarial neural network approach, embeddings and transfer learning. The model training was achieved by minimizing either the deviance or the cross-entropy loss functions, leading to fourteen neur
APA, Harvard, Vancouver, ISO, and other styles
44

Croce, Danilo, Daniele Rossini, and Roberto Basili. "Neural embeddings: accurate and readable inferences based on semantic kernels." Natural Language Engineering 25, no. 4 (2019): 519–41. http://dx.doi.org/10.1017/s1351324919000238.

Full text
Abstract:
Sentence embeddings are the suitable input vectors for the neural learning of a number of inferences about content and meaning. Similarity estimation, classification, emotional characterization of sentences as well as pragmatic tasks, such as question answering or dialogue, have largely demonstrated the effectiveness of vector embeddings to model semantics. Unfortunately, most of the above decisions are epistemologically opaque as for the limited interpretability of the acquired neural models based on the involved embeddings. We think that any effective approach to meaning representati
APA, Harvard, Vancouver, ISO, and other styles
45

Zhou, Silin, Jing Li, Hao Wang, Shuo Shang, and Peng Han. "GRLSTM: Trajectory Similarity Computation with Graph-Based Residual LSTM." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 4 (2023): 4972–80. http://dx.doi.org/10.1609/aaai.v37i4.25624.

Full text
Abstract:
The computation of trajectory similarity is a crucial task in many spatial data analysis applications. However, existing methods have been designed primarily for trajectories in Euclidean space, which overlooks the fact that real-world trajectories are often generated on road networks. This paper addresses this gap by proposing a novel framework, called GRLSTM (Graph-based Residual LSTM). To jointly capture the properties of trajectories and road networks, the proposed framework incorporates knowledge graph embedding (KGE), graph neural network (GNN), and the residual network into the multi-la
APA, Harvard, Vancouver, ISO, and other styles
46

Zaiter, Louai. "Towards Breast Cancer Diagnosis Using Multiple Mammography Views." Technium: Romanian Journal of Applied Sciences and Technology 29 (April 25, 2025): 43–47. https://doi.org/10.47577/technium.v29i.12739.

Full text
Abstract:
This study introduces a novel computer-aided diagnosis system to diagnose breast cancer using two mammography views as input, i.e., MLO and CC. The pipeline consists of a convolutional autoencoder that is trained to extract features from the different mammogram views, and a one-dimensional convolutional neural network to classify the input embeddings into two classes, i.e., benign or malignant. We compare the one-dimensional convolutional neural network classification results with a support vector machine trained on the same latent embeddings. We conclude that the combination of autoencoders and one-di
APA, Harvard, Vancouver, ISO, and other styles
47

Chang, Zhihao, Linzhu Yu, Yanchao Xu, and Wentao Hu. "Neural Embeddings for kNN Search in Biological Sequence." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 1 (2024): 38–45. http://dx.doi.org/10.1609/aaai.v38i1.27753.

Full text
Abstract:
Biological sequence nearest neighbor search plays a fundamental role in bioinformatics. To alleviate the pain of quadratic complexity for conventional distance computation, neural distance embeddings, which project sequences into geometric space, have been recognized as a promising paradigm. To maintain the distance order between sequences, these models all deploy triplet loss and use intuitive methods to select a subset of triplets for training from a vast selection space. However, we observed that such training often enables models to distinguish only a fraction of distance orders, leaving o
APA, Harvard, Vancouver, ISO, and other styles
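The triplet loss that the abstract above says these distance-embedding models rely on is the standard margin formulation: the anchor sequence should end up closer to the positive than to the negative by at least a fixed margin. A minimal PyTorch sketch follows, with a toy character-level encoder standing in for the actual sequence model (an assumption for illustration, not the paper's architecture).

```python
import torch
import torch.nn as nn

class SeqEncoder(nn.Module):
    """Toy encoder: embed sequence symbols and mean-pool into a fixed-size vector."""
    def __init__(self, alphabet_size=4, dim=32):
        super().__init__()
        self.embed = nn.Embedding(alphabet_size, dim)

    def forward(self, seq_ids):                 # seq_ids: (batch, length)
        return self.embed(seq_ids).mean(dim=1)  # (batch, dim)

encoder = SeqEncoder()
triplet_loss = nn.TripletMarginLoss(margin=1.0)

# random integer-encoded sequences standing in for real biological sequences
anchor, positive, negative = (torch.randint(0, 4, (8, 50)) for _ in range(3))
loss = triplet_loss(encoder(anchor), encoder(positive), encoder(negative))
loss.backward()  # gradients push anchor-positive pairs closer than anchor-negative pairs
```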
48

Xu, You-Wei, Hong-Jun Zhang, Kai Cheng, Xiang-Lin Liao, Zi-Xuan Zhang, and Yun-Bo Li. "Knowledge graph embedding with entity attributes using hypergraph neural networks." Intelligent Data Analysis 26, no. 4 (2022): 959–75. http://dx.doi.org/10.3233/ida-216007.

Full text
Abstract:
Knowledge graph embedding is aimed at capturing the semantic information of entities by modeling the structural information between entities. For long-tail entities which lack sufficient structural information, general knowledge graph embedding models often show relatively low performance in link prediction. In order to solve such problems, this paper proposes a general knowledge graph embedding framework to learn the structural information as well as the attribute information of the entities simultaneously. Under this framework, a H-AKRL (Hypergraph Neural Networks based Attribute-embodied Kn
APA, Harvard, Vancouver, ISO, and other styles
49

Koshel, E. "Нейронно-мережевий підхід до неперервного вкладення одновимірних потоків даних для аналізу часових рядів в реальному часі" [A Neural Network Approach to Continuous Embedding of Univariate Data Streams for Real-Time Time Series Analysis]. System technologies 2, no. 151 (2024): 92–101. http://dx.doi.org/10.34185/1562-9945-2-151-2024-08.

Full text
Abstract:
Univariate time series analysis is a universal problem that arises in various science and engineering fields and the approaches and methods developed around this problem are diverse and numerous. These methods, however, often require the univariate data stream to be transformed into a sequence of higher-dimensional vectors (embeddings). In this article, we explore the existing embedding methods, examine their capabilities to perform in real-time, and propose a new approach that couples the classical methods with the neural network-based ones to yield results that are better in both accuracy an
APA, Harvard, Vancouver, ISO, and other styles
50

Zhong, Fengzhe, Yan Liu, Lian Liu, Guangsheng Zhang, and Shunran Duan. "DEDGCN: Dual Evolving Dynamic Graph Convolutional Network." Security and Communication Networks 2022 (May 10, 2022): 1–11. http://dx.doi.org/10.1155/2022/6945397.

Full text
Abstract:
With the wide application of graph data in many fields, the research of graph representation learning technology has become the focus of scholars’ attention. In particular, dynamic graph representation learning is an important part of solving the problem of changing graphs in reality. On the one hand, most dynamic graph representation methods focus either on graph structure changes or node embedding changes, ignoring the internal relationship. On the other hand, most dynamic graph neural networks require learning node embeddings from specific tasks, resulting in poor universality of node embeddings and
APA, Harvard, Vancouver, ISO, and other styles