
Journal articles on the topic 'Graph attention network'


Consult the top 50 journal articles for your research on the topic 'Graph attention network.'


1

Wu, Nan, and Chaofan Wang. "Ensemble Graph Attention Networks." Transactions on Machine Learning and Artificial Intelligence 10, no. 3 (2022): 29–41. http://dx.doi.org/10.14738/tmlai.103.12399.

Abstract:
Graph neural networks have demonstrated their success in many applications on graph-structured data. Many efforts have been devoted to elaborating new network architectures and learning algorithms over the past decade. The exploration of applying ensemble learning techniques to enhance existing graph algorithms has been overlooked. In this work, we propose a simple, generic bagging-based ensemble learning strategy which is applicable to any backbone graph model. We then propose two ensemble graph neural network models, Ensemble-GAT and Ensemble-HetGAT, by applying the ensemble strategy to the …
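The generic bagging strategy described in this abstract can be sketched independently of any particular backbone. Below is a minimal NumPy illustration; `train_stub_model` is a hypothetical stand-in for training a real graph model such as a GAT, and all names are illustrative rather than the authors' code:

```python
import numpy as np

def train_stub_model(X, y, idx, classes):
    """Hypothetical stand-in for one backbone model: memorizes per-class
    mean features over the given training indices and predicts by the
    nearest class mean, returned as a one-hot 'probability' vector."""
    means = []
    for c in classes:
        sel = idx[y[idx] == c]
        means.append(X[sel].mean(axis=0) if len(sel) else X.mean(axis=0))
    means = np.stack(means)

    def predict(X):
        d = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
        proba = np.zeros((len(X), len(means)))
        proba[np.arange(len(X)), d.argmin(axis=1)] = 1.0  # hard one-hot vote
        return proba

    return predict

def bagging_ensemble(X, y, train_idx, n_models=5, seed=0):
    """Train n_models backbones on bootstrap resamples of the training
    nodes and average their soft predictions (bagging)."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y[train_idx])
    models = [
        train_stub_model(
            X, y, rng.choice(train_idx, size=len(train_idx), replace=True), classes
        )
        for _ in range(n_models)
    ]
    return lambda X: np.mean([m(X) for m in models], axis=0)
```

The same wrapper would apply to any model exposing a soft `predict`; the bootstrap resampling plus vote-averaging is the whole strategy.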
2

Wang, Bin, Yu Chen, Jinfang Sheng, and Zhengkun He. "Attributed Graph Embedding Based on Attention with Cluster." Mathematics 10, no. 23 (2022): 4563. http://dx.doi.org/10.3390/math10234563.

Abstract:
Graph embedding is of great significance for the research and analysis of graphs. Graph embedding aims to map the nodes of a network to low-dimensional vectors while preserving the information the original graph holds about those nodes. In recent years, the appearance of graph neural networks has significantly improved the accuracy of graph embedding. However, the influence of clusters was not considered in existing graph neural network (GNN)-based methods, so this paper proposes a new method to incorporate the influence of clusters into the generation of graph embeddings. We use the attention mechanism to pass …
3

Li, Yu, Yuan Tian, Jiawei Zhang, and Yi Chang. "Learning Signed Network Embedding via Graph Attention." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 4772–79. http://dx.doi.org/10.1609/aaai.v34i04.5911.

Abstract:
Learning low-dimensional representations of graphs (i.e., network embedding) plays a critical role in network analysis and facilitates many downstream tasks. Recently, graph convolutional networks (GCNs) have revolutionized the field of network embedding and led to state-of-the-art performance in network analysis tasks such as link prediction and node classification. Nevertheless, most of the existing GCN-based network embedding methods are proposed for unsigned networks. However, in the real world, some networks are signed, where the links are annotated with different polarities, …
4

Murzin, M. V., I. A. Kulikov, and N. A. Zhukova. "Methods for Constructing Graph Neural Networks." LETI Transactions on Electrical Engineering & Computer Science 17, no. 10 (2024): 40–48. https://doi.org/10.32603/2071-8985-2024-17-10-40-48.

Abstract:
The paper discusses an approach to classifying graph neural networks in terms of basic concepts. In addition, the fundamentals of convolutional graph neural networks, graph attention networks, recurrent graph neural networks, graph autoencoders, and spatial-temporal graph neural networks are discussed. Using the Cora dataset as an example, a comparison is carried out of neural network models implemented in the TensorFlow and PyTorch libraries, as well as a graph attention network model for the task of classifying the nodes of a knowledge graph. The efficiency of using graph attention …
5

Sheng, Jinfang, Yufeng Zhang, Bin Wang, and Yaoxing Chang. "MGATs: Motif-Based Graph Attention Networks." Mathematics 12, no. 2 (2024): 293. http://dx.doi.org/10.3390/math12020293.

Abstract:
In recent years, graph convolutional neural networks (GCNs) have become a popular research topic due to their outstanding performance in various complex network data mining tasks. However, current research on graph neural networks lacks understanding of the high-order structural features of networks, focusing mostly on node features and first-order neighbor features. This article proposes two new models, MGAT and MGATv2, by introducing high-order structural motifs that frequently appear in networks and combining them with graph attention mechanisms. By introducing a mixed information matrix …
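For orientation, the single-head attention update that GAT-style models (including motif-based variants such as MGAT) build on can be sketched in NumPy. This is an illustrative rendering of the standard GAT layer, not the authors' code:

```python
import numpy as np

def gat_layer(H, A, W, a, negative_slope=0.2):
    """Single-head graph attention layer.
    H: (N, F) node features; A: (N, N) adjacency, which must include
    self-loops; W: (F, F') projection; a: (2F',) attention vector.
    Computes e_ij = LeakyReLU(a^T [W h_i || W h_j]), softmax-normalizes
    over each node's neighbors, and aggregates."""
    Z = H @ W                                           # (N, F') projected features
    Fp = Z.shape[1]
    # a^T [z_i || z_j] decomposes into a source score plus a target score
    e = (Z @ a[:Fp])[:, None] + (Z @ a[Fp:])[None, :]
    e = np.where(e > 0, e, negative_slope * e)          # LeakyReLU
    e = np.where(A > 0, e, -np.inf)                     # attend only to neighbors
    alpha = np.exp(e - e.max(axis=1, keepdims=True))    # numerically stable softmax
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ Z                                    # h'_i = sum_j alpha_ij z_j
```

A nonlinearity and multi-head concatenation would normally follow; they are omitted to keep the attention computation itself visible.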
6

Wang, Wanru, Yuwei Lv, Yonggang Wen, and Xuemei Sun. "Rumor Detection Based on Knowledge Enhancement and Graph Attention Network." Discrete Dynamics in Nature and Society 2022 (October 6, 2022): 1–12. http://dx.doi.org/10.1155/2022/6257658.

Abstract:
Presently, most existing rumor detection methods focus on learning and integrating various features for detection, but due to the complexity of language, these models rarely consider the relationships between parts of speech. For the first time, this paper integrates knowledge graphs and graph attention networks to solve this problem through attention mechanisms. A knowledge graph can be the most effective and intuitive expression of relationships between entities, providing problem analysis from the perspective of "relationships". This paper uses knowledge graphs to …
7

Han, Wenhao, Xuemei Liu, Jianhao Zhang, and Hairui Li. "Hierarchical Perceptual Graph Attention Network for Knowledge Graph Completion." Electronics 13, no. 4 (2024): 721. http://dx.doi.org/10.3390/electronics13040721.

Abstract:
Knowledge graph completion (KGC), the process of predicting missing knowledge through known triples, is a primary focus of research in the field of knowledge graphs. As an important graph representation technique in deep learning, graph neural networks (GNNs) perform well in knowledge graph completion, but most existing GNN-based knowledge graph completion methods tend to aggregate neighborhood information directly and individually, ignoring the rich hierarchical semantic structure of KGs. As a result, how to effectively deal with multi-level complex relations is still not well …
8

Zhou, Anzhong, and Yifen Li. "Structural attention network for graph." Applied Intelligence 51, no. 8 (2021): 6255–64. http://dx.doi.org/10.1007/s10489-021-02214-8.

9

He, Liancheng, Liang Bai, Xian Yang, Hangyuan Du, and Jiye Liang. "High-order graph attention network." Information Sciences 630 (June 2023): 222–34. http://dx.doi.org/10.1016/j.ins.2023.02.054.

10

Qiu, Jing, Feng Dong, and Guanglu Sun. "Disassemble Byte Sequence Using Graph Attention Network." JUCS - Journal of Universal Computer Science 28, no. 7 (2022): 758–75. http://dx.doi.org/10.3897/jucs.76528.

Abstract:
Disassembly is the basis of static analysis of binary code and is used in malicious code detection, vulnerability mining, software optimization, etc. Disassembly of arbitrary suspicious code blocks (e.g., suspicious traffic packets intercepted from the network) is a difficult task. Traditional disassembly methods require manual specification of the starting address and cannot automate the disassembly of arbitrary code blocks. In this paper, we propose a disassembly method based on a code extension selection network, combining the traditional linear sweep and recursive traversal methods. First, …
12

Liu, Jie, Lingyun Song, Li Gao, and Xuequn Shang. "MMAN: Metapath Based Multi-Level Graph Attention Networks for Heterogeneous Network Embedding (Student Abstract)." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 11 (2022): 13005–6. http://dx.doi.org/10.1609/aaai.v36i11.21639.

Abstract:
Current Heterogeneous Network Embedding (HNE) models can be roughly divided into two types, i.e., relation-aware and metapath-aware models. However, they either fail to represent the non-pairwise relations in a heterogeneous graph or are only capable of capturing local information around the target node. In this paper, we propose metapath-based multi-level graph attention networks (MMAN) to jointly learn node embeddings on two substructures, i.e., metapath-based graphs and hypergraphs extracted from the original heterogeneous graph. Extensive experiments on three benchmark datasets for node classification …
13

Li, Zitong, Xiang Cheng, Lixiao Sun, Ji Zhang, and Bing Chen. "A Hierarchical Approach for Advanced Persistent Threat Detection with Attention-Based Graph Neural Networks." Security and Communication Networks 2021 (May 4, 2021): 1–14. http://dx.doi.org/10.1155/2021/9961342.

Abstract:
Advanced Persistent Threats (APTs) are the most sophisticated attacks against modern information systems. Currently, more and more researchers are beginning to focus on graph-based anomaly detection methods that leverage graph data to model normal behaviors and detect outliers for defending against APTs. However, previous studies of provenance graphs mainly concentrate on system calls, leading to difficulties in modeling network behaviors. Coarse-grained correlation graphs depend on handcrafted graph construction rules and, thus, cannot adequately explore log node attributes. Besides, the traditional …
14

Chen, Lu, Boer Lv, Chi Wang, Su Zhu, Bowen Tan, and Kai Yu. "Schema-Guided Multi-Domain Dialogue State Tracking with Graph Attention Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 7521–28. http://dx.doi.org/10.1609/aaai.v34i05.6250.

Abstract:
Dialogue state tracking (DST) aims at estimating the current dialogue state given all the preceding conversation. For multi-domain DST, the data sparsity problem is also a major obstacle due to the increased number of state candidates. Existing approaches generally predict the value for each slot independently and do not consider slot relations, which may aggravate the data sparsity problem. In this paper, we propose a Schema-guided multi-domain dialogue State Tracker with graph attention networks (SST) that predicts dialogue states from dialogue utterances and schema graphs which contain slot …
15

Wu, Zheng, Hongchang Chen, Jianpeng Zhang, Yulong Pei, and Zishuo Huang. "Temporal motif-based attentional graph convolutional network for dynamic link prediction." Intelligent Data Analysis 27, no. 1 (2023): 241–68. http://dx.doi.org/10.3233/ida-216169.

Abstract:
Dynamic link prediction is an important component of dynamic network analysis with many real-world applications. Currently, most advancements focus on analyzing link-defined neighborhoods with graph convolutional networks (GCNs), while ignoring the influence of higher-order structural and temporal interacting features on link formation. Therefore, based on recent progress in modeling temporal graphs, we propose a novel temporal motif-based attentional graph convolutional network model (TMAGCN) for dynamic link prediction. As dynamic graphs usually contain periodical patterns, we first …
16

Cai, Zengyu, Chunchen Tan, Jianwei Zhang, Liang Zhu, and Yuan Feng. "DBSTGNN-Att: Dual Branch Spatio-Temporal Graph Neural Network with an Attention Mechanism for Cellular Network Traffic Prediction." Applied Sciences 14, no. 5 (2024): 2173. http://dx.doi.org/10.3390/app14052173.

Abstract:
As network technology continues to develop, the popularity of various intelligent terminals has accelerated, leading to a rapid growth in the scale of wireless network traffic. This growth has resulted in significant pressure on resource consumption and network security maintenance. The objective of this paper is to enhance the prediction accuracy of cellular network traffic in order to provide reliable support for the subsequent base station sleep control or the identification of malicious traffic. To achieve this target, a cellular network traffic prediction method based on multi-modal data …
17

Zhang, H., J. J. Zhou, and R. Li. "Enhanced Unsupervised Graph Embedding via Hierarchical Graph Convolution Network." Mathematical Problems in Engineering 2020 (July 26, 2020): 1–9. http://dx.doi.org/10.1155/2020/5702519.

Abstract:
Graph embedding aims to learn low-dimensional representations of the nodes in a network and has received more and more attention in many graph-based tasks recently. The Graph Convolution Network (GCN) is a typical deep semi-supervised graph embedding model which can acquire node representations from complex networks. However, GCN usually needs a lot of labeled data and additional expressive features in the graph embedding learning process, so the model cannot be effectively applied to undirected graphs with only network structure information. In this paper, we propose a novel …
18

Guo, Ruiqiang, Juan Zou, Qianqian Bai, Wei Wang, and Xiaomeng Chang. "Community Detection Fusing Graph Attention Network." Mathematics 10, no. 21 (2022): 4155. http://dx.doi.org/10.3390/math10214155.

Abstract:
It has become common to use a combination of autoencoders and graph neural networks for attributed graph clustering to solve the community detection problem. However, the existing methods do not consider the differences in influence between node neighborhood information and high-order neighborhood information, and the fusion of structural and attribute features is insufficient. In order to make better use of structural and attribute information, we propose a model named community detection fusing graph attention network (CDFG). Specifically, we first use an autoencoder to learn …
19

Hsu, Howard Muchen, Zai-Fu Yao, Kai Hwang, and Shulan Hsieh. "Between-module functional connectivity of the salient ventral attention network and dorsal attention network is associated with motor inhibition." PLOS ONE 15, no. 12 (2020): e0242985. http://dx.doi.org/10.1371/journal.pone.0242985.

Abstract:
The ability to inhibit motor response is crucial for daily activities. However, whether brain networks connecting spatially distinct brain regions can explain individual differences in motor inhibition is not known. Therefore, we took a graph-theoretic perspective to examine the relationship between the properties of topological organization in functional brain networks and motor inhibition. We analyzed data from 141 healthy adults aged 20 to 78, who underwent resting-state functional magnetic resonance imaging and performed a stop-signal task along with neuropsychological assessments outside …
20

Lu, Zhilong, Weifeng Lv, Zhipu Xie, et al. "Graph Sequence Neural Network with an Attention Mechanism for Traffic Speed Prediction." ACM Transactions on Intelligent Systems and Technology 13, no. 2 (2022): 1–24. http://dx.doi.org/10.1145/3470889.

Abstract:
Recent years have witnessed the emerging success of Graph Neural Networks (GNNs) for modeling graphical data. A GNN can model the spatial dependencies of nodes in a graph based on message passing through node aggregation. However, in many application scenarios, these spatial dependencies can change over time, and a basic GNN model cannot capture these changes. In this article, we propose a Graph Sequence neural network with an Attention mechanism (GSeqAtt) for processing graph sequences. More specifically, two attention mechanisms are combined: a horizontal mechanism and a vertical mechanism …
21

Shi, Jianrun, Leiyang Cui, Bo Gu, Bin Lyu, and Shimin Gong. "State Transition Graph-Based Spatial–Temporal Attention Network for Cell-Level Mobile Traffic Prediction." Sensors 23, no. 23 (2023): 9308. http://dx.doi.org/10.3390/s23239308.

Abstract:
Mobile traffic prediction enables the efficient utilization of network resources and enhances user experience. In this paper, we propose a state transition graph-based spatial–temporal attention network (STG-STAN) for cell-level mobile traffic prediction, which is designed to exploit the underlying spatial–temporal dynamic information hidden in the historical mobile traffic data. Specifically, we first identify the semantic context information over different segments of the historical data by constructing the state transition graphs, which may reveal different patterns of random fluctuation. …
22

Shang, Bin, Yinliang Zhao, Jun Liu, and Di Wang. "Mixed Geometry Message and Trainable Convolutional Attention Network for Knowledge Graph Completion." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 8 (2024): 8966–74. http://dx.doi.org/10.1609/aaai.v38i8.28745.

Abstract:
Knowledge graph completion (KGC) aims to study embedding representations to resolve the incompleteness of knowledge graphs (KGs). Recently, graph convolutional networks (GCNs) and graph attention networks (GATs) have been widely used in KGC tasks to capture the neighbor information of entities. However, both GCN- and GAT-based KGC models have their limitations; the best method would be to analyze the neighbors of each entity (pre-validating), but this process is prohibitively expensive. Furthermore, the representation quality of the embeddings can affect the aggregation of neighbor information …
23

Gu, Yafeng, and Li Deng. "STAGCN: Spatial–Temporal Attention Graph Convolution Network for Traffic Forecasting." Mathematics 10, no. 9 (2022): 1599. http://dx.doi.org/10.3390/math10091599.

Abstract:
Traffic forecasting plays an important role in intelligent transportation systems. However, the prediction task is highly challenging due to the mixture of global and local spatiotemporal dependencies involved in traffic data. Existing graph neural networks (GNNs) typically capture spatial dependencies with a predefined or learnable static graph structure, ignoring the hidden dynamic patterns in traffic networks. Meanwhile, most recurrent neural networks (RNNs) or convolutional neural networks (CNNs) cannot effectively capture temporal correlations, especially long-term temporal dependencies …
24

Yang, Xiaowen, Yanghui Wen, Shichao Jiao, Rong Zhao, Xie Han, and Ligang He. "Point Cloud Segmentation Network Based on Attention Mechanism and Dual Graph Convolution." Electronics 12, no. 24 (2023): 4991. http://dx.doi.org/10.3390/electronics12244991.

Abstract:
To overcome the limitations of inadequate local feature representation and the underutilization of global information in dynamic graph convolutions, we propose a network that combines attention mechanisms with dual graph convolutions. Firstly, we construct a static graph based on the dynamic graph using the K-nearest neighbors algorithm and geometric distances of point clouds. This integration of dynamic and static graphs forms a dual graph structure, compensating for the underutilization of geometric positional relationships in the dynamic graph. Next, edge convolutions are applied to extract …
25

Zi, Wenjie, Wei Xiong, Hao Chen, Jun Li, and Ning Jing. "SGA-Net: Self-Constructing Graph Attention Neural Network for Semantic Segmentation of Remote Sensing Images." Remote Sensing 13, no. 21 (2021): 4201. http://dx.doi.org/10.3390/rs13214201.

Abstract:
Semantic segmentation of remote sensing images is always a critical and challenging task. Graph neural networks, which can capture global contextual representations, can exploit long-range pixel dependencies, thereby improving semantic segmentation performance. In this paper, a novel self-constructing graph attention neural network is proposed for this purpose. Firstly, ResNet50 was employed as the backbone of a feature extraction network to acquire feature maps of remote sensing images. Secondly, pixel-wise dependency graphs were constructed from the feature maps of the images, and a graph attention …
26

Bae, Ji-Hun, Gwang-Hyun Yu, Ju-Hwan Lee, et al. "Superpixel Image Classification with Graph Convolutional Neural Networks Based on Learnable Positional Embedding." Applied Sciences 12, no. 18 (2022): 9176. http://dx.doi.org/10.3390/app12189176.

Abstract:
Graph convolutional neural networks (GCNNs) have been successfully applied to a wide range of problems, including low-dimensional Euclidean structural domains representing images, videos, and speech, and high-dimensional non-Euclidean domains such as social networks and chemical molecular structures. However, in computer vision, existing GCNNs are not provided with positional information to distinguish between graphs of new structures; as a result, performance on image classification over arbitrary graphs is significantly poor. In this work, we introduce how to …
27

Yang, Xiaocheng, Mingyu Yan, Shirui Pan, Xiaochun Ye, and Dongrui Fan. "Simple and Efficient Heterogeneous Graph Neural Network." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (2023): 10816–24. http://dx.doi.org/10.1609/aaai.v37i9.26283.

Abstract:
Heterogeneous graph neural networks (HGNNs) have the powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations. Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) designed for homogeneous graphs, especially the attention mechanism and the multi-layer structure. These mechanisms bring excessive complexity, but few works study whether they are really effective on heterogeneous graphs. In this paper, we conduct an in-depth and detailed study of these mechanisms and propose the Simple and Efficient Heterogeneous …
28

Xiao, Wenjuan, and Xiaoming Wang. "Attention Mechanism Based Spatial-Temporal Graph Convolution Network for Traffic Prediction." 電腦學刊 (Journal of Computers) 35, no. 4 (2024): 93–108. http://dx.doi.org/10.53106/199115992024083504007.

Abstract:
Considering the complexity of traffic systems and the challenges brought by various factors in traffic prediction, we propose a spatial-temporal graph convolutional neural network based on an attention mechanism (AMSTGCN) to adapt to these dynamic changes and improve prediction accuracy. The model combines the spatial feature extraction capability of the graph attention network (GAT) and the dynamic correlation learning capability of the attention mechanism. By introducing the attention mechanism, the network can adaptively focus on the dependencies between different time steps and different nodes …
29

Chen, Yong, Xiao-Zhu Xie, Wei Weng, and Yi-Fan He. "Multi-Order-Content-Based Adaptive Graph Attention Network for Graph Node Classification." Symmetry 15, no. 5 (2023): 1036. http://dx.doi.org/10.3390/sym15051036.

Abstract:
In graph-structured data, the node content contains rich information. Therefore, how to effectively utilize the content is crucial to improve the performance of graph convolutional networks (GCNs) on various analytical tasks. However, current GCNs do not fully utilize the content, especially multi-order content. For example, graph attention networks (GATs) only focus on low-order content, while high-order content is completely ignored. To address this issue, we propose a novel graph attention network with adaptability that could fully utilize the features of multi-order content. Its core idea …
30

Qiushi, Sun, He Yang, and Ovanes Petrosian. "Graph Attention Network Enhanced Power Allocation for Wireless Cellular System." Informatics and Automation 23, no. 1 (2024): 259–83. http://dx.doi.org/10.15622/ia.23.1.9.

Abstract:
The importance of an efficient network resource allocation strategy has grown significantly with the rapid advancement of cellular network technology and the widespread use of mobile devices. Efficient resource allocation is crucial for enhancing user services and optimizing network performance. The primary objective is to optimize the power distribution method to maximize the total aggregate rate for all customers within the network. In recent years, graph-based deep learning approaches have shown great promise in addressing the challenge of network resource allocation. Graph neural networks …
31

Catal, Cagatay, Hakan Gunduz, and Alper Ozcan. "Malware Detection Based on Graph Attention Networks for Intelligent Transportation Systems." Electronics 10, no. 20 (2021): 2534. http://dx.doi.org/10.3390/electronics10202534.

Abstract:
Intelligent Transportation Systems (ITS) aim to make transportation smarter, safer, more reliable, and environmentally friendly without detrimentally affecting service quality. ITS can face security issues due to their complex, dynamic, and non-linear properties. One of the most critical security problems is attacks that damage the infrastructure of the entire ITS. Attackers can inject malware code that triggers dangerous actions such as information theft and unwanted system moves. The main objective of this study is to improve the performance of malware detection models using Graph Attention Networks …
32

Lan, Hong, and Qinyi Liu. "Image generation from scene graph with graph attention network." Journal of Image and Graphics 25, no. 8 (2020): 1591–603. http://dx.doi.org/10.11834/jig.190515.

33

Zhou, Xianchen, Yaoyun Zeng, Zepeng Hao, and Hongxia Wang. "A robust graph attention network with dynamic adjusted graph." Engineering Applications of Artificial Intelligence 129 (March 2024): 107619. http://dx.doi.org/10.1016/j.engappai.2023.107619.

34

Zhang, Guoxing, Haixiao Wang, and Yuanpu Yin. "Multi-type Parameter Prediction of Traffic Flow Based on Time-space Attention Graph Convolutional Network." International Journal of Circuits, Systems and Signal Processing 15 (August 11, 2021): 902–12. http://dx.doi.org/10.46300/9106.2021.15.97.

Abstract:
Graph convolutional neural networks are more and more widely used in traffic flow parameter prediction tasks by virtue of their excellent non-Euclidean spatial feature extraction capabilities. However, most graph convolutional neural networks are only used to predict one type of traffic flow parameter, which means that a proposed graph convolutional neural network may only be effective for specific parameters of specific travel modes. In order to improve the universality of graph convolutional neural networks, by embedding a time feature and a spatio-temporal attention layer, we propose a spatio-temporal …
35

Dai, Meifeng, Jie Zhu, Fang Huang, Yin Li, Linhe Zhu, and Weiyi Su. "Coherence Analysis for Iterated Line Graphs of Multi-Subdivision Graph." Fractals 28, no. 04 (2020): 2050067. http://dx.doi.org/10.1142/s0218348x2050067x.

Abstract:
More and more attention has focused on the consensus problem in the study of complex networks. Many researchers have investigated consensus dynamics in linear dynamical systems with additive stochastic disturbances. In this paper, we construct iterated line graphs of a multi-subdivision graph by applying the multi-subdivided-line graph operation. It has been proven that network coherence can be characterized by the Laplacian spectrum of the network. We study the recursion formula for the Laplacian eigenvalues of the graphs. After that, we obtain the scalings of the first- and second-order network coherence.
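The abstract's statement that network coherence is characterized by the Laplacian spectrum refers to the standard result for first-order consensus under additive noise, H1 = (1/(2N)) Σ 1/λ_i over the nonzero Laplacian eigenvalues. A small NumPy check on a path graph (illustrative, not tied to the paper's iterated line graphs):

```python
import numpy as np

def first_order_coherence(A):
    """First-order network coherence of a connected undirected graph:
    H1 = (1 / (2N)) * sum_{i>=2} 1 / lambda_i, where 0 = lambda_1 <
    lambda_2 <= ... are the eigenvalues of the Laplacian L = D - A."""
    L = np.diag(A.sum(axis=1)) - A
    lam = np.linalg.eigvalsh(L)          # ascending; lam[0] is (numerically) 0
    return np.sum(1.0 / lam[1:]) / (2 * len(A))

# Path graph P3: Laplacian eigenvalues are 0, 1, 3, so H1 = (1 + 1/3) / 6 = 2/9
A_path = np.array([[0, 1, 0],
                   [1, 0, 1],
                   [0, 1, 0]], dtype=float)
```

Lower H1 means the network stays closer to consensus under noise, which is why the scaling of these eigenvalue sums with network size is the quantity of interest.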
36

Cui, Wanqiu, Junping Du, Dawei Wang, Feifei Kou, and Zhe Xue. "MVGAN: Multi-View Graph Attention Network for Social Event Detection." ACM Transactions on Intelligent Systems and Technology 12, no. 3 (2021): 1–24. http://dx.doi.org/10.1145/3447270.

Abstract:
Social networks are critical sources for event detection thanks to the characteristics of publicity and dissemination. Unfortunately, the randomness and semantic sparsity of the social network text bring significant challenges to the event detection task. In addition to text, time is another vital element in reflecting events, since events are often followed for a while. Therefore, in this article, we propose a novel method named Multi-View Graph Attention Network (MVGAN) for event detection in social networks. It enriches event semantics through both neighbor aggregation and multi-view fusion …
37

Chen, Zhao. "Graph Adaptive Attention Network with Cross-Entropy." Entropy 26, no. 7 (2024): 576. http://dx.doi.org/10.3390/e26070576.

Abstract:
Non-Euclidean data, such as social networks and citation relationships between documents, have node and structural information. The Graph Convolutional Network (GCN) can automatically learn node features and association information between nodes. The core idea of the Graph Convolutional Network is to aggregate node information by using edge information, thereby generating new node features. In updating node features, there are two core influencing factors: one is the number of neighboring nodes of the central node; the other is the contribution of the neighboring nodes to the central node …
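The two influencing factors this abstract names, neighbor count and per-neighbor contribution, both appear directly in the symmetric degree normalization of the standard GCN propagation rule. A minimal NumPy sketch of one such step (illustrative, not the paper's model):

```python
import numpy as np

def gcn_layer(H, A, W):
    """One GCN propagation step: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).
    The 1/sqrt(d_i * d_j) factor scales each neighbor's contribution down
    as either endpoint's neighborhood grows."""
    A_hat = A + np.eye(len(A))                    # self-loops keep own features
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)        # ReLU activation
```

With no edges at all, `A_hat` reduces to the identity and the layer degenerates to a plain per-node transformation, which makes the role of the aggregation term easy to see.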
38

Diao, Qi, Yaping Dai, Jiacheng Wang, Xiaoxue Feng, Feng Pan, and Ce Zhang. "Spatial-Pooling-Based Graph Attention U-Net for Hyperspectral Image Classification." Remote Sensing 16, no. 6 (2024): 937. http://dx.doi.org/10.3390/rs16060937.

Full text
Abstract:
In recent years, graph convolutional networks (GCNs) have attracted increasing attention in hyperspectral image (HSI) classification owing to their exceptional representation capabilities. However, the high computational requirements of GCNs have led most existing GCN-based HSI classification methods to utilize superpixels as graph nodes, thereby limiting the spatial topology scale and neglecting pixel-level spectral–spatial features. To address these limitations, we propose a novel HSI classification network based on graph convolution called the spatial-pooling-based graph attention U-net (SP
APA, Harvard, Vancouver, ISO, and other styles
39

Wang, Ruiheng, Hongliang Zhu, Lu Wang, Zhaoyun Chen, Mingcheng Gao, and Yang Xin. "User Identity Linkage Across Social Networks by Heterogeneous Graph Attention Network Modeling." Applied Sciences 10, no. 16 (2020): 5478. http://dx.doi.org/10.3390/app10165478.

Full text
Abstract:
Today, social networks are increasingly popular and indispensable, and users usually hold multiple accounts, so conducting user identity linkage across social networks is of considerable significance. Through user identity linkage, we can comprehensively depict the diversified characteristics of user behaviors, accurately model user profiles, make recommendations across social networks, and track user behaviors across social networks. Existing works mainly focus on a specific type of user profile, user-generated content, and structural information. They have problems of weak data expression
APA, Harvard, Vancouver, ISO, and other styles
40

Zhang, Dehai, Xiaobo Yang, Linan Liu, and Qing Liu. "A Knowledge Graph-Enhanced Attention Aggregation Network for Making Recommendations." Applied Sciences 11, no. 21 (2021): 10432. http://dx.doi.org/10.3390/app112110432.

Full text
Abstract:
In recent years, many researchers have devoted time to designing algorithms used to introduce external information from knowledge graphs, to solve the problems of data sparseness and the cold start, and thus improve the performance of recommendation systems. Inspired by these studies, we proposed KANR, a knowledge graph-enhanced attention aggregation network for making recommendations. This is an end-to-end deep learning model using knowledge graph embedding to enhance the attention aggregation network for making recommendations. It consists of three main parts. The first is the attention aggr
APA, Harvard, Vancouver, ISO, and other styles
41

Wang, Yang, Lifeng Yin, Xiaolong Wang, Guanghai Zheng, and Wu Deng. "A Novel Two-Channel Classification Approach Using Graph Attention Network with K-Nearest Neighbor." Electronics 13, no. 20 (2024): 3985. http://dx.doi.org/10.3390/electronics13203985.

Full text
Abstract:
Graph neural networks (GNNs) typically exhibit superior performance in shallow architectures. However, as the network depth increases, issues such as overfitting and oversmoothing of hidden vector representations arise, significantly diminishing model performance. To address these challenges, this paper proposes a Two-Channel Classification Algorithm Based on Graph Attention Network (TCC_GAT). Initially, nodes exhibiting similar interaction behaviors are identified through cosine similarity, thereby enhancing the foundational graph structure. Subsequently, an attention mechanism is employed to
APA, Harvard, Vancouver, ISO, and other styles
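The first step the abstract describes — linking nodes with similar behavior via cosine similarity to enhance the base graph — can be sketched as a k-nearest-neighbor graph over node features. This is an illustrative sketch under assumed inputs, not the authors' implementation.

```python
import numpy as np

def knn_cosine_edges(X, k=2):
    """Link each node to its k most cosine-similar nodes.
    X: (n, d) node feature matrix. Returns a symmetric adjacency matrix."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    S = Xn @ Xn.T                    # pairwise cosine similarities
    np.fill_diagonal(S, -np.inf)     # exclude self-edges
    A = np.zeros_like(S)
    for i in range(len(S)):
        for j in np.argsort(S[i])[-k:]:  # k most similar neighbours of i
            A[i, j] = A[j, i] = 1.0
    return A
```

The resulting adjacency matrix would then serve as the enhanced graph structure on which the attention layers operate.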
42

Han, Xiaotian, Kaixiong Zhou, Ting-Hsiang Wang, Jundong Li, Fei Wang, and Na Zou. "Marginal Nodes Matter: Towards Structure Fairness in Graphs." ACM SIGKDD Explorations Newsletter 25, no. 2 (2024): 4–13. http://dx.doi.org/10.1145/3655103.3655105.

Full text
Abstract:
In a social network, a person located in the periphery region (a marginal node) is likely to be treated unfairly compared with persons at the center. While existing fairness works on graphs mainly focus on protecting sensitive attributes (e.g., age and gender), the unfairness incurred by the graph structure should also be given attention. On the other hand, the information aggregation mechanism of graph neural networks amplifies such structural unfairness, as marginal nodes are often far away from other nodes. In this paper, we focus on novel fairness incurred by the graph structure on graph
APA, Harvard, Vancouver, ISO, and other styles
43

Wang, Yuzhuo, Hongzhi Wang, Wenbo Lu, and Yu Yan. "HyGGE: Hyperbolic graph attention network for reasoning over knowledge graphs." Information Sciences 630 (June 2023): 190–205. http://dx.doi.org/10.1016/j.ins.2023.02.050.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

He, Silu, Qinyao Luo, Xinsha Fu, Ling Zhao, Ronghua Du, and Haifeng Li. "CAT: A causal graph attention network for trimming heterophilic graphs." Information Sciences 677 (August 2024): 120916. http://dx.doi.org/10.1016/j.ins.2024.120916.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Chatzianastasis, Michail, Johannes Lutzeyer, George Dasoulas, and Michalis Vazirgiannis. "Graph Ordering Attention Networks." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (2023): 7006–14. http://dx.doi.org/10.1609/aaai.v37i6.25856.

Full text
Abstract:
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data, achieving state-of-the-art performance. GNNs typically employ a message-passing scheme, in which every node aggregates information from its neighbors using a permutation-invariant aggregation function. Standard well-examined choices such as the mean or sum aggregation functions have limited capabilities, as they are not able to capture interactions among neighbors. In this work, we formalize these interactions using an information-theoretic framework that notably includes synergistic info
APA, Harvard, Vancouver, ISO, and other styles
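The limitation the abstract points to — that mean or sum aggregation cannot capture interactions among neighbors — can be demonstrated with a toy check (an illustrative example, not from the cited paper): two distinct neighbor multisets collapse to the same aggregate.

```python
# Mean aggregation is permutation-invariant but blind to neighbour
# interactions: different neighbour multisets can yield identical aggregates.
def mean_agg(feats):
    return sum(feats) / len(feats)

neighbours_a = [0.0, 2.0]  # two very different neighbours
neighbours_b = [1.0, 1.0]  # two identical neighbours
print(mean_agg(neighbours_a) == mean_agg(neighbours_b))  # True: both equal 1.0
```

Ordering-aware aggregators, as proposed in the paper, aim to break exactly this kind of collision by conditioning on a learned neighbor ordering.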
46

Li, Jinke, Jiahui Hu, Yue Wu, and Xiaoyan Yang. "Pre-Routing Slack Prediction Based on Graph Attention Network." Automation 6, no. 2 (2025): 20. https://doi.org/10.3390/automation6020020.

Full text
Abstract:
Static Timing Analysis (STA) plays a crucial role in realizing timing convergence of integrated circuits. In recent years, there has been growing research on pre-routing timing prediction using Graph Neural Networks (GNNs). However, existing approaches struggle with scalability on large graphs and lack generalizability to new designs, limiting their applicability to large-scale, complex circuit problems. To address this issue, this paper proposes a timing engine based on Graph Attention Network (GAT) to predict the slack of timing endpoints. Firstly, our model computes net embeddings for each
APA, Harvard, Vancouver, ISO, and other styles
47

Boronina, Anna, Vladimir Maksimenko, and Alexander E. Hramov. "Convolutional Neural Network Outperforms Graph Neural Network on the Spatially Variant Graph Data." Mathematics 11, no. 11 (2023): 2515. http://dx.doi.org/10.3390/math11112515.

Full text
Abstract:
Applying machine learning algorithms to graph-structured data has garnered significant attention in recent years due to the prevalence of inherent graph structures in real-life datasets. However, the direct application of traditional deep learning algorithms, such as Convolutional Neural Networks (CNNs), is limited as they are designed for regular Euclidean data like 2D grids and 1D sequences. In contrast, graph-structured data are in a non-Euclidean form. Graph Neural Networks (GNNs) are specifically designed to handle non-Euclidean data and make predictions based on connectivity rather than
APA, Harvard, Vancouver, ISO, and other styles
48

Sun, Li, Zhongbao Zhang, Junda Ye, et al. "A Self-Supervised Mixed-Curvature Graph Neural Network." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 4 (2022): 4146–55. http://dx.doi.org/10.1609/aaai.v36i4.20333.

Full text
Abstract:
Graph representation learning has received increasing attention in recent years. Most existing methods ignore the complexity of graph structures and restrict graphs to a single constant-curvature representation space, which is suitable only for particular kinds of graph structure. Additionally, these methods follow the supervised or semi-supervised learning paradigm, which notably limits their deployment on the unlabeled graphs found in real applications. To address these limitations, we make a first attempt to study self-supervised graph representation learni
APA, Harvard, Vancouver, ISO, and other styles
49

Mheich, Ahmad, Fabrice Wendling, and Mahmoud Hassan. "Brain network similarity: methods and applications." Network Neuroscience 4, no. 3 (2020): 507–27. http://dx.doi.org/10.1162/netn_a_00133.

Full text
Abstract:
The graph-theoretical approach has proved to be an effective tool for understanding, characterizing, and quantifying the complex brain network. However, much less attention has been paid to methods that quantitatively compare two graphs, a crucial issue in the context of brain networks. Comparing brain networks is indeed mandatory in several network neuroscience applications. Here, we discuss the current state of the art, the challenges, and a collection of analysis tools developed in recent years to compare brain networks. We first introduce the graph similarity problem in brain network applications.
APA, Harvard, Vancouver, ISO, and other styles
50

Imani, Maryam, and Daniele Cerra. "Triple Graph Convolutional Network for Hyperspectral Image Feature Fusion and Classification." Remote Sensing 17, no. 9 (2025): 1623. https://doi.org/10.3390/rs17091623.

Full text
Abstract:
Most graph-based networks utilize superpixel generation methods as a preprocessing step, considering superpixels as graph nodes. In the case of hyperspectral images having high variability in spectral features, considering an image region as a graph node may degrade the class discrimination ability of networks for pixel-based classification. Moreover, most graph-based networks focus on global feature extraction, while both local and global information are important for pixel-based classification. To deal with these challenges, superpixel-based graphs are overruled in this work, and a Graph-bas
APA, Harvard, Vancouver, ISO, and other styles