Academic literature on the topic 'Graph attention network'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Graph attention network.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Graph attention network"

1

Wu, Nan, and Chaofan Wang. "Ensemble Graph Attention Networks." Transactions on Machine Learning and Artificial Intelligence 10, no. 3 (2022): 29–41. http://dx.doi.org/10.14738/tmlai.103.12399.

Full text
Abstract:
Graph neural networks have demonstrated their success in many applications on graph-structured data. Many efforts have been devoted to elaborating new network architectures and learning algorithms over the past decade. However, the exploration of applying ensemble learning techniques to enhance existing graph algorithms has been overlooked. In this work, we propose a simple, generic bagging-based ensemble learning strategy that is applicable to any backbone graph model. We then propose two ensemble graph neural network models – Ensemble-GAT and Ensemble-HetGAT – by applying the ensemble strategy to the g
APA, Harvard, Vancouver, ISO, and other styles
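The bagging-based ensemble strategy described in the abstract above can be sketched in a few lines of Python. This is an illustrative sketch only, not the paper's Ensemble-GAT or Ensemble-HetGAT implementation; `make_model`, `train`, and `predict_proba` are hypothetical callables that the reader would supply for their own backbone GNN and dataset.

```python
# Illustrative sketch of a bagging ensemble around an arbitrary node-classification
# GNN (NOT the paper's Ensemble-GAT). The three callables passed in are
# hypothetical placeholders for a user-supplied backbone model and training loop.
import torch

def bagging_ensemble(make_model, train, predict_proba, data, train_idx, n_models=5):
    """Train n_models on bootstrap resamples of the training nodes and
    average their predicted class distributions (soft voting)."""
    probs = []
    for _ in range(n_models):
        # Bagging step: sample training nodes with replacement.
        boot = train_idx[torch.randint(len(train_idx), (len(train_idx),))]
        model = train(make_model(), data, boot)
        probs.append(predict_proba(model, data))
    # Average the per-model class distributions over the ensemble.
    return torch.stack(probs).mean(dim=0)
```

Soft voting over bootstrap-trained base models is the classic bagging recipe and is agnostic to the choice of backbone graph model, which is what makes such a strategy generic.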
2

Wang, Bin, Yu Chen, Jinfang Sheng, and Zhengkun He. "Attributed Graph Embedding Based on Attention with Cluster." Mathematics 10, no. 23 (2022): 4563. http://dx.doi.org/10.3390/math10234563.

Full text
Abstract:
Graph embedding is of great significance for the research and analysis of graphs. Graph embedding aims to map nodes in the network to low-dimensional vectors while preserving the information of nodes in the original graph. In recent years, the appearance of graph neural networks has significantly improved the accuracy of graph embedding. However, the influence of clusters was not considered in existing graph neural network (GNN)-based methods, so this paper proposes a new method to incorporate the influence of clusters into the generation of graph embeddings. We use the attention mechanism to pass
APA, Harvard, Vancouver, ISO, and other styles
3

Li, Yu, Yuan Tian, Jiawei Zhang, and Yi Chang. "Learning Signed Network Embedding via Graph Attention." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 4772–79. http://dx.doi.org/10.1609/aaai.v34i04.5911.

Full text
Abstract:
Learning the low-dimensional representations of graphs (i.e., network embedding) plays a critical role in network analysis and facilitates many downstream tasks. Recently, graph convolutional networks (GCNs) have revolutionized the field of network embedding and led to state-of-the-art performance in network analysis tasks such as link prediction and node classification. Nevertheless, most of the existing GCN-based network embedding methods are proposed for unsigned networks. However, in the real world, some networks are signed, where the links are annotated with different polarities, e
APA, Harvard, Vancouver, ISO, and other styles
4

Murzin, M. V., I. A. Kulikov, and N. A. Zhukova. "Methods for Constructing Graph Neural Networks." LETI Transactions on Electrical Engineering & Computer Science 17, no. 10 (2024): 40–48. https://doi.org/10.32603/2071-8985-2024-17-10-40-48.

Full text
Abstract:
The paper discusses an approach to classifying graph neural networks in terms of basic concepts. In addition, the fundamentals of convolutional graph neural networks, graph attention neural networks, recurrent graph neural networks, graph autoencoders, and spatial-temporal graph neural networks are discussed. Using the Cora dataset as an example, a comparison is carried out between the neural network models provided in the TensorFlow and PyTorch libraries and a graph attention network model for the task of node classification on a knowledge graph. The efficiency of using graph attention
APA, Harvard, Vancouver, ISO, and other styles
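The Cora node-classification comparison mentioned in the abstract above can be reproduced in spirit with a minimal sketch, assuming PyTorch Geometric is installed; the two-layer architecture and hyperparameters below follow the commonly used GAT setup for Cora and are not taken from the paper.

```python
# Minimal sketch: node classification on the Cora citation graph with a
# two-layer graph attention network (GAT), assuming PyTorch Geometric.
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GATConv

dataset = Planetoid(root="/tmp/Cora", name="Cora")
data = dataset[0]

class GAT(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # 8 attention heads in the hidden layer, 1 head in the output layer,
        # following the hyperparameters commonly used for Cora.
        self.conv1 = GATConv(dataset.num_features, 8, heads=8, dropout=0.6)
        self.conv2 = GATConv(8 * 8, dataset.num_classes, heads=1, dropout=0.6)

    def forward(self, x, edge_index):
        x = F.dropout(x, p=0.6, training=self.training)
        x = F.elu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.6, training=self.training)
        return self.conv2(x, edge_index)

model = GAT()
optimizer = torch.optim.Adam(model.parameters(), lr=0.005, weight_decay=5e-4)

for epoch in range(200):
    model.train()
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()

model.eval()
pred = model(data.x, data.edge_index).argmax(dim=-1)
test_acc = (pred[data.test_mask] == data.y[data.test_mask]).float().mean()
print(f"Test accuracy: {test_acc:.3f}")
```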
5

Sheng, Jinfang, Yufeng Zhang, Bin Wang, and Yaoxing Chang. "MGATs: Motif-Based Graph Attention Networks." Mathematics 12, no. 2 (2024): 293. http://dx.doi.org/10.3390/math12020293.

Full text
Abstract:
In recent years, graph convolutional neural networks (GCNs) have become a popular research topic due to their outstanding performance in various complex network data mining tasks. However, current research on graph neural networks lacks understanding of the high-order structural features of networks, focusing mostly on node features and first-order neighbor features. This article proposes two new models, MGAT and MGATv2, by introducing high-order structure motifs that frequently appear in networks and combining them with graph attention mechanisms. By introducing a mixed information matrix bas
APA, Harvard, Vancouver, ISO, and other styles
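As a minimal sketch of the kind of high-order motif signal such models build on, the snippet below counts, for each edge, the triangles it participates in. This is only an illustration of motif counting with NumPy (the helper `triangle_motif_matrix` is our own name); how MGAT and MGATv2 actually combine such information with graph attention is described in the paper itself.

```python
# Triangle-motif weight matrix M, where M[i, j] counts the triangles that
# edge (i, j) participates in. Illustration only, not the MGAT construction.
import numpy as np

def triangle_motif_matrix(A: np.ndarray) -> np.ndarray:
    """A is a symmetric 0/1 adjacency matrix without self-loops."""
    # (A @ A)[i, j] counts length-2 paths i -> k -> j; restricting to existing
    # edges (element-wise product with A) counts triangles through edge (i, j).
    return A * (A @ A)

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]])
print(triangle_motif_matrix(A))
# Edge (0, 1) lies in one triangle {0, 1, 2}, so M[0, 1] == 1.
```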
6

Wang, Wanru, Yuwei Lv, Yonggang Wen, and Xuemei Sun. "Rumor Detection Based on Knowledge Enhancement and Graph Attention Network." Discrete Dynamics in Nature and Society 2022 (October 6, 2022): 1–12. http://dx.doi.org/10.1155/2022/6257658.

Full text
Abstract:
Presently, most existing rumor detection methods focus on learning and integrating various features for detection, but due to the complexity of language, these models rarely consider the relationships between parts of speech. For the first time, this paper integrates knowledge graphs and graph attention networks to solve this problem through attention mechanisms. A knowledge graph can be the most effective and intuitive expression of relationships between entities, providing problem analysis from the perspective of “relationships”. This paper used knowledge graphs to enh
APA, Harvard, Vancouver, ISO, and other styles
7

Han, Wenhao, Xuemei Liu, Jianhao Zhang, and Hairui Li. "Hierarchical Perceptual Graph Attention Network for Knowledge Graph Completion." Electronics 13, no. 4 (2024): 721. http://dx.doi.org/10.3390/electronics13040721.

Full text
Abstract:
Knowledge graph completion (KGC), the process of predicting missing knowledge through known triples, is a primary focus of research in the field of knowledge graphs. As an important graph representation technique in deep learning, graph neural networks (GNNs) perform well in knowledge graph completion, but most existing graph neural network-based knowledge graph completion methods tend to aggregate neighborhood information directly and individually, ignoring the rich hierarchical semantic structure of KGs. As a result, how to effectively deal with multi-level complex relations is still not wel
APA, Harvard, Vancouver, ISO, and other styles
8

Zhou, Anzhong, and Yifen Li. "Structural attention network for graph." Applied Intelligence 51, no. 8 (2021): 6255–64. http://dx.doi.org/10.1007/s10489-021-02214-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

He, Liancheng, Liang Bai, Xian Yang, Hangyuan Du, and Jiye Liang. "High-order graph attention network." Information Sciences 630 (June 2023): 222–34. http://dx.doi.org/10.1016/j.ins.2023.02.054.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Qiu, Jing, Feng Dong, and Guanglu Sun. "Disassemble Byte Sequence Using Graph Attention Network." JUCS - Journal of Universal Computer Science 28, no. 7 (2022): 758–75. http://dx.doi.org/10.3897/jucs.76528.

Full text
Abstract:
Disassembly is the basis of static analysis of binary code and is used in malicious code detection, vulnerability mining, software optimization, etc. Disassembly of arbitrary suspicious code blocks (e.g., for suspicious traffic packets intercepted by the network) is a difficult task. Traditional disassembly methods require manual specification of the starting address and cannot automate the disassembly of arbitrary code blocks. In this paper, we propose a disassembly method based on code extension selection network by combining traditional linear sweep and recursive traversal methods. First, e
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Graph attention network"

1

Lee, John Boaz T. "Deep Learning on Graph-structured Data." Digital WPI, 2019. https://digitalcommons.wpi.edu/etd-dissertations/570.

Full text
Abstract:
In recent years, deep learning has made a significant impact in various fields – helping to push the state-of-the-art forward in many application domains. Convolutional Neural Networks (CNN) have been applied successfully to tasks such as visual object detection, image super-resolution, and video action recognition while Long Short-term Memory (LSTM) and Transformer networks have been used to solve a variety of challenging tasks in natural language processing. However, these popular deep learning architectures (i.e., CNNs, LSTMs, and Transformers) can only handle data that can be represented a
APA, Harvard, Vancouver, ISO, and other styles
2

You, Di. "Attributed Multi-Relational Attention Network for Fact-checking URL Recommendation." Digital WPI, 2019. https://digitalcommons.wpi.edu/etd-theses/1321.

Full text
Abstract:
To combat fake news, researchers have mostly focused on detecting fake news, while journalists have built and maintained fact-checking sites (e.g., Snopes.com and Politifact.com). However, fake news dissemination has been greatly promoted by social media sites, and these fact-checking sites have not been fully utilized. To overcome these problems and complement existing methods against fake news, in this thesis, we propose a deep-learning based fact-checking URL recommender system to mitigate the impact of fake news in social media sites such as Twitter and Facebook. In particular, our proposed framework consi
APA, Harvard, Vancouver, ISO, and other styles
3

Dronzeková, Michaela. "Analýza polygonálních modelů pomocí neuronových sítí." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2020. http://www.nusl.cz/ntk/nusl-417253.

Full text
Abstract:
This thesis deals with rotation estimation of a 3D model of a human jaw. It describes and compares methods for direct analysis of 3D models as well as a method to analyze the model using rasterization. To evaluate the performance of the proposed method, a metric that computes the number of cases in which the prediction was less than 30° from the ground truth is used. The proposed method that uses rasterization takes three X-ray views of the model as input and processes them with a convolutional network. It achieves the best performance, 99%, with the described metric. The method to directly analyze a polygonal model as a sequence uses attention mech
APA, Harvard, Vancouver, ISO, and other styles
4

Blini, Elvio A. "Biases in Visuo-Spatial Attention: from Assessment to Experimental Induction." Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424480.

Full text
Abstract:
In this work I present several studies, which might appear rather heterogeneous for both experimental questions and methodological approaches, and yet are linked by a common leitmotiv: spatial attention. I will address issues related to the assessment of attentional asymmetries, in the healthy individual as in patients with neurological disorders, their role in various aspects of human cognition, and their neural underpinning, driven by the deep belief that spatial attention plays an important role in various mental processes that are not necessarily confined to perception. What follows is or
APA, Harvard, Vancouver, ISO, and other styles
5

Guo, Dalu. "Attention Networks in Visual Question Answering and Visual Dialog." Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/25079.

Full text
Abstract:
Attention is a substantial mechanism for humans to process massive amounts of data. It omits the trivial parts and focuses on the important ones. For example, we only need to remember the keywords in a long sentence and the principal objects in an image to rebuild the sources. Therefore, it is crucial to build an attention network for artificial intelligence to solve problems as humans do. This mechanism has been fully explored in text-based tasks, such as language translation, reading comprehension, and sentiment analysis, as well as in visual-based tasks, such as image recognition, object d
APA, Harvard, Vancouver, ISO, and other styles
6

Chatzianastasis, Michail. "Advancements in Graph Representation Learning and Applications in Computational Biology." Electronic Thesis or Diss., Institut polytechnique de Paris, 2025. http://www.theses.fr/2025IPPAX017.

Full text
Abstract:
Graphs provide a natural and flexible framework for modeling the relationships and interactions between entities in various domains, such as social networks, chemistry, and biology. This has sparked strong interest in graph representation learning, which aims to extract informative representations and make predictions on graph-structured data. However, the inherently non-Euclidean nature of graphs poses significant challenges for classical machine learning models. Graph neural networks (GNNs) address these challenges
APA, Harvard, Vancouver, ISO, and other styles
7

Belhadj, Djedjiga. "Multi-GAT semi-supervisé pour l’extraction d’informations et son adaptation au chiffrement homomorphe." Electronic Thesis or Diss., Université de Lorraine, 2024. http://www.theses.fr/2024LORR0023.

Full text
Abstract:
This thesis was carried out within the framework of the BPI DeepTech project, in collaboration with the company Fair&Smart, with a primary focus on the protection of personal data in accordance with the General Data Protection Regulation (GDPR). In this context, we proposed a deep neural model for information extraction from semi-structured administrative documents (DSSs). Due to the lack of public training data, we proposed an artificial DSS generator that can generate several classes of documents with wide variation in content and
APA, Harvard, Vancouver, ISO, and other styles
8

Gullstrand, Mattias, and Stefan Maraš. "Using Graph Neural Networks for Track Classification and Time Determination of Primary Vertices in the ATLAS Experiment." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-288505.

Full text
Abstract:
Starting in 2027, the high-luminosity Large Hadron Collider (HL-LHC) will begin operation and allow higher-precision measurements and searches for new physics processes between elementary particles. One central problem that arises in the ATLAS detector when reconstructing event information is to separate the rare and interesting hard scatter (HS) interactions from uninteresting pileup (PU) interactions in a spatially compact environment. This problem becomes even harder to solve at higher luminosities. This project relies on leveraging the time dimension and determining a time of the HS intera
APA, Harvard, Vancouver, ISO, and other styles
9

Breckel, Thomas P. K. [Verfasser], Christiane [Akademischer Betreuer] Thiel, and Stefan [Akademischer Betreuer] Debener. "Insights into brain networks from functional MRI and graph analysis during and following attentional demand / Thomas P. K. Breckel. Betreuer: Christiane Thiel ; Stefan Debener." Oldenburg : BIS der Universität Oldenburg, 2013. http://d-nb.info/1050299434/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Breckel, Thomas [Verfasser], Christiane [Akademischer Betreuer] Thiel, and Stefan [Akademischer Betreuer] Debener. "Insights into brain networks from functional MRI and graph analysis during and following attentional demand / Thomas P. K. Breckel. Betreuer: Christiane Thiel ; Stefan Debener." Oldenburg : BIS der Universität Oldenburg, 2013. http://nbn-resolving.de/urn:nbn:de:gbv:715-oops-15262.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Graph attention network"

1

Bianconi, Ginestra. Synchronization, Non-linear Dynamics and Control. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198753919.003.0015.

Full text
Abstract:
This chapter is entirely devoted to characterizing non-linear dynamics on multilayer networks. Special attention is given to recent results on the stability of synchronization that extend the Master Stability Function approach to the multilayer network scenario. Discontinuous synchronization transitions on multiplex networks recently reported in the literature are also discussed, together with their application in the context of brain networks. This chapter also presents an overview of the major results regarding pattern formation in multilayer networks, and the proposed characterization of mu
APA, Harvard, Vancouver, ISO, and other styles
2

Dorogovtsev, Sergey N., and José F. F. Mendes. The Nature of Complex Networks. Oxford University PressOxford, 2022. http://dx.doi.org/10.1093/oso/9780199695119.001.0001.

Full text
Abstract:
Researchers studying complex networks will acquire from this advanced modern book a number of new issues and ideas not yet touched upon in other reference volumes. The book considers a wide range of networks and processes taking place on them, paying particular attention to recently developed directions, methods, and techniques. It proposes a statistical mechanics view of random networks based on the concept of statistical ensembles, but approaches and methods of modern graph theory, concerning random graphs, overlap strongly with statistical physics. Hence mathematicians hav
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Graph attention network"

1

Zhang, Xueya, Tong Zhang, Wenting Zhao, Zhen Cui, and Jian Yang. "Dual-Attention Graph Convolutional Network." In Lecture Notes in Computer Science. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41299-9_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Teufel, Jonas, Luca Torresi, Patrick Reiser, and Pascal Friederich. "MEGAN: Multi-explanation Graph Attention Network." In Communications in Computer and Information Science. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-44067-0_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Long, Yunfei, Huosheng Xu, Pengyuan Qi, Liguo Zhang, and Jun Li. "Graph Attention Network for Word Embeddings." In Lecture Notes in Computer Science. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78612-0_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Wang, Ziming, Jun Chen, and Haopeng Chen. "EGAT: Edge-Featured Graph Attention Network." In Lecture Notes in Computer Science. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-86362-3_21.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Gaudel, Bijay, Donghai Guan, Weiwei Yuan, Deepanjal Shrestha, Bing Chen, and Yaofeng Tu. "Graph Representation Learning Using Attention Network." In Big Data. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-0705-9_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Wang, Hui, Peng Zhou, and Junbo Ma. "Path Integration Enhanced Graph Attention Network." In Advanced Data Mining and Applications. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-46674-8_22.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Gong, Haoran, Zhuojun An, Jialong Mou, Jianjun Cheng, and Li Liu. "KAGAT: Kolmogorov-Arnold Graph Attention Network." In Communications in Computer and Information Science. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-9946-9_22.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Guo, Yanli, and Zhongmin Yan. "Collaborative Filtering: Graph Neural Network with Attention." In Web Information Systems and Applications. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60029-7_39.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Zhu, Xingwei, Pengpeng Zhao, Jiajie Xu, et al. "Knowledge Graph Attention Network Enhanced Sequential Recommendation." In Web and Big Data. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60259-8_15.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Li, Shaohua, Weimin Li, Alex Munyole Luvembe, and Weiqin Tong. "Graph Contrastive ATtention Network for Rumor Detection." In Communications in Computer and Information Science. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-8178-6_20.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Graph attention network"

1

Nur Pawestri, Syifa, Hasmawati, and Said Al Faraby. "Question Classification Using Graph Attention Network." In 2024 International Conference on Data Science and Its Applications (ICoDSA). IEEE, 2024. http://dx.doi.org/10.1109/icodsa62899.2024.10652095.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jia, Kai, and Yangping Wang. "Knowledge Graph Completion Method Based on Graph Contrastive attention Network." In 2024 5th International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI). IEEE, 2024. https://doi.org/10.1109/ichci63580.2024.10807984.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Zhao, Chenting, Xin Li, Sai Lv, Qingsong Li, and Bin Kang. "Multi-level Aggregation Heterogeneous Graph Attention Network." In 2024 43rd Chinese Control Conference (CCC). IEEE, 2024. http://dx.doi.org/10.23919/ccc63176.2024.10661928.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Kieu, Hai-Dang. "Graph Attention Network for Motor Imagery Classification." In 2024 RIVF International Conference on Computing and Communication Technologies (RIVF). IEEE, 2024. https://doi.org/10.1109/rivf64335.2024.11009062.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Zhang, Zhongqiang, Fanyang Meng, Ye Wang, Chen Mao, Qi Qiu, and Guangming Shi. "Hyperspectral Image Classification with Graph Attention Network." In 2025 6th International Conference on Geology, Mapping and Remote Sensing (ICGMRS). IEEE, 2025. https://doi.org/10.1109/icgmrs66001.2025.11065111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Fadia, Love, Vatsal Shah, Mohammad Hassanzadeh, Majid Ahmadi, and Jonathan Wu. "Graph Attention Network and Graph Convolutional Network for Classification of Dengue Virus Variants." In 2025 3rd International Conference on Advancement in Computation & Computer Technologies (InCACCT). IEEE, 2025. https://doi.org/10.1109/incacct65424.2025.11011441.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Li, Yong, Jie Yu, Yang Yu, and Zexing Liu. "Knowledge Graph Completion for Industrial Robots Based on Graph Attention Network." In 2024 China Automation Congress (CAC). IEEE, 2024. https://doi.org/10.1109/cac63892.2024.10864989.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Xu, Feifei, Yumeng Zhang, and Yifei Li. "Graph Neural Network Algorithm Based on Graph Convolution and Attention Mechanism." In 2025 IEEE 5th International Conference on Power, Electronics and Computer Applications (ICPECA). IEEE, 2025. https://doi.org/10.1109/icpeca63937.2025.10928730.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Chen, Tianchao. "Heterogeneous Graph Embedding Based on Graph Neural Network and Attention Mechanism." In 2024 10th International Conference on Systems and Informatics (ICSAI). IEEE, 2024. https://doi.org/10.1109/icsai65059.2024.10893791.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Wang, Congrui, Li Li, and Fei Hao. "Deep Attention Fusion Network for Attributed Graph Clustering." In 2024 International Conference on Networking and Network Applications (NaNA). IEEE, 2024. http://dx.doi.org/10.1109/nana63151.2024.00070.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Graph attention network"

1

Fait, Aaron, Grant Cramer, and Avichai Perl. Towards improved grape nutrition and defense: The regulation of stilbene metabolism under drought. United States Department of Agriculture, 2014. http://dx.doi.org/10.32747/2014.7594398.bard.

Full text
Abstract:
The goals of the present research proposal were to elucidate the physiological and molecular basis of the regulation of stilbene metabolism in grape, against the background of (i) grape metabolic network behavior in response to drought and of (ii) varietal diversity. The specific objectives included the study of the physiology of the response of different grape cultivars to continuous WD; the characterization of the differences and commonalities of gene network topology associated with WD in berry skin across varieties; the study of the metabolic response of developing berries to continuous WD
APA, Harvard, Vancouver, ISO, and other styles