Academic literature on the topic 'Graph embedding framework'


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Graph embedding framework.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Graph embedding framework"

1

Liang, Jiongqian, Saket Gurukar, and Srinivasan Parthasarathy. "MILE: A Multi-Level Framework for Scalable Graph Embedding." Proceedings of the International AAAI Conference on Web and Social Media 15 (May 22, 2021): 361–72. http://dx.doi.org/10.1609/icwsm.v15i1.18067.

Abstract:
Recently there has been a surge of interest in designing graph embedding methods. Few, if any, can scale to a large-sized graph with millions of nodes due to both computational complexity and memory requirements. In this paper, we relax this limitation by introducing the MultI-Level Embedding (MILE) framework – a generic methodology allowing contemporary graph embedding methods to scale to large graphs. MILE repeatedly coarsens the graph into smaller ones using a hybrid matching technique to maintain the backbone structure of the graph. It then applies existing embedding methods on the coarses…
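The coarsen–embed–refine pipeline the MILE abstract describes can be illustrated with a toy sketch. This is not the paper's implementation: the helper names (`coarsen`, `base_embed`, `refine`) are ours, the matching is a simple greedy edge matching rather than MILE's hybrid scheme, and `base_embed` is a random stand-in for any base embedding method.

```python
# Toy sketch of a multi-level (MILE-style) embedding pipeline.
# Assumptions: undirected graph as an edge list over nodes 0..n-1;
# one coarsening round; random vectors stand in for a real embedder.
import random

def coarsen(edges, n):
    """One greedy matching round: merge matched node pairs into supernodes."""
    matched = {}
    for u, v in edges:
        if u not in matched and v not in matched and u != v:
            matched[u] = v
            matched[v] = u
    mapping, next_id = {}, 0
    for node in range(n):
        if node in mapping:
            continue
        mapping[node] = next_id
        if node in matched:
            mapping[matched[node]] = next_id
        next_id += 1
    coarse_edges = {(min(mapping[u], mapping[v]), max(mapping[u], mapping[v]))
                    for u, v in edges if mapping[u] != mapping[v]}
    return sorted(coarse_edges), next_id, mapping

def base_embed(n, dim=4, seed=0):
    """Stand-in for any base embedding method run on the coarsest graph."""
    rng = random.Random(seed)
    return [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]

def refine(coarse_emb, mapping):
    """Project supernode embeddings back onto the finer graph's nodes."""
    return [coarse_emb[mapping[node]] for node in sorted(mapping)]

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
coarse_edges, m, mapping = coarsen(edges, 4)
emb = refine(base_embed(m), mapping)  # one embedding vector per original node
```

In MILE the refinement step is learned (a graph convolution refines the projected vectors); here the projection alone conveys the idea that merged nodes start from their supernode's embedding.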
2

Zhou, Houquan, Shenghua Liu, Danai Koutra, Huawei Shen, and Xueqi Cheng. "A Provable Framework of Learning Graph Embeddings via Summarization." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 4 (2023): 4946–53. http://dx.doi.org/10.1609/aaai.v37i4.25621.

Abstract:
Given a large graph, can we learn its node embeddings from a smaller summary graph? What is the relationship between embeddings learned from original graphs and their summary graphs? Graph representation learning plays an important role in many graph mining applications, but learning embeddings of large-scale graphs remains a challenge. Recent works try to alleviate it via graph summarization, which typically includes three steps: reducing the graph size by combining nodes and edges into supernodes and superedges, learning the supernode embedding on the summary graph, and then restoring th…
3

Duong, Chi Thang, Trung Dung Hoang, Hongzhi Yin, Matthias Weidlich, Quoc Viet Hung Nguyen, and Karl Aberer. "Scalable robust graph embedding with Spark." Proceedings of the VLDB Endowment 15, no. 4 (2021): 914–22. http://dx.doi.org/10.14778/3503585.3503599.

Abstract:
Graph embedding aims at learning a vector-based representation of vertices that incorporates the structure of the graph. This representation then enables inference of graph properties. Existing graph embedding techniques, however, do not scale well to large graphs. While several techniques to scale graph embedding using compute clusters have been proposed, they require continuous communication between the compute nodes and cannot handle node failure. We therefore propose a framework for scalable and robust graph embedding based on the MapReduce model, which can distribute any existing embeddin…
4

Fang, Peng, Arijit Khan, Siqiang Luo, et al. "Distributed Graph Embedding with Information-Oriented Random Walks." Proceedings of the VLDB Endowment 16, no. 7 (2023): 1643–56. http://dx.doi.org/10.14778/3587136.3587140.

Abstract:
Graph embedding maps graph nodes to low-dimensional vectors, and is widely adopted in machine learning tasks. The increasing availability of billion-edge graphs underscores the importance of learning efficient and effective embeddings on large graphs, such as link prediction on Twitter with over one billion edges. Most existing graph embedding methods fall short of reaching high data scalability. In this paper, we present a general-purpose, distributed, information-centric random walk-based graph embedding framework, DistGER, which can scale to embed billion-edge graphs. DistGER incrementally…
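Random-walk-based frameworks like the one above share a common first step: sampling node sequences that are then fed to a skip-gram model. A minimal sketch of that corpus-generation step, assuming plain uniform (DeepWalk-style) walks — this is not DistGER's information-centric sampling, and all names are ours:

```python
# Uniform random-walk corpus generation, the shared front end of
# walk-based embedding frameworks (DeepWalk-style; illustrative only).
import random
from collections import defaultdict

def build_adj(edges):
    """Undirected adjacency lists from an edge list."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    return adj

def random_walks(edges, walks_per_node=2, walk_len=4, seed=0):
    """Sample fixed-length walks starting from every node."""
    rng = random.Random(seed)
    adj = build_adj(edges)
    walks = []
    for start in sorted(adj):
        for _ in range(walks_per_node):
            walk = [start]
            for _ in range(walk_len - 1):
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

walks = random_walks([(0, 1), (1, 2), (2, 0), (2, 3)])
# each walk is a node sequence later treated as a "sentence" by skip-gram
```

Systems such as DistGER then focus on which walks carry the most information and on distributing this sampling and the subsequent training across machines.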
5

Yang, Tong, Yifei Wang, Long Sha, Jan Engelbrecht, and Pengyu Hong. "Knowledgebra: An Algebraic Learning Framework for Knowledge Graph." Machine Learning and Knowledge Extraction 4, no. 2 (2022): 432–45. http://dx.doi.org/10.3390/make4020019.

Abstract:
Knowledge graph (KG) representation learning aims to encode entities and relations into dense continuous vector spaces such that knowledge contained in a dataset could be consistently represented. Dense embeddings trained from KG datasets benefit a variety of downstream tasks such as KG completion and link prediction. However, existing KG embedding methods fall short of providing a systematic solution for the global consistency of knowledge representation. We developed a mathematical language for KG based on an observation of their inherent algebraic structure, which we termed Knowledgebra. B…
6

Makarov, Ilya, Andrey Savchenko, Arseny Korovko, et al. "Temporal network embedding framework with causal anonymous walks representations." PeerJ Computer Science 8 (January 20, 2022): e858. http://dx.doi.org/10.7717/peerj-cs.858.

Abstract:
Many tasks in graph machine learning, such as link prediction and node classification, are typically solved using representation learning. Each node or edge in the network is encoded via an embedding. Though many network embeddings exist for static graphs, the task becomes much more complicated when a dynamic (i.e., temporal) network is analyzed. In this paper, we propose a novel approach for dynamic network representation learning based on Temporal Graph Network, using a highly customized message-generating function that extracts Causal Anonymous Walks. We provide a benchmark pipel…
7

Cheng, Kewei, Xian Li, Yifan Ethan Xu, Xin Luna Dong, and Yizhou Sun. "PGE." Proceedings of the VLDB Endowment 15, no. 6 (2022): 1288–96. http://dx.doi.org/10.14778/3514061.3514074.

Abstract:
Although product graphs (PGs) have gained increasing attention in recent years for their successful applications in product search and recommendations, the extensive power of PGs can be limited by the inevitable involvement of various kinds of errors. Thus, it is critical to validate the correctness of triples in PGs to improve their reliability. Knowledge graph (KG) embedding methods have strong error detection abilities. Yet, existing KG embedding methods may not be directly applicable to a PG due to its distinct characteristics: (1) PG contains rich textual signals, which necessitates a jo…
8

Li, Yu, Yuan Tian, Jiawei Zhang, and Yi Chang. "Learning Signed Network Embedding via Graph Attention." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 4772–79. http://dx.doi.org/10.1609/aaai.v34i04.5911.

Abstract:
Learning the low-dimensional representations of graphs (i.e., network embedding) plays a critical role in network analysis and facilitates many downstream tasks. Recently graph convolutional networks (GCNs) have revolutionized the field of network embedding, and led to state-of-the-art performance in network analysis tasks such as link prediction and node classification. Nevertheless, most of the existing GCN-based network embedding methods are proposed for unsigned networks. However, in the real world, some of the networks are signed, where the links are annotated with different polarities, e…
9

Zhu, Shijie, Jianxin Li, Hao Peng, Senzhang Wang, and Lifang He. "Adversarial Directed Graph Embedding." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 5 (2021): 4741–48. http://dx.doi.org/10.1609/aaai.v35i5.16605.

Abstract:
Node representation learning for directed graphs is critically important to facilitate many graph mining tasks. To capture the directed edges between nodes, existing methods mostly learn two embedding vectors for each node, source vector and target vector. However, these methods learn the source and target vectors separately. For the node with very low indegree or outdegree, the corresponding target vector or source vector cannot be effectively learned. In this paper, we propose a novel Directed Graph embedding framework based on Generative Adversarial Network, called DGGAN. The main idea is t…
10

Hong, Xiaobin, Tong Zhang, Zhen Cui, et al. "Graph Game Embedding." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (2021): 7711–20. http://dx.doi.org/10.1609/aaai.v35i9.16942.

Abstract:
Graph embedding aims to encode nodes/edges into low-dimensional continuous features, and has become a crucial tool for graph analysis including graph/node classification, link prediction, etc. In this paper we propose a novel graph learning framework, named graph game embedding, to learn discriminative node representation as well as encode graph structures. Inspired by the spirit of game learning, node embedding is converted to the selection/searching process of player strategies, where each node corresponds to one player and each edge corresponds to the interaction of two players. Then, a uti…

Dissertations / Theses on the topic "Graph embedding framework"

1

Prouteau, Thibault. "Graphs, Words, and Communities: Converging Paths to Interpretability with a Frugal Embedding Framework." Electronic Thesis or Diss., Le Mans, 2024. http://www.theses.fr/2024LEMA1006.

Abstract:
Representation learning through word embedding and graph embedding methods yields distributed representations of information. These representations can in turn be used as input to machine learning algorithms. Over the last two decades, node and word embedding tasks have moved from matrix factorization approaches that could be computed in a few minutes to large models requiring ever greater amounts of training data and sometimes…
2

Fang, Chunsheng. "Novel Frameworks for Mining Heterogeneous and Dynamic Networks." University of Cincinnati / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1321369978.


Book chapters on the topic "Graph embedding framework"

1

Gao, Jing, Nan Du, Wei Fan, Deepak Turaga, Srinivasan Parthasarathy, and Jiawei Han. "A Multi-graph Spectral Framework for Mining Multi-source Anomalies." In Graph Embedding for Pattern Analysis. Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-4457-2_9.

2

Hafiane, Rachid, Luc Brun, and Salvatore Tabbone. "Incremental Embedding Within a Dissimilarity-Based Framework." In Graph-Based Representations in Pattern Recognition. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-18224-7_7.

3

Manzo, Mario, Simone Pellino, Alfredo Petrosino, and Alessandro Rozza. "A Novel Graph Embedding Framework for Object Recognition." In Computer Vision - ECCV 2014 Workshops. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-16220-1_24.

4

Sun, Ding, Zhen Huang, Dongsheng Li, Xiangyu Ye, and Yilin Wang. "Improved Partitioning Graph Embedding Framework for Small Cluster." In Knowledge Science, Engineering and Management. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-82136-4_17.

5

Jiang, Ting, Ting Yu, Xueting Qiao, and Ji Zhang. "An Efficient Embedding Framework for Uncertain Attribute Graph." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-39821-6_18.

6

Sun, Guolei, and Xiangliang Zhang. "A Novel Framework for Node/Edge Attributed Graph Embedding." In Advances in Knowledge Discovery and Data Mining. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-16142-2_14.

7

Cui, Qingyao, Yanquan Zhou, and Mingming Zheng. "Sememes-Based Framework for Knowledge Graph Embedding with Comprehensive-Information." In Knowledge Science, Engineering and Management. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-82147-0_34.

8

Ning, Zhiyuan, Ziyue Qiao, Hao Dong, Yi Du, and Yuanchun Zhou. "LightCAKE: A Lightweight Framework for Context-Aware Knowledge Graph Embedding." In Advances in Knowledge Discovery and Data Mining. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-75768-7_15.

9

Pellegrino, Maria Angela, Abdulrahman Altabba, Martina Garofalo, Petar Ristoski, and Michael Cochez. "GEval: A Modular and Extensible Evaluation Framework for Graph Embedding Techniques." In The Semantic Web. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-49461-2_33.

10

Weng, Tengfan, Xiaoyu Kang, and Zhixin Shi. "APFedEmb: An Adaptive and Personalized Federated Knowledge Graph Embedding Framework for Link Prediction." In Lecture Notes in Computer Science. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-9872-1_33.


Conference papers on the topic "Graph embedding framework"

1

Qiu, Haixin, Zhaogong Zhang, Ning Wang, and Xin Guan. "Dual distillation knowledge embedding framework for efficient knowledge graph completion." In Fourth International Conference on Electronics Technology and Artificial Intelligence (ETAI 2025), edited by Shaohua Luo and Akash Saxena. SPIE, 2025. https://doi.org/10.1117/12.3068448.

2

Bourgaux, Camille, Ricardo Guimarães, Raoul Koudijs, Victor Lacerda, and Ana Ozaki. "Knowledge Base Embeddings: Semantics and Theoretical Properties." In 21st International Conference on Principles of Knowledge Representation and Reasoning {KR-2023}. International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/kr.2024/77.

Abstract:
Research on knowledge graph embeddings has recently evolved into knowledge base embeddings, where the goal is not only to map facts into vector spaces but also constrain the models so that they take into account the relevant conceptual knowledge available. This paper examines recent methods that have been proposed to embed knowledge bases in description logic into vector spaces through the lens of their geometric-based semantics. We identify several relevant theoretical properties, which we draw from the literature and sometimes generalize or unify. We then investigate how concrete embedding m…
3

Ma, Tianliang, Guangxi Fan, Xuguang Sun, Zhihui Deng, Kain Lu Low, and Leilai Shao. "Fast Design Technology Co-Optimization Framework for Emerging Technology with Hierarchical Graph Embedding." In 2024 2nd International Symposium of Electronics Design Automation (ISEDA). IEEE, 2024. http://dx.doi.org/10.1109/iseda62518.2024.10617794.

4

Liu, Wenkang, Fangkun Li, and Yang Li. "A Dual-Graph Learning Framework with Sparse Adaptive Embedding for EEG Emotion Recognition." In 2024 4th International Conference on Industrial Automation, Robotics and Control Engineering (IARCE). IEEE, 2024. https://doi.org/10.1109/iarce64300.2024.00053.

5

Lu, Yuhuan, Weijian Yu, Xin Jing, and Dingqi Yang. "HyperCL: A Contrastive Learning Framework for Hyper-Relational Knowledge Graph Embedding with Hierarchical Ontology." In Findings of the Association for Computational Linguistics ACL 2024. Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.171.

6

Tu, Senbo, Zhihao Yang, Lei Wang, et al. "Efficient Knowledge Graph Embedding Framework to Alleviate Data Sparsity for Polypharmacy Side Effects Prediction." In 2024 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2024. https://doi.org/10.1109/bibm62325.2024.10822260.

7

Mohan, Manoj Krishna, and Sahana Shreedhar Kulkarni. "Temporal-Aware Fraud Detection Using Knowledge Graph, Embeddings and Variable Change Analysis: An Evidence Based Risk Scoring Framework." In 2025 6th International Conference on Artificial Intelligence, Robotics and Control (AIRC). IEEE, 2025. https://doi.org/10.1109/airc64931.2025.11077540.

8

Bai, Yunsheng, Hao Ding, Yang Qiao, et al. "Unsupervised Inductive Graph-Level Representation Learning via Graph-Graph Proximity." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/275.

Abstract:
We introduce a novel approach to graph-level representation learning, which is to embed an entire graph into a vector space where the embeddings of two graphs preserve their graph-graph proximity. Our approach, UGraphEmb, is a general framework that provides a novel means of performing graph-level embedding in a completely unsupervised and inductive manner. The learned neural network can be considered as a function that receives any graph as input, either seen or unseen in the training set, and transforms it into an embedding. A novel graph-level embedding generation mechanism called Multi-Sca…
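The graph-level setting the UGraphEmb abstract describes — whole graphs mapped to vectors whose similarity tracks graph–graph proximity — can be shown with a deliberately simple stand-in. The real method learns the mapping with a neural network; here a fixed degree-histogram "embedding" and cosine similarity (all names ours, none from the paper) merely make the setting concrete:

```python
# Illustrative toy: a fixed whole-graph feature vector stands in for a
# learned graph-level embedding; cosine similarity plays the role of
# graph-graph proximity in embedding space.
import math
from collections import Counter

def graph_embed(edges, max_degree=4):
    """Map an entire graph to one vector: its (capped) degree histogram."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hist = [0] * (max_degree + 1)
    for d in deg.values():
        hist[min(d, max_degree)] += 1
    return hist

def proximity(g1, g2):
    """Cosine similarity between two graph-level embeddings."""
    dot = sum(a * b for a, b in zip(g1, g2))
    norm = math.sqrt(sum(a * a for a in g1)) * math.sqrt(sum(b * b for b in g2))
    return dot / norm

triangle = graph_embed([(0, 1), (1, 2), (2, 0)])
square = graph_embed([(0, 1), (1, 2), (2, 3), (3, 0)])
star = graph_embed([(0, 1), (0, 2), (0, 3), (0, 4)])
# the two 2-regular graphs (triangle, square) land closer to each other
# in embedding space than either does to the star
```

UGraphEmb's contribution is to learn such a mapping inductively, so unseen graphs can be embedded and their pairwise proximities preserved.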
9

Pan, Shirui, Ruiqi Hu, Guodong Long, Jing Jiang, Lina Yao, and Chengqi Zhang. "Adversarially Regularized Graph Autoencoder for Graph Embedding." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/362.

Abstract:
Graph embedding is an effective method to represent graph data in a low dimensional space for graph analytics. Most existing embedding algorithms typically focus on preserving the topological structure or minimizing the reconstruction errors of graph data, but they have mostly ignored the data distribution of the latent codes from the graphs, which often results in inferior embedding in real-world graph data. In this paper, we propose a novel adversarial graph embedding framework for graph data. The framework encodes the topological structure and node content in a graph to a compact representa…
10

Zhang, Yizhou, Guojie Song, Lun Du, Shuwen Yang, and Yilun Jin. "DANE: Domain Adaptive Network Embedding." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/606.

Abstract:
Recent works reveal that network embedding techniques enable many machine learning models to handle diverse downstream tasks on graph structured data. However, as previous methods usually focus on learning embeddings for a single network, they cannot learn representations transferable across multiple networks. Hence, it is important to design a network embedding algorithm that supports downstream model transferring on different networks, known as domain adaptation. In this paper, we propose a novel Domain Adaptive Network Embedding framework, which applies graph convolutional network to learn tra…