Academic literature on the topic 'Message-passing neural network'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Message-passing neural network.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Message-passing neural network"

1

You, Jiaxuan, Jonathan M. Gomes-Selman, Rex Ying, and Jure Leskovec. "Identity-aware Graph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (2021): 10737–45. http://dx.doi.org/10.1609/aaai.v35i12.17283.

Abstract:
Message passing Graph Neural Networks (GNNs) provide a powerful modeling framework for relational data. However, the expressive power of existing GNNs is upper-bounded by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test, which means that GNNs are not able to predict node clustering coefficients or shortest path distances, and cannot differentiate between different d-regular graphs. Here we develop a class of message passing GNNs, named Identity-aware Graph Neural Networks (ID-GNNs), with greater expressive power than the 1-WL test. ID-GNN offers a minimal but powerful solution to limitat
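A quick illustration of the 1-WL limitation mentioned in this abstract: the sketch below is a toy example, not code from the cited paper, and assumes a recent NetworkX release that exposes weisfeiler_lehman_graph_hash. It builds two non-isomorphic 2-regular graphs, a 6-cycle and two disjoint triangles, and shows that they receive identical 1-WL hashes, so any message-passing GNN bounded by 1-WL cannot tell them apart.

```python
# Toy illustration (not from the cited paper): two non-isomorphic 2-regular
# graphs that the 1-WL test, and hence standard message-passing GNNs,
# cannot distinguish. Assumes networkx provides weisfeiler_lehman_graph_hash.
import networkx as nx

c6 = nx.cycle_graph(6)                                               # one 6-cycle
triangles = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))  # two disjoint triangles

h1 = nx.weisfeiler_lehman_graph_hash(c6)
h2 = nx.weisfeiler_lehman_graph_hash(triangles)
print(h1 == h2)  # True: identical 1-WL colourings despite different structure
```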
2

Nikolentzos, Giannis, Antoine Tixier, and Michalis Vazirgiannis. "Message Passing Attention Networks for Document Understanding." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 8544–51. http://dx.doi.org/10.1609/aaai.v34i05.6376.

Abstract:
Graph neural networks have recently emerged as a very effective framework for processing graph-structured data. These models have achieved state-of-the-art performance in many tasks. Most graph neural networks can be described in terms of message passing, vertex update, and readout functions. In this paper, we represent documents as word co-occurrence networks and propose an application of the message passing framework to NLP, the Message Passing Attention network for Document understanding (MPAD). We also propose several hierarchical variants of MPAD. Experiments conducted on 10 standard text
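The 'message passing, vertex update, and readout' decomposition mentioned in this abstract is the general message passing neural network framework (see the Gilmer et al. chapter under 'Book chapters' below). As background for the other entries in this list, one common way to write it for T rounds of propagation is:

$$m_v^{t+1} = \sum_{w \in N(v)} M_t\left(h_v^t, h_w^t, e_{vw}\right), \qquad h_v^{t+1} = U_t\left(h_v^t, m_v^{t+1}\right), \qquad \hat{y} = R\left(\{\, h_v^T \mid v \in G \,\}\right),$$

where M_t is the message function, U_t the vertex update function, R the readout over all final node states, and e_{vw} the features of edge (v, w).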
3

Yoon, Kanghoon, Kibum Kim, Jinyoung Moon, and Chanyoung Park. "Unbiased Heterogeneous Scene Graph Generation with Relation-Aware Message Passing Neural Network." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 3 (2023): 3285–94. http://dx.doi.org/10.1609/aaai.v37i3.25435.

Abstract:
Recent scene graph generation (SGG) frameworks have focused on learning complex relationships among multiple objects in an image. Thanks to the nature of the message passing neural network (MPNN), which models high-order interactions between objects and their neighboring objects, MPNNs are the dominant representation learning modules for SGG. However, existing MPNN-based frameworks treat the scene graph as a homogeneous graph, which restricts the context-awareness of visual relations between objects. That is, they overlook the fact that the relations tend to be highly dependent on the objects with w
4

Klipfel, Astrid, Zied Bouraoui, Olivier Peltre, Yaël Fregier, Najwa Harrati, and Adlane Sayede. "Equivariant Message Passing Neural Network for Crystal Material Discovery." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 12 (2023): 14304–11. http://dx.doi.org/10.1609/aaai.v37i12.26673.

Abstract:
Automatic material discovery with desired properties is a fundamental challenge for material sciences. Considerable attention has recently been devoted to generating stable crystal structures. While existing work has shown impressive success on supervised tasks such as property prediction, the progress on unsupervised tasks such as material generation is still hampered by the limited extent to which the equivalent geometric representations of the same crystal are considered. To address this challenge, we propose EPGNN, a periodic equivariant message-passing neural network that learns crystal la
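As background on what 'equivariant message passing' means here, one widely used formulation is the E(n)-equivariant layer of Satorras et al. (2021), shown below only as an illustration; the EPGNN model in the cited paper additionally accounts for crystal periodicity. Node features h_i and coordinates x_i are updated jointly so that the output transforms consistently under rotations and translations:

$$m_{ij} = \phi_e\left(h_i, h_j, \lVert x_i - x_j\rVert^2, a_{ij}\right), \qquad x_i' = x_i + C\sum_{j \neq i}(x_i - x_j)\,\phi_x(m_{ij}), \qquad h_i' = \phi_h\Big(h_i, \sum_{j \neq i} m_{ij}\Big).$$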
5

Girish, L., and M. L. Raviprakash. "Message Passing-Based Prediction of Unlabelled Node Embedding Using Graph Neural Network." International Journal of Innovative Science and Research Technology 8, no. 3 (2023): 136–44. https://doi.org/10.5281/zenodo.7735576.

Abstract:
Graph neural networks (GNNs) are a family of deep learning methods designed to perform inference on data described by graphs. A graph neural network can be applied directly to graphs and provides a convenient way to handle node-level, edge-level, and graph-level prediction tasks. However, most GNN models do not account for long-distance relationships in graphs and instead simply aggregate information from short distances (e.g., 1-hop neighbours) in each round. In this work, we carry out node classification on large graphs comprising labelled and u
6

Tan, Xiaosi, Weihong Xu, Kai Sun, et al. "Improving Massive MIMO Message Passing Detectors With Deep Neural Network." IEEE Transactions on Vehicular Technology 69, no. 2 (2020): 1267–80. http://dx.doi.org/10.1109/tvt.2019.2960763.

7

Chu, Lon-Chan, and Benjamin W. Wah. "Optimal mapping of neural-network learning on message-passing multicomputers." Journal of Parallel and Distributed Computing 14, no. 3 (1992): 319–39. http://dx.doi.org/10.1016/0743-7315(92)90071-t.

8

Chen, Junyan, Wei Xiao, Xinmei Li, et al. "A Routing Optimization Method for Software-Defined Optical Transport Networks Based on Ensembles and Reinforcement Learning." Sensors 22, no. 21 (2022): 8139. http://dx.doi.org/10.3390/s22218139.

Abstract:
Optical transport networks (OTNs) are widely used in backbone- and metro-area transmission networks to increase network transmission capacity. In the OTN, it is particularly crucial to rationally allocate routes and maximize network capacities. By employing deep reinforcement learning (DRL)- and software-defined networking (SDN)-based solutions, the capacity of optical networks can be effectively increased. However, because most DRL-based routing optimization methods have low sample usage and difficulty in coping with sudden network connectivity changes, converging in software-defined OTN scen
9

Kim, Cheolhyeong, Haeseong Moon, and Hyung Ju Hwang. "NEAR: Neighborhood Edge AggregatoR for Graph Classification." ACM Transactions on Intelligent Systems and Technology 13, no. 3 (2022): 1–17. http://dx.doi.org/10.1145/3506714.

Abstract:
Learning graph-structured data with graph neural networks (GNNs) has been recently emerging as an important field because of its wide applicability in bioinformatics, chemoinformatics, social network analysis, and data mining. Recent GNN algorithms are based on neural message passing, which enables GNNs to integrate local structures and node features recursively. However, past GNN algorithms based on 1-hop neighborhood neural message passing are exposed to a risk of loss of information on local structures and relationships. In this article, we propose Neighborhood Edge AggregatoR (NEAR), a fra
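The '1-hop neighborhood neural message passing' this abstract refers to can be sketched in a few lines. The example below is a generic sum-aggregation layer written with NumPy; the function and weight names are made up for illustration, and this is not the NEAR architecture itself.

```python
# Generic 1-hop message-passing layer: each node sums messages from its direct
# neighbours and combines them with its own features (illustrative only).
import numpy as np

def message_passing_layer(A, H, W_msg, W_upd):
    """A: (n, n) adjacency matrix; H: (n, d) node features;
    W_msg: (d, d) message weights; W_upd: (2d, d) update weights."""
    messages = A @ (H @ W_msg)                  # sum of transformed neighbour features
    combined = np.concatenate([H, messages], axis=1)
    return np.maximum(combined @ W_upd, 0.0)    # ReLU vertex update

# Toy usage on a 4-node path graph with 8-dimensional features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, 8))
H_next = message_passing_layer(A, H, rng.normal(size=(8, 8)), rng.normal(size=(16, 8)))
print(H_next.shape)  # (4, 8)
```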
10

Xu, Feng, Wanyue Xiong, Zizhu Fan, and Licheng Sun. "Node Classification Method Based on Hierarchical Hypergraph Neural Network." Sensors 24, no. 23 (2024): 7655. https://doi.org/10.3390/s24237655.

Abstract:
Hypergraph neural networks have gained widespread attention due to their effectiveness in handling graph-structured data with complex relationships and multi-dimensional interactions. However, existing hypergraph neural network models mainly rely on planar message-passing mechanisms, which have limitations: (i) low efficiency in encoding long-distance information; (ii) underutilization of high-order neighborhood features, aggregating information only on the edges of the original graph. This paper proposes an innovative hierarchical hypergraph neural network (HCHG) to address these issues. The
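As background for the hypergraph entries in this list, the usual two-stage ('planar') hypergraph message passing, from nodes to hyperedges and back to nodes, can be sketched with an incidence matrix. This is a generic NumPy illustration with invented names, not the HCHG model of the cited paper.

```python
# Two-stage hypergraph message passing using an incidence matrix B,
# where B[v, e] = 1 if node v belongs to hyperedge e (illustrative only).
import numpy as np

def hypergraph_layer(B, H, W_edge, W_node):
    """B: (n, m) incidence matrix; H: (n, d) node features."""
    edge_size = np.maximum(B.sum(axis=0, keepdims=True).T, 1)   # (m, 1) hyperedge sizes
    node_deg = np.maximum(B.sum(axis=1, keepdims=True), 1)      # (n, 1) node degrees
    E = np.tanh(((B.T @ H) / edge_size) @ W_edge)               # stage 1: nodes -> hyperedges
    return np.tanh(((B @ E) / node_deg) @ W_node)               # stage 2: hyperedges -> nodes

rng = np.random.default_rng(1)
B = np.array([[1, 0], [1, 1], [0, 1], [1, 1]], dtype=float)     # 4 nodes, 2 hyperedges
H = rng.normal(size=(4, 8))
print(hypergraph_layer(B, H, rng.normal(size=(8, 8)), rng.normal(size=(8, 8))).shape)  # (4, 8)
```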

Dissertations / Theses on the topic "Message-passing neural network"

1

Boszorád, Matej. "Segmentace obrazových dat pomocí grafových neuronových sítí [Segmentation of image data using graph neural networks]." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2020. http://www.nusl.cz/ntk/nusl-412987.

Abstract:
This diploma thesis describes and implements the design of a graph neural network used for 2D segmentation of neural structures. The first chapter briefly introduces the problem of segmentation and divides segmentation techniques according to the principles of the methods they use; for each category, the essence of the approach and one representative method are described. The second chapter explains graph neural networks (GNNs for short). Here, the thesis categorizes graph neural networks in general and describes recurrent graph n
2

Swanson, Kyle(Kyle W. ). "Message passing neural networks for molecular property prediction." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/123133.

Abstract:
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Thesis: M. Eng. in Computer Science and Engineering, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (pages 81-84). Developing new drugs relies heavily on understanding the various molecular properties of potential drug candidates. While experimental assays performed in the lab are the best sou
3

Scotti, Andrea. "Graph Neural Networks and Learned Approximate Message Passing Algorithms for Massive MIMO Detection." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-284500.

Abstract:
Massive multiple-input and multiple-output (MIMO) is a method to improve the performance of wireless communication systems by having a large number of antennas at both the transmitter and the receiver. In the fifth-generation (5G) mobile communication system, Massive MIMO is a key technology to face the increasing number of mobile users and satisfy user demands. At the same time, recovering the transmitted information in a massive MIMO uplink receiver requires more computational complexity when the number of transmitters increases. Indeed, the optimal maximum likelihood (ML) detector has a complexity
4

Gabrié, Marylou. "Towards an understanding of neural networks : mean-field incursions." Thesis, Paris Sciences et Lettres (ComUE), 2019. http://www.theses.fr/2019PSLEE035.

Abstract:
Machine learning algorithms based on deep neural networks have recently revolutionized artificial intelligence. Despite the enthusiasm raised by their many applications, the excellent performance of these algorithms remains largely unexplained from a theoretical point of view. These learning problems are described mathematically by very large sets of interacting variables that are difficult to handle both analytically and numerically. This multitude is precisely the field of study of statistical physics, which strives to understand
5

Aubin, Benjamin. "Mean-field methods and algorithmic perspectives for high-dimensional machine learning." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASP083.

Abstract:
At a time when the use of data has reached an unprecedented level, machine learning, and more particularly deep learning based on artificial neural networks, has been responsible for very significant practical progress. Its use is now ubiquitous in many application domains, from image classification and speech recognition to time-series prediction and text analysis. Yet the understanding of many algorithms used in practice is mainly empirical, and their behaviour remains
6

Houliston, Trent James. "Software architecture and computer vision for resource constrained robotics." Thesis, 2018. http://hdl.handle.net/1959.13/1389336.

Abstract:
Research Doctorate - Doctor of Philosophy (PhD). This thesis identifies the restrictions that resource-constrained robotic platforms experience in relation to their software architecture and computer vision, and proposes and evaluates a number of techniques to support more effective implementations. A review of robotic software shows that such systems often implement a message passing framework as their software architecture, which supports maintainability and modularity. However, the loose coupling afforded by message passing systems has a computational cost on performance and inhibits the ease of ac

Book chapters on the topic "Message-passing neural network"

1

Zhang, Kai, Xueyong Xu, Chenchen Fu, Xiumin Wang, and Weiwei Wu. "Modeling Data Center Networks with Message Passing Neural Network and Multi-task Learning." In Neural Computing for Advanced Applications. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-5188-5_8.

2

Wang, Cong, Zhi Zheng, Tong Xu, Zikai Yin, and Enhong Chen. "Interaction-Aware Temporal Prescription Generation via Message Passing Neural Network." In Artificial Intelligence. Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-20500-2_18.

3

Velazquez-Ruiz, Leonardo, Graciela Ramirez-Alonso, Fernando Gaxiola, Javier Camarillo-Cisneros, Daniel Espinobarro, and Alain Manzo-Martinez. "Approximation of Physicochemical Properties Based on a Message Passing Neural Network Approach." In Hybrid Intelligent Systems Based on Extensions of Fuzzy Logic, Neural Networks and Metaheuristics. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-28999-6_2.

4

Song, Yongkang, Dianqing Liu, Dazhan Mao, and Yanqiu Shao. "Improving Text Matching with Semantic Dependency Graph via Message Passing Neural Network." In Lecture Notes in Computer Science. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78609-0_34.

5

Chiorrini, Andrea, Claudia Diamantini, Alex Mircoli, and Domenico Potena. "Exploiting Instance Graphs and Graph Neural Networks for Next Activity Prediction." In Lecture Notes in Business Information Processing. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98581-3_9.

Abstract:
Nowadays, a lot of data regarding business process executions are maintained in event logs. The next activity prediction task exploits such event logs to predict how process executions will unfold up until their completion. The present paper proposes a new approach to address this task: instead of using traces to perform predictions, we propose to use the instance graphs derived from traces. To make the most out of such a representation, we train a message passing neural network, specifically a Deep Graph Convolutional Neural Network, to predict the next activity that will be performed in the process execution. The experiments performed show promising performance, hinting that exploiting information about parallelism among activities in a process can yield a performance improvement for highly parallel processes.
6

Gilmer, Justin, Samuel S. Schoenholz, Patrick F. Riley, Oriol Vinyals, and George E. Dahl. "Message Passing Neural Networks." In Machine Learning Meets Quantum Physics. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-40245-7_10.

7

Yang, Dong, Tao Xiong, and Daguang Xu. "Automatic Vertebra Labeling in Large-Scale Medical Images Using Deep Image-to-Image Network with Message Passing and Sparsity Regularization." In Deep Learning and Convolutional Neural Networks for Medical Imaging and Clinical Informatics. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13969-8_9.

8

Heydari, Sajjad, and Lorenzo Livi. "Message Passing Neural Networks for Hypergraphs." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-15931-2_48.

9

Maltoni, Davide, and Erik M. Rehn. "Incremental Learning by Message Passing in Hierarchical Temporal Memory." In Artificial Neural Networks in Pattern Recognition. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-33212-8_3.

10

Quoy, Mathias, Sorin Moga, Philippe Gaussier, and Arnaud Revel. "Parallelization of Neural Networks Using PVM." In Recent Advances in Parallel Virtual Machine and Message Passing Interface. Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/3-540-45255-9_40.


Conference papers on the topic "Message-passing neural network"

1

Zhang, Xinyu, Qize Jiang, Hanyuan Zhang, and Weiwei Sun. "Position-aware Hypergraph Message-Passing Neural Network." In ICASSP 2025 - 2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2025. https://doi.org/10.1109/icassp49660.2025.10888387.

2

Liao, Yuxuan, Rui Wang, Jianhua Pei, Jian Song, and Yuhan Dong. "Message Passing Neural Network-Enhanced Optical OTFS Iterative Symbol Detection." In 2024 Asia Communications and Photonics Conference (ACP) and International Conference on Information Photonics and Optical Communications (IPOC). IEEE, 2024. https://doi.org/10.1109/acp/ipoc63121.2024.10809622.

3

Fang, Xiaoyu, Xiaoqing Ma, Cong Wang, et al. "Detecting and Exploring Malicious Websites through Multi-Message Passing Heterogeneous Neural Network." In 2024 IEEE Symposium on Computers and Communications (ISCC). IEEE, 2024. http://dx.doi.org/10.1109/iscc61673.2024.10733612.

4

Li, Yufeng, Deyun Gao, Weiting Zhang, and Ping Dong. "Flow-MPNN: A Flow-Based Message-Passing Neural Network for Traffic Engineering." In GLOBECOM 2024 - 2024 IEEE Global Communications Conference. IEEE, 2024. https://doi.org/10.1109/globecom52923.2024.10901532.

5

Wang, Duo, Andrea Araldo, and Maximilien Chau. "Public Transport Network Design for Equality of Accessibility via Message Passing Neural Networks and Reinforcement Learning." In 17th International Conference on Agents and Artificial Intelligence. SCITEPRESS - Science and Technology Publications, 2025. https://doi.org/10.5220/0013166000003890.

6

Mao, Longying, Zeyu Yang, Le Yao, Bingbing Shen, Xiaoyu Jiang, and Zhichao Chen. "Enhancing Industrial Soft Sensing via Optimized Message Passing in Spatial-Temporal Graph Neural Network." In 2025 IEEE 14th Data Driven Control and Learning Systems (DDCLS). IEEE, 2025. https://doi.org/10.1109/ddcls66240.2025.11065464.

7

Cao, Yue, Shaoshi Yang, and Zhiyong Feng. "Distributed Cooperative Positioning in Dense Wireless Networks: A Neural Network Enhanced Fast Convergent Parametric Message Passing Method." In GLOBECOM 2024 - 2024 IEEE Global Communications Conference. IEEE, 2024. https://doi.org/10.1109/globecom52923.2024.10901824.

8

Kang, Zhaolei, and Changtao Wang. "An Aspect-Level Sentiment Analysis Method Based on Grammatical Knowledge and Message Passing Neural Network." In 2024 IEEE 6th International Conference on Power, Intelligent Computing and Systems (ICPICS). IEEE, 2024. https://doi.org/10.1109/icpics62053.2024.10796518.

9

Lu, Yangyi, Jing Peng, Yifeng Zhu, and Zhuang Chen. "Pre-training Molecular Graph Representations with Motif-Enhanced Message Passing." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650802.

10

Giusti, Lorenzo, Teodora Reu, Francesco Ceccarelli, Cristian Bodnar, and Pietro Liò. "Topological Message Passing for Higher-Order and Long-Range Interactions." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650343.
