Academic literature on the topic 'SE(3) equivariant graph neural network'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'SE(3) equivariant graph neural network.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "SE(3) equivariant graph neural network"

1

Bånkestad, Maria, Kevin M. Dorst, Göran Widmalm, and Jerk Rönnols. "Carbohydrate NMR chemical shift prediction by GeqShift employing E(3) equivariant graph neural networks." RSC Advances 14, no. 36 (2024): 26585–95. http://dx.doi.org/10.1039/d4ra03428g.

Abstract:
Visual abstract of GeqShift, an E(3) equivariant graph neural network for predicting carbohydrate NMR shifts. The model excels in stereochemical invariance, offering superior molecular geometry understanding over traditional methods.
2

Roche, Rahmatullah, Bernard Moussad, Md Hossain Shuvo, and Debswapna Bhattacharya. "E(3) equivariant graph neural networks for robust and accurate protein-protein interaction site prediction." PLOS Computational Biology 19, no. 8 (2023): e1011435. http://dx.doi.org/10.1371/journal.pcbi.1011435.

Abstract:
Artificial intelligence-powered protein structure prediction methods have led to a paradigm-shift in computational structural biology, yet contemporary approaches for predicting the interfacial residues (i.e., sites) of protein-protein interaction (PPI) still rely on experimental structures. Recent studies have demonstrated benefits of employing graph convolution for PPI site prediction, but ignore symmetries naturally occurring in 3-dimensional space and act only on experimental coordinates. Here we present EquiPPIS, an E(3) equivariant graph neural network approach for PPI site prediction. …
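The EquiPPIS entry above relies on the defining property of E(3)-equivariant networks: they consume geometric quantities, such as pairwise distances, that do not change when the input coordinates are rotated or translated. The short, self-contained Python/NumPy sketch below is illustrative only (not the authors' code; the function names are assumptions) and checks this invariance numerically for a random point cloud.

import numpy as np

def pairwise_distances(coords: np.ndarray) -> np.ndarray:
    """Pairwise Euclidean distances between N points in R^3 (shape: N x N)."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def random_rotation(rng: np.random.Generator) -> np.ndarray:
    """Draw a random 3x3 rotation matrix via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))          # make the factorization unique
    if np.linalg.det(q) < 0:          # ensure a proper rotation (det = +1)
        q[:, 0] *= -1
    return q

rng = np.random.default_rng(0)
x = rng.normal(size=(10, 3))          # toy "protein" coordinates
R = random_rotation(rng)
t = rng.normal(size=3)

d_before = pairwise_distances(x)
d_after = pairwise_distances(x @ R.T + t)   # rotate, then translate

# Distances are E(3)-invariant, so an invariant-feature GNN sees identical inputs.
print(np.allclose(d_before, d_after))        # True

Because the distance matrix is identical before and after the rigid-body motion, any model built purely on such invariants automatically inherits E(3) invariance for its scalar predictions.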
3

Han, Rong, Wenbing Huang, Lingxiao Luo, et al. "HeMeNet: Heterogeneous Multichannel Equivariant Network for Protein Multi-task Learning." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 1 (2025): 237–45. https://doi.org/10.1609/aaai.v39i1.32000.

Abstract:
Understanding and leveraging the 3D structures of proteins is central to a variety of biological and drug discovery tasks. While deep learning has been applied successfully for structure-based protein function prediction tasks, current methods usually employ distinct training for each task. However, each of the tasks is of small size, and such a single-task strategy hinders the models' performance and generalization ability. As some labeled 3D protein datasets are biologically related, combining multi-source datasets for larger-scale multi-task learning is one way to overcome this problem. …
4

Liu, Jie, Michael J. Roy, Luke Isbel, and Fuyi Li. "Accurate PROTAC-targeted degradation prediction with DegradeMaster." Bioinformatics 41, Supplement 1 (2025): i342–i351. https://doi.org/10.1093/bioinformatics/btaf191.

Abstract:
Motivation: Proteolysis-targeting chimeras (PROTACs) are heterobifunctional molecules that can degrade “undruggable” proteins of interest by recruiting E3 ligases and hijacking the ubiquitin-proteasome system. Some efforts have been made to develop deep learning-based approaches to predict the degradation ability of a given PROTAC. However, existing deep learning methods either simplify proteins and PROTACs as 2D graphs by disregarding crucial 3D spatial information or exclusively rely on limited labels for supervised learning without considering the abundant information from unlabeled …
5

Zeng, Wenwu, Liangrui Pan, Boya Ji, Liwen Xu, and Shaoliang Peng. "Accurate Nucleic Acid-Binding Residue Identification Based Domain-Adaptive Protein Language Model and Explainable Geometric Deep Learning." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 1 (2025): 1004–12. https://doi.org/10.1609/aaai.v39i1.32086.

Abstract:
Protein-nucleic acid interactions play a fundamental and critical role in a wide range of life activities. Accurate identification of nucleic acid-binding residues helps to understand the intrinsic mechanisms of the interactions. However, the accuracy and interpretability of existing computational methods for recognizing nucleic acid-binding residues need to be further improved. Here, we propose a novel method called GeSite based on a domain-adaptive protein language model and an E(3)-equivariant graph neural network. Prediction results across multiple benchmark test sets demonstrate that GeSite …
6

Wang, Hanchen, Defu Lian, Ying Zhang, et al. "Binarized graph neural network." World Wide Web 24, no. 3 (2021): 825–48. http://dx.doi.org/10.1007/s11280-021-00878-3.

7

Chen, Zhiqiang, Yang Chen, Xiaolong Zou, and Shan Yu. "Continuous Rotation Group Equivariant Network Inspired by Neural Population Coding." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 10 (2024): 11462–70. http://dx.doi.org/10.1609/aaai.v38i10.29027.

Abstract:
Neural population coding can represent continuous information by neurons with a series of discrete preferred stimuli, and we find that the bell-shaped tuning curve plays an important role in this mechanism. Inspired by this, we incorporate a bell-shaped tuning curve into the discrete group convolution to achieve continuous group equivariance. Simply, we modulate group convolution kernels by Gauss functions to obtain bell-shaped tuning curves. Benefiting from the modulation, kernels also gain smooth gradients on geometric dimensions (e.g., location dimension and orientation dimension). …
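To make the bell-shaped tuning idea in the abstract above concrete, here is a toy Python/NumPy sketch of a Gaussian tuning curve over a set of discrete orientations; the function name and parameter values are assumptions for illustration and do not reproduce the paper's group-convolution implementation.

import numpy as np

def gaussian_tuning(n_orientations: int, preferred: float, sigma: float) -> np.ndarray:
    """Bell-shaped (Gaussian) weights over n discrete orientations on the circle."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_orientations, endpoint=False)
    # signed circular distance between each discrete orientation and the preferred one
    delta = np.angle(np.exp(1j * (angles - preferred)))
    weights = np.exp(-0.5 * (delta / sigma) ** 2)
    return weights / weights.sum()

# Eight discrete orientation channels, preferred orientation at 45 degrees:
w = gaussian_tuning(n_orientations=8, preferred=np.pi / 4, sigma=0.5)
print(np.round(w, 3))  # largest weight at the channel nearest pi/4, decaying smoothly

In a discrete rotation-group convolution, such smoothly decaying weights could blend rotated copies of a kernel along the orientation dimension, which is how a bell-shaped curve can approximate continuous equivariance from finitely many preferred orientations.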
8

Zeyu, Wang, Zhu Yue, Li Zichao, Wang Zhuoyue, Qin Hao, and Liu Xinqi. "Graph Neural Network Recommendation System for Football Formation." Applied Science and Biotechnology Journal for Advanced Research 3, no. 3 (2024): 33–39. https://doi.org/10.5281/zenodo.12198843.

Abstract:
The flow of a football game typically has different phases that change from one to another, and the coach must observe them, understand them, and address the in-game tasks by using appropriate structural strategies. It is therefore a critical issue for a coach to decide which structural strategies have been effective for their own team. We propose three different views to help the coach make decisions. In the first view, we formulate the ball-passing paths as a network (a passing network). More specifically, we utilize the clustering coefficient to determine the relations between players …
9

Zhou, Yuchen, Hongtao Huo, Zhiwen Hou, and Fanliang Bu. "A deep graph convolutional neural network architecture for graph classification." PLOS ONE 18, no. 3 (2023): e0279604. http://dx.doi.org/10.1371/journal.pone.0279604.

Abstract:
Graph Convolutional Networks (GCNs) are powerful deep learning methods for non-Euclidean structure data and achieve impressive performance in many fields. But most of the state-of-the-art GCN models are shallow structures with depths of no more than 3 to 4 layers, which greatly limits the ability of GCN models to extract high-level features of nodes. There are two main reasons for this: 1) Overlaying too many graph convolution layers will lead to the problem of over-smoothing. 2) Graph convolution is a kind of localized filter, which is easily affected by local properties. To solve the above problems, …
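For context on the graph convolution discussed in the abstract above, the following Python/NumPy sketch implements one textbook GCN layer with self-loops and symmetric normalization, H' = ReLU(D^-1/2 (A + I) D^-1/2 H W); it is a generic formulation, not the deep architecture proposed in the paper.

import numpy as np

def gcn_layer(adj: np.ndarray, feats: np.ndarray, weight: np.ndarray) -> np.ndarray:
    """One graph convolution layer with self-loops and symmetric normalization."""
    a_hat = adj + np.eye(adj.shape[0])              # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))   # D^{-1/2} of the augmented graph
    norm_adj = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(norm_adj @ feats @ weight, 0.0)  # ReLU activation

# Toy graph: 4 nodes on a path, 3-dimensional features, 2 output channels.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 3))
weight = rng.normal(size=(3, 2))
print(gcn_layer(adj, feats, weight).shape)  # (4, 2)

Since each such layer only mixes one-hop neighborhoods, stacking just 3 to 4 of them yields a small receptive field, which is exactly the depth limitation the paper sets out to address.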
10

Kang, Shuang, Lin Shi, and Zhenyou Zhang. "Knowledge Graph Double Interaction Graph Neural Network for Recommendation Algorithm." Applied Sciences 12, no. 24 (2022): 12701. http://dx.doi.org/10.3390/app122412701.

Abstract:
To address the problem that knowledge-graph-based recommendation algorithms ignore the information of the entity itself and the user information during information aggregation, we propose a double-interaction graph neural network recommendation algorithm based on a knowledge graph. First, items in the dataset are selected as user-related items and are then integrated into the user features, which are thereby enriched. Then, according to different user relationship weights and the influence weights of neighbor entities on the central entity, the graph neural network is used to integrate the features of …
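As a rough illustration of the weighted neighbor aggregation described in the abstract above, the Python/NumPy sketch below fuses a central entity's embedding with softmax-weighted neighbor embeddings; the weighting scheme and all names are assumptions for illustration, not the paper's algorithm.

import numpy as np

def weighted_aggregate(h_center, h_neighbors, weights):
    """Fuse a central entity's features with influence-weighted neighbor features."""
    alpha = np.exp(weights) / np.exp(weights).sum()      # softmax over influence scores
    return h_center + alpha @ h_neighbors                # weighted sum added to the center

rng = np.random.default_rng(0)
h_center = rng.normal(size=8)          # central entity embedding
h_neighbors = rng.normal(size=(5, 8))  # five neighbor entity embeddings
weights = rng.normal(size=5)           # e.g., relation- or user-dependent influence scores
print(weighted_aggregate(h_center, h_neighbors, weights).shape)  # (8,)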

Dissertations / Theses on the topic "SE(3) equivariant graph neural network"

1

Pezzicoli, Francesco. "Statistical Physics - Machine Learning Interplay: from Addressing Class Imbalance with Replica Theory to Predicting Dynamical Heterogeneities with SE(3)-equivariant Graph Neural Networks." Electronic Thesis or Diss., Université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG115.

Abstract:
This thesis explores the relationship between Machine Learning (ML) and Statistical Physics (SP), addressing two important challenges at the interface of these two fields. First, I examine the Class Imbalance (CI) problem in the supervised learning setting by introducing an analytically solvable model based on statistical mechanics: I propose a theoretical framework for analyzing and interpreting the CI problem. Several non-trivial phenomena are observed: for example, a balanced training set often leads to suboptimal performance. …

Book chapters on the topic "SE(3) equivariant graph neural network"

1

Toshev, Artur P., Gianluca Galletti, Johannes Brandstetter, Stefan Adami, and Nikolaus A. Adams. "Learning Lagrangian Fluid Mechanics with E(3)-Equivariant Graph Neural Networks." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-38299-4_35.

2

Yu, Changqian, Yifan Liu, Changxin Gao, Chunhua Shen, and Nong Sang. "Representative Graph Neural Network." In Computer Vision – ECCV 2020. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58571-6_23.

3

Li, Yaoman, and Irwin King. "AutoGraph: Automated Graph Neural Network." In Neural Information Processing. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63833-7_16.

4

Himmelhuber, Anna, Mitchell Joblin, Martin Ringsquandl, and Thomas Runkler. "Demystifying Graph Neural Network Explanations." In Communications in Computer and Information Science. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-93736-2_6.

5

Guo, Mengying, Zhenyu Sun, Yuyi Wang, and Xingwu Liu. "Graph Neural Network with Neighborhood Reconnection." In Knowledge Science, Engineering and Management. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-40283-8_4.

6

Meyer, Bernd. "Self-Organizing Graphs — A Neural Network Perspective of Graph Layout." In Graph Drawing. Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/3-540-37623-2_19.

7

Liu, Qi, Jianxia Chen, Shuxi Zhang, Chang Liu, and Xinyun Wu. "Sequence Recommendation Based on Interactive Graph Attention Network." In Neural Information Processing. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-30108-7_25.

8

Tang, Maolin, and Chien-An Chen. "Wireless Network Gateway Placement by Evolutionary Graph Clustering." In Neural Information Processing. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70090-8_91.

9

Zhuo, Yuxin, Xuesi Zhou, and Ji Wu. "Training Graph Convolutional Neural Network Against Label Noise." In Neural Information Processing. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-92238-2_56.

10

Tien, Dong Nguyen, and Hai Pham Van. "Graph Neural Network Combined Knowledge Graph for Recommendation System." In Computational Data and Social Networks. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-66046-8_6.


Conference papers on the topic "SE(3) equivariant graph neural network"

1

Aykent, Sarp, and Tian Xia. "SE(3) Equivariant Neural Network for 3D Graphs." In 2024 IEEE International Conference on Big Data (BigData). IEEE, 2024. https://doi.org/10.1109/bigdata62323.2024.10825716.

2

Misik, Adam, Driton Salihu, Xin Su, Heike Brock, and Eckehard Steinbach. "HEGN: Hierarchical Equivariant Graph Neural Network for 9DoF Point Cloud Registration." In 2024 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2024. http://dx.doi.org/10.1109/icra57147.2024.10610562.

3

Dai, Huanhuan, Haonan Song, Tongyu Han, Qing Yang, Xiangyu Meng, and Xun Wang. "AEG-PPIS: A Dual-Branch Protein-protein Interaction Site Predictor Based on Augmented Graph Attention Network and Equivariant Graph Neural Network." In 2024 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2024. https://doi.org/10.1109/bibm62325.2024.10822268.

4

Medina, Edgar Ivan Sanchez, Ann-Joelle Minor, and Kai Sundmacher. "Systematic comparison between Graph Neural Networks and UNIFAC-IL for solvent pre-selection in liquid-liquid extraction." In The 35th European Symposium on Computer Aided Process Engineering. PSE Press, 2025. https://doi.org/10.69997/sct.132577.

Abstract:
Solvent selection is a critical decision-making process that balances economic, environmental, and societal factors. The vast chemical space makes evaluating all potential solvents impractical, necessitating pre-selection strategies to identify promising candidates. Predictive thermodynamic models, such as the UNIFAC model, are commonly used for this purpose. Recent advancements in deep learning have led to models like the Gibbs-Helmholtz Graph Neural Network (GH-GNN), which overall offers higher accuracy in predicting infinite dilution activity coefficients over a broader chemical space than …
5

Meng, Ziqiao, Liang Zeng, Zixing Song, Tingyang Xu, Peilin Zhao, and Irwin King. "Towards Geometric Normalization Techniques in SE(3) Equivariant Graph Neural Networks for Physical Dynamics Simulations." In Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI-24). International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/661.

Abstract:
SE(3) equivariance is a fundamental property that is highly desirable to maintain in physical dynamics modeling. This property ensures that neural outputs remain robust when the inputs are translated or rotated. Recently, there have been several proposals for SE(3) equivariant graph neural networks (GNNs) that have shown promising results in simulating particle dynamics. However, existing works have neglected an important issue: current SE(3) equivariant GNNs cannot scale to large particle systems. Although some simple normalization techniques are already in use to stabilize the training dynamics …
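The IJCAI-24 entry above assumes familiarity with how SE(3)-equivariant GNN layers are built. As background, the Python/NumPy sketch below implements one EGNN-style equivariant update (in the spirit of E(n)-equivariant GNNs, with learned MLPs replaced by fixed random linear maps): messages depend only on invariant inputs, and coordinates move along relative position vectors, so rotating and translating the input rotates and translates the output. This is a generic illustration, not the paper's model or its geometric normalization technique.

import numpy as np

def egnn_layer(h, x, w_msg, w_upd, w_coord):
    """One EGNN-style E(n)-equivariant update on a fully connected toy graph.

    h: (N, F) invariant node features, x: (N, 3) coordinates.
    w_msg, w_upd, w_coord: fixed linear maps standing in for learned MLPs.
    """
    diff = x[:, None, :] - x[None, :, :]                  # (N, N, 3) relative positions
    dist2 = (diff ** 2).sum(-1, keepdims=True)            # (N, N, 1) invariant distances
    pair = np.concatenate(
        [np.broadcast_to(h[:, None, :], (len(h), len(h), h.shape[1])),
         np.broadcast_to(h[None, :, :], (len(h), len(h), h.shape[1])),
         dist2], axis=-1)
    m = np.tanh(pair @ w_msg)                             # (N, N, M) invariant messages
    idx = np.arange(len(h))
    m[idx, idx] = 0.0                                     # drop self-messages
    h_new = np.tanh(np.concatenate([h, m.sum(1)], -1) @ w_upd)
    x_new = x + (diff * (m @ w_coord)).mean(1)            # move along relative vectors
    return h_new, x_new

rng = np.random.default_rng(0)
N, F, M = 5, 4, 8
h, x = rng.normal(size=(N, F)), rng.normal(size=(N, 3))
w_msg, w_upd, w_coord = (rng.normal(size=(2 * F + 1, M)) * 0.1,
                         rng.normal(size=(F + M, F)) * 0.1,
                         rng.normal(size=(M, 1)) * 0.1)

# Equivariance check: rotating/translating inputs rotates/translates outputs.
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R = q if np.linalg.det(q) > 0 else -q                     # proper rotation
t = rng.normal(size=3)
h1, x1 = egnn_layer(h, x, w_msg, w_upd, w_coord)
h2, x2 = egnn_layer(h, x @ R.T + t, w_msg, w_upd, w_coord)
print(np.allclose(h1, h2), np.allclose(x1 @ R.T + t, x2))  # True True

The final check prints True True: the invariant features are unchanged, while the updated coordinates transform with exactly the rotation and translation applied to the input.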
6

Zhang, Jianfei, Ai-Te Kuo, Jianan Zhao, et al. "Rx-refill Graph Neural Network to Reduce Drug Overprescribing Risks (Extended Abstract)." In Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-22). International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/755.

Abstract:
Prescription (aka Rx) drugs can be easily overprescribed and lead to drug abuse or opioid overdose. Accordingly, a state-run prescription drug monitoring program (PDMP) in the United States has been developed to reduce overprescribing. However, PDMP has limited capability in detecting patients' potential overprescribing behaviors, impairing its effectiveness in preventing drug abuse and overdose in patients. In this paper, we propose a novel model RxNet, which builds 1) a dynamic heterogeneous graph to model Rx refills that are essentially prescribing and dispensing (P&D) relationships …
7

Bitencourt, Jaqueline, and Anderson Tavares. "A Comparative Study of Graph Neural Network Models for Drug-Target Interaction Prediction." In Simpósio Brasileiro de Computação Aplicada à Saúde. Sociedade Brasileira de Computação - SBC, 2025. https://doi.org/10.5753/sbcas.2025.7729.

Abstract:
Accurately predicting drug-target interactions (DTI) is crucial for computational drug discovery, yet there’s a research gap in evaluating existing graph neural network (GNN) models rather than developing novel architectures. This study provides a comparative analysis of three state-of-the-art GNN architectures – GraphSAGE, Graph Attention Network (GAT), and Graph Isomorphism Network (GIN) – for predicting interactions between chemical compounds and five protein targets. Using a dataset of 73,938 samples representing interactions between compounds and five protein targets derived from PubChem, …
8

Molokwu, Bonaventure. "Event Prediction in Complex Social Graphs using One-Dimensional Convolutional Neural Network." In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/914.

Abstract:
Social network graphs possess apparent and latent knowledge about their respective actors and links which may be exploited, using effective and efficient techniques, for predicting events within the social graphs. Understanding the intrinsic relationship patterns among spatial social actors and their respective properties are crucial factors to be taken into consideration in event prediction within social networks. My research work proposes a unique approach for predicting events in social networks by learning the context of each actor/vertex using neighboring actors in a given social graph …
9

Wang, Zihan, Zhaochun Ren, Chunyu He, Peng Zhang, and Yue Hu. "Robust Embedding with Multi-Level Structures for Link Prediction." In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/728.

Abstract:
Knowledge Graph (KG) embedding has become crucial for the task of link prediction. Recent work applies encoder-decoder models to tackle this problem, where an encoder is formulated as a graph neural network (GNN) and a decoder is represented by an embedding method. These approaches enforce embedding techniques with structure information. Unfortunately, existing GNN-based frameworks still confront 3 severe problems: low representational power, stacking in a flat way, and poor robustness to noise. In this work, we propose a novel multi-level graph neural network (M-GNN) to address the above challenges …
10

Mehrabian, Abbas, Ankit Anand, Hyunjik Kim, et al. "Finding Increasingly Large Extremal Graphs with AlphaZero and Tabu Search." In Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI-24). International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/772.

Abstract:
This work proposes a new learning-to-search benchmark and uses AI to discover new mathematical knowledge related to an open conjecture of Erdős (1975) in extremal graph theory. The problem is to find graphs with a given size (number of nodes) that maximize the number of edges without having 3- or 4-cycles. We formulate this as a sequential decision-making problem and compare AlphaZero, a neural network-guided tree search, with tabu search, a heuristic local search method. Using either method, by introducing a curriculum (jump-starting the search for larger graphs using good graphs found at smaller sizes) …
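To make the search problem in the abstract above concrete, the Python sketch below greedily builds a graph with no 3- or 4-cycles by adding random edges that keep the graph C3/C4-free. It is a simple baseline for intuition only, far weaker than the AlphaZero and tabu search methods compared in the paper, and all names are illustrative.

import itertools
import random

def creates_short_cycle(adj, u, v):
    """Would adding edge (u, v) create a 3- or 4-cycle?"""
    if adj[u] & adj[v]:                      # common neighbor -> triangle
        return True
    for w in adj[u]:                         # path u-w-x-v of length 3 -> 4-cycle
        if adj[w] & adj[v]:
            return True
    return False

def greedy_c3c4_free(n, seed=0):
    """Greedily add random edges on n nodes while avoiding 3- and 4-cycles."""
    rng = random.Random(seed)
    adj = [set() for _ in range(n)]
    edges = list(itertools.combinations(range(n), 2))
    rng.shuffle(edges)
    count = 0
    for u, v in edges:
        if not creates_short_cycle(adj, u, v):
            adj[u].add(v)
            adj[v].add(u)
            count += 1
    return count

print(greedy_c3c4_free(20))   # edge count of a greedily built C3/C4-free graph on 20 nodes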