
Journal articles on the topic 'SE(3) equivariant graph neural network'


Consult the top 50 journal articles for your research on the topic 'SE(3) equivariant graph neural network.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of each academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Bånkestad, Maria, Kevin M. Dorst, Göran Widmalm, and Jerk Rönnols. "Carbohydrate NMR chemical shift prediction by GeqShift employing E(3) equivariant graph neural networks." RSC Advances 14, no. 36 (2024): 26585–95. http://dx.doi.org/10.1039/d4ra03428g.

Abstract:
Visual abstract of GeqShift, an E(3) equivariant graph neural network for predicting carbohydrate NMR shifts. The model excels in stereochemical invariance, offering superior molecular geometry understanding over traditional methods.
2

Roche, Rahmatullah, Bernard Moussad, Md Hossain Shuvo, and Debswapna Bhattacharya. "E(3) equivariant graph neural networks for robust and accurate protein-protein interaction site prediction." PLOS Computational Biology 19, no. 8 (2023): e1011435. http://dx.doi.org/10.1371/journal.pcbi.1011435.

Abstract:
Artificial intelligence-powered protein structure prediction methods have led to a paradigm-shift in computational structural biology, yet contemporary approaches for predicting the interfacial residues (i.e., sites) of protein-protein interaction (PPI) still rely on experimental structures. Recent studies have demonstrated benefits of employing graph convolution for PPI site prediction, but ignore symmetries naturally occurring in 3-dimensional space and act only on experimental coordinates. Here we present EquiPPIS, an E(3) equivariant graph neural network approach for PPI site prediction. E
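To illustrate the kind of E(3)-equivariant message passing that such site-prediction models build on, the sketch below shows a minimal EGNN-style layer in the spirit of Satorras et al.: node features are updated from rotation- and translation-invariant quantities, while coordinates are updated along difference vectors. This is a generic, hypothetical sketch, not the EquiPPIS architecture; all class and variable names are illustrative.

```python
import torch
import torch.nn as nn

class EGNNLayer(nn.Module):
    """Minimal E(3)-equivariant message-passing layer (EGNN-style sketch).

    Node features h are invariant; coordinates x are updated equivariantly,
    because messages depend only on invariants (h_i, h_j, ||x_i - x_j||^2)
    and coordinate updates are taken along the difference vectors x_i - x_j.
    """

    def __init__(self, dim):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * dim + 1, dim), nn.SiLU(), nn.Linear(dim, dim))
        self.coord_mlp = nn.Sequential(nn.Linear(dim, dim), nn.SiLU(), nn.Linear(dim, 1))
        self.node_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.SiLU(), nn.Linear(dim, dim))

    def forward(self, h, x, edge_index):
        src, dst = edge_index                      # edges j -> i
        diff = x[dst] - x[src]                     # (E, 3) difference vectors
        dist2 = (diff ** 2).sum(-1, keepdim=True)  # invariant squared distance
        m = self.edge_mlp(torch.cat([h[dst], h[src], dist2], dim=-1))
        # Equivariant coordinate update: scalar weight times difference vector.
        x = x.index_add(0, dst, diff * self.coord_mlp(m))
        # Invariant feature update: aggregate messages per destination node.
        agg = torch.zeros_like(h).index_add(0, dst, m)
        h = h + self.node_mlp(torch.cat([h, agg], dim=-1))
        return h, x

# Tiny usage example on a random 4-node ring graph.
h = torch.randn(4, 16)
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
h_out, x_out = EGNNLayer(16)(h, x, edge_index)
```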
3

Han, Rong, Wenbing Huang, Lingxiao Luo, et al. "HeMeNet: Heterogeneous Multichannel Equivariant Network for Protein Multi-task Learning." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 1 (2025): 237–45. https://doi.org/10.1609/aaai.v39i1.32000.

Abstract:
Understanding and leveraging the 3D structures of proteins is central to a variety of biological and drug discovery tasks. While deep learning has been applied successfully for structure-based protein function prediction tasks, current methods usually employ distinct training for each task. However, each of the tasks is of small size, and such a single-task strategy hinders the models' performance and generalization ability. As some labeled 3D protein datasets are biologically related, combining multi-source datasets for larger-scale multi-task learning is one way to overcome this problem. In
4

Liu, Jie, Michael J. Roy, Luke Isbel, and Fuyi Li. "Accurate PROTAC-targeted degradation prediction with DegradeMaster." Bioinformatics 41, Supplement_1 (2025): i342–i351. https://doi.org/10.1093/bioinformatics/btaf191.

Abstract:
Motivation: Proteolysis-targeting chimeras (PROTACs) are heterobifunctional molecules that can degrade “undruggable” proteins of interest by recruiting E3 ligases and hijacking the ubiquitin-proteasome system. Some efforts have been made to develop deep learning-based approaches to predict the degradation ability of a given PROTAC. However, existing deep learning methods either simplify proteins and PROTACs as 2D graphs by disregarding crucial 3D spatial information or exclusively rely on limited labels for supervised learning without considering the abundant information from unlabeled
5

Zeng, Wenwu, Liangrui Pan, Boya Ji, Liwen Xu, and Shaoliang Peng. "Accurate Nucleic Acid-Binding Residue Identification Based Domain-Adaptive Protein Language Model and Explainable Geometric Deep Learning." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 1 (2025): 1004–12. https://doi.org/10.1609/aaai.v39i1.32086.

Abstract:
Protein-nucleic acid interactions play a fundamental and critical role in a wide range of life activities. Accurate identification of nucleic acid-binding residues helps to understand the intrinsic mechanisms of the interactions. However, the accuracy and interpretability of existing computational methods for recognizing nucleic acid-binding residues need to be further improved. Here, we propose a novel method called GeSite based on the domain-adaptive protein language model and E(3)-equivariant graph neural network. Prediction results across multiple benchmark test sets demonstrate that GeSite i
6

Wang, Hanchen, Defu Lian, Ying Zhang, et al. "Binarized graph neural network." World Wide Web 24, no. 3 (2021): 825–48. http://dx.doi.org/10.1007/s11280-021-00878-3.

7

Chen, Zhiqiang, Yang Chen, Xiaolong Zou, and Shan Yu. "Continuous Rotation Group Equivariant Network Inspired by Neural Population Coding." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 10 (2024): 11462–70. http://dx.doi.org/10.1609/aaai.v38i10.29027.

Abstract:
Neural population coding can represent continuous information by neurons with a series of discrete preferred stimuli, and we find that the bell-shaped tuning curve plays an important role in this mechanism. Inspired by this, we incorporate a bell-shaped tuning curve into the discrete group convolution to achieve continuous group equivariance. Simply, we modulate group convolution kernels by Gauss functions to obtain bell-shaped tuning curves. Benefiting from the modulation, kernels also gain smooth gradients on geometric dimensions (e.g., location dimension and orientation dimension). It allow
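A toy sketch of the idea described here, assuming a single cyclic orientation axis with n discrete angles: a discrete rotation-group convolution whose kernel is modulated by a Gaussian (bell-shaped) tuning curve over the orientation offset. All function names are hypothetical illustrations, not the authors' code.

```python
import numpy as np

def gaussian_orientation_weights(n_orientations, sigma=1.0):
    """Bell-shaped tuning curve over a cyclic orientation axis.

    Returns an (n, n) matrix W where W[k, j] weights the kernel value at
    orientation j when producing the response tuned to orientation k; the
    wrap-around distance makes the weighting periodic in angle.
    """
    idx = np.arange(n_orientations)
    d = np.abs(idx[None, :] - idx[:, None])
    d = np.minimum(d, n_orientations - d)          # cyclic distance
    return np.exp(-0.5 * (d / sigma) ** 2)

def modulated_group_response(signal, kernel, sigma=1.0):
    """Discrete rotation-group convolution over the orientation axis only.

    signal : (n,) responses of an oriented filter bank
    kernel : (n,) one learnable weight per discrete orientation offset
    Output channel k sums the kernel rolled to orientation k (the group
    action) and softly modulated by the Gaussian tuning curve centred at k.
    """
    n = len(signal)
    W = gaussian_orientation_weights(n, sigma)
    out = np.empty(n)
    for k in range(n):
        rolled = np.roll(kernel, k)                # group action: shift kernel
        out[k] = np.sum(W[k] * rolled * signal)    # bell-shaped modulation
    return out

# Example with 8 discrete orientations.
resp = modulated_group_response(np.random.rand(8), np.random.rand(8), sigma=1.2)
```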
8

Zeyu, Wang, Zhu Yue, Li Zichao, Wang Zhuoyue, Qin Hao, and Liu Xinqi. "Graph Neural Network Recommendation System for Football Formation." Applied Science and Biotechnology Journal for Advanced Research 3, no. 3 (2024): 33–39. https://doi.org/10.5281/zenodo.12198843.

Abstract:
Usually, the flow of a football game has different phases that change from one to another, and the coach must observe them, understand them, and solve the tasks in the game by using appropriate structural strategies. It is therefore a critical issue for a coach to decide which structural strategies have been effective for their own team. We therefore propose three different views to help the coach make decisions. For the first view, we formulate the passing-ball path as a network (a passing network). More specifically, we utilize the clustering coefficient to determine the relations between playe
9

Zhou, Yuchen, Hongtao Huo, Zhiwen Hou, and Fanliang Bu. "A deep graph convolutional neural network architecture for graph classification." PLOS ONE 18, no. 3 (2023): e0279604. http://dx.doi.org/10.1371/journal.pone.0279604.

Abstract:
Graph Convolutional Networks (GCNs) are powerful deep learning methods for non-Euclidean structure data and achieve impressive performance in many fields. But most of the state-of-the-art GCN models are shallow structures with depths of no more than 3 to 4 layers, which greatly limits the ability of GCN models to extract high-level features of nodes. There are two main reasons for this: 1) Overlaying too many graph convolution layers will lead to the problem of over-smoothing. 2) Graph convolution is a kind of localized filter, which is easily affected by local properties. To solve the above p
10

Kang, Shuang, Lin Shi, and Zhenyou Zhang. "Knowledge Graph Double Interaction Graph Neural Network for Recommendation Algorithm." Applied Sciences 12, no. 24 (2022): 12701. http://dx.doi.org/10.3390/app122412701.

Abstract:
To solve the problem that recommendation algorithms based on knowledge graph ignore the information of the entity itself and the user information during information aggregating, we propose a double interaction graph neural network recommendation algorithm based on knowledge graph. First, items in the dataset are selected as user-related items and then they are integrated into user features, which are enriched. Then, according to different user relationship weights and the influence weights of neighbor entities on the central entity, the graph neural network is used to integrate the features of
11

Diaz, Ivan, Mario Geiger, and Richard Iain McKinley. "Leveraging SO(3)-steerable convolutions for pose-robust semantic segmentation in 3D medical data." Machine Learning for Biomedical Imaging 2, May 2024 (2024): 834–55. http://dx.doi.org/10.59275/j.melba.2024-7189.

Abstract:
Convolutional neural networks (CNNs) allow for parameter sharing and translational equivariance by using convolutional kernels in their linear layers. By restricting these kernels to be SO(3)-steerable, CNNs can further improve parameter sharing and equivariance. These equivariant convolutional layers have several advantages over standard convolutional layers, including increased robustness to unseen poses, smaller network size, and improved sample efficiency. Despite this, most segmentation networks used in medical image analysis continue to rely on standard convolutional kernels. In this pap
12

Gu, Yaowen, Jiao Li, Hongyu Kang, Bowen Zhang, and Si Zheng. "Employing Molecular Conformations for Ligand-Based Virtual Screening with Equivariant Graph Neural Network and Deep Multiple Instance Learning." Molecules 28, no. 16 (2023): 5982. http://dx.doi.org/10.3390/molecules28165982.

Abstract:
Ligand-based virtual screening (LBVS) is a promising approach for rapid and low-cost screening of potentially bioactive molecules in the early stage of drug discovery. Compared with traditional similarity-based machine learning methods, deep learning frameworks for LBVS can more effectively extract high-order molecule structure representations from molecular fingerprints or structures. However, the 3D conformation of a molecule largely influences its bioactivity and physical properties, and has rarely been considered in previous deep learning-based LBVS methods. Moreover, the relative bioactiv
13

Mi, Jia, Chang Li, Han Wang, et al. "USPDB: A novel U-shaped equivariant graph neural network with subgraph sampling for protein-DNA binding site prediction." Expert Systems with Applications 291 (October 2025): 128554. https://doi.org/10.1016/j.eswa.2025.128554.

14

Fastiuk, Y., and N. Huzynets. "OPTIMIZATION OF THE ALGORITHM FLOW GRAPH WIDTH IN NEURAL NETWORKS TO REDUCE THE USE OF PROCESSOR ELEMENTS ON SINGLE-BOARD COMPUTERS." Computer systems and network 6, no. 2 (2024): 228–38. https://doi.org/10.23939/csn2024.02.228.

Abstract:
The article presents a method for optimizing the algorithm flow graph of a deep neural network to reduce the number of processor elements (PE) required for executing the algorithm on single-board computers. The proposed approach is based on the use of a structural matrix to optimize the neural network architecture without loss of performance. The research demonstrated that by reducing the width of the graph, the number of processor elements was reduced from 3 to 2, while maintaining network performance at 75% efficiency. This approach is significant as it expands the potential applications of
15

Fastiuk, Y., and N. Huzynets. "OPTIMIZATION OF THE ALGORITHM FLOW GRAPH WIDTH IN NEURAL NETWORKS TO REDUCE THE USE OF PROCESSOR ELEMENTS ON SINGLE-BOARD COMPUTERS." Computer systems and network 6, no. 2 (2024): 232–41. https://doi.org/10.23939/csn2024.02.232.

Abstract:
The article presents a method for optimizing the algorithm flow graph of a deep neural network to reduce the number of processor elements (PE) required for executing the algorithm on single-board computers. The proposed approach is based on the use of a structural matrix to optimize the neural network architecture without loss of performance. The research demonstrated that by reducing the width of the graph, the number of processor elements was reduced from 3 to 2, while maintaining network performance at 75% efficiency. This approach is significant as it expands the potential applications of
16

Shumovskaia, Valentina, Kirill Fedyanin, Ivan Sukharev, Dmitry Berestnev, and Maxim Panov. "Linking bank clients using graph neural networks powered by rich transactional data." International Journal of Data Science and Analytics 12, no. 2 (2021): 135–45. http://dx.doi.org/10.1007/s41060-021-00247-3.

Abstract:
Financial institutions obtain enormous amounts of data about client transactions and money transfers, which can be considered as a large graph dynamically changing in time. In this work, we focus on the task of predicting new interactions in the network of bank clients and treat it as a link prediction problem. We propose a new graph neural network model, which uses not only the topological structure of the network but rich time-series data available for the graph nodes and edges. We evaluate the developed method using the data provided by a large European bank for several years. The p
17

Pei, Hongbin, Taile Chen, Chen A, et al. "HAGO-Net: Hierarchical Geometric Message Passing for Molecular Representation Learning." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 13 (2024): 14572–80. http://dx.doi.org/10.1609/aaai.v38i13.29373.

Abstract:
Molecular representation learning has emerged as a game-changer at the intersection of AI and chemistry, with great potential in applications such as drug design and materials discovery. A substantial obstacle in successfully applying molecular representation learning is the difficulty of effectively and completely characterizing and learning molecular geometry, which has not been well addressed to date. To overcome this challenge, we propose a novel framework that features a novel geometric graph, termed HAGO-Graph, and a specifically designed geometric graph learning model, HAGO-Net. In the
18

You, Jiaxuan, Jonathan M. Gomes-Selman, Rex Ying, and Jure Leskovec. "Identity-aware Graph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (2021): 10737–45. http://dx.doi.org/10.1609/aaai.v35i12.17283.

Abstract:
Message passing Graph Neural Networks (GNNs) provide a powerful modeling framework for relational data. However, the expressive power of existing GNNs is upper-bounded by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test, which means GNNs are not able to predict node clustering coefficients and shortest path distances, and cannot differentiate between different d-regular graphs. Here we develop a class of message passing GNNs, named Identity-aware Graph Neural Networks (ID-GNNs), with greater expressive power than the 1-WL test. ID-GNN offers a minimal but powerful solution to limitat
19

Song, Jaeyoung, and Kiyun Yu. "Framework for Indoor Elements Classification via Inductive Learning on Floor Plan Graphs." ISPRS International Journal of Geo-Information 10, no. 2 (2021): 97. http://dx.doi.org/10.3390/ijgi10020097.

Abstract:
This paper presents a new framework to classify floor plan elements and represent them in a vector format. Unlike existing approaches using image-based learning frameworks as the first step to segment the image pixels, we first convert the input floor plan image into vector data and utilize a graph neural network. Our framework consists of three steps. (1) image pre-processing and vectorization of the floor plan image; (2) region adjacency graph conversion; and (3) the graph neural network on converted floor plan graphs. Our approach is able to capture different types of indoor elements includ
20

Suárez-Varela, José, Miquel Ferriol-Galmés, Albert López, et al. "The graph neural networking challenge." ACM SIGCOMM Computer Communication Review 51, no. 3 (2021): 9–16. http://dx.doi.org/10.1145/3477482.3477485.

Abstract:
During the last decade, Machine Learning (ML) has increasingly become a hot topic in the field of Computer Networks and is expected to be gradually adopted for a plethora of control, monitoring and management tasks in real-world deployments. This poses the need to count on new generations of students, researchers and practitioners with a solid background in ML applied to networks. During 2020, the International Telecommunication Union (ITU) has organized the "ITU AI/ML in 5G challenge", an open global competition that has introduced to a broad audience some of the current main challenges in ML
21

Yang, Shuai, Yueqin Zhang, and Zehua Zhang. "Runoff Prediction Based on Dynamic Spatiotemporal Graph Neural Network." Water 15, no. 13 (2023): 2463. http://dx.doi.org/10.3390/w15132463.

Abstract:
Runoff prediction plays an important role in the construction of intelligent hydraulic engineering. Most of the existing deep learning runoff prediction models use recurrent neural networks for single-step prediction of a single time series, which mainly model the temporal features and ignore the river convergence process within a watershed. In order to improve the accuracy of runoff prediction, a dynamic spatiotemporal graph neural network model (DSTGNN) is proposed considering the interaction of hydrological stations. The sequences are first input to the spatiotemporal block to extract spati
22

Ma, Chen, Liheng Ma, Yingxue Zhang, Jianing Sun, Xue Liu, and Mark Coates. "Memory Augmented Graph Neural Networks for Sequential Recommendation." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 5045–52. http://dx.doi.org/10.1609/aaai.v34i04.5945.

Abstract:
The chronological order of user-item interactions can reveal time-evolving and sequential user behaviors in many recommender systems. The items that users will interact with may depend on the items accessed in the past. However, the substantial increase of users and items makes sequential recommender systems still face non-trivial challenges: (1) the hardness of modeling the short-term user interests; (2) the difficulty of capturing the long-term user interests; (3) the effective modeling of item co-occurrence patterns. To tackle these challenges, we propose a memory augmented graph neural net
23

Liu, Tianrui, Qi Cai, Changxin Xu, et al. "Rumor Detection with A Novel Graph Neural Network Approach." Academic Journal of Science and Technology 10, no. 1 (2024): 305–10. http://dx.doi.org/10.54097/farmdr42.

Abstract:
The wide spread of rumors on social media has caused a negative impact on people's daily life, leading to potential panic, fear and mental health problems for the public. How to debunk rumors as early as possible remains a challenging problem. Existing studies mainly leverage information propagation structure to detect rumors, while very few works focus on correlation among users who may coordinate to spread rumors in order to gain a large popularity. In this paper, we propose a new detection model, that jointly learns both the representations of user correlation and informatio
24

Li, Zimu, Zihan Pengmei, Han Zheng, Erik Thiede, Junyu Liu, and Risi Kondor. "Unifying O(3) Equivariant Neural Networks Design with Tensor-Network Formalism." Machine Learning: Science and Technology, May 10, 2024. http://dx.doi.org/10.1088/2632-2153/ad4a04.

Abstract:
Many learning tasks, including learning potential energy surfaces from ab initio calculations, involve global spatial symmetries and permutational symmetry between atoms or general particles. Equivariant graph neural networks are a standard approach to such problems, with one of the most successful methods employing tensor products between various tensors that transform under the spatial group. However, as the number of different tensors and the complexity of relationships between them increase, maintaining parsimony and equivariance becomes increasingly challenging. In this paper, we
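For readers unfamiliar with the equivariant tensor products mentioned here, the following minimal sketch uses the e3nn library as one representative implementation (an illustrative assumption about tooling; it is not the tensor-network formalism proposed in this paper): two O(3)-typed features are combined through Clebsch-Gordan-coupled products into an output that transforms with the same irreducible representations.

```python
from e3nn import o3

# Feature type: one scalar (l=0, even parity) and one vector (l=1, odd parity).
irreps = o3.Irreps("1x0e + 1x1o")

# Fully connected tensor product: every pairing of input irreps that can
# produce an allowed output irrep gets its own learnable weight, and the
# Clebsch-Gordan coefficients guarantee O(3) equivariance of the result.
tp = o3.FullyConnectedTensorProduct(irreps, irreps, irreps)

x = irreps.randn(10, -1)   # 10 nodes, feature dimension = 1 + 3 = 4
y = irreps.randn(10, -1)
out = tp(x, y)             # shape (10, 4), transforms as "1x0e + 1x1o"
```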
25

Zhong, Yang, Hongyu Yu, Mao Su, Xingao Gong, and Hongjun Xiang. "Transferable equivariant graph neural networks for the Hamiltonians of molecules and solids." npj Computational Materials 9, no. 1 (2023). http://dx.doi.org/10.1038/s41524-023-01130-4.

Abstract:
This work presents an E(3) equivariant graph neural network called HamGNN, which can fit the electronic Hamiltonian matrix of molecules and solids by a complete data-driven method. Unlike invariant models that achieve equivariance approximately through data augmentation, HamGNN employs E(3) equivariant convolutions to construct the Hamiltonian matrix, ensuring strict adherence to all equivariant constraints inherent in the physical system. In contrast to previous models with limited transferability, HamGNN demonstrates exceptional accuracy on various datasets, including QM9 molecular d
26

Batzner, Simon, Albert Musaelian, Lixin Sun, et al. "E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials." Nature Communications 13, no. 1 (2022). http://dx.doi.org/10.1038/s41467-022-29939-5.

Abstract:
This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and only act on scalars, NequIP employs E(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-the-art accuracy on a challenging and diverse set of molecules and materials
27

Pezzicoli, Francesco Saverio, Guillaume Charpiat, and François Pascal Landes. "Rotation-equivariant graph neural networks for learning glassy liquids representations." SciPost Physics 16, no. 5 (2024). http://dx.doi.org/10.21468/scipostphys.16.5.136.

Abstract:
The difficult problem of relating the static structure of glassy liquids and their dynamics is a good target for Machine Learning, an approach which excels at finding complex patterns hidden in data. Indeed, this approach is currently a hot topic in the glassy liquids community, where the state of the art consists in Graph Neural Networks (GNNs), which have great expressive power but are heavy models and lack interpretability. Inspired by recent advances in the field of Machine Learning group-equivariant representations, we build a GNN that learns a robust representation of the glass’ static s
28

Yin, Shi, Xinyang Pan, Xudong Zhu, et al. "Towards Harmonization of SO(3)-Equivariance and Expressiveness: a Hybrid Deep Learning Framework for Electronic-Structure Hamiltonian Prediction." Machine Learning: Science and Technology, October 30, 2024. http://dx.doi.org/10.1088/2632-2153/ad8d30.

Abstract:
Deep learning for predicting the electronic-structure Hamiltonian of quantum systems necessitates satisfying the covariance laws, among which achieving SO(3)-equivariance without sacrificing the non-linear expressive capability of networks remains unsolved. To navigate the harmonization between SO(3)-equivariance and expressiveness, we propose HarmoSE, a deep learning method synergizing two distinct categories of neural mechanisms as a two-stage encoding and regression framework. The first stage corresponds to group theory-based neural mechanisms with inherent SO(3)-equivariant proper
29

Wu, Tianqi. "Atomic protein structure refinement using all-atom graph representations and SE(3)-equivariant graph neural networks." July 14, 2022. https://doi.org/10.5281/zenodo.6944580.

Abstract:
Three-dimensional (3D) protein structures reveal the fundamental information about protein function. State-of-the-art protein structure prediction methods such as AlphaFold are being widely used to predict structures of uncharacterized proteins in biomedical research. There is a significant need to further improve the quality and nativeness of the predicted structures to enhance their usability. Current machine learning methods of refining protein structures focus mostly on improving the backbone quality of predicted structures without effectively leveraging and enhancing the conformation of
30

Frey, Nathan C., Ryan Soklaski, Simon Axelrod, et al. "Neural scaling of deep chemical models." Nature Machine Intelligence, October 23, 2023. http://dx.doi.org/10.1038/s42256-023-00740-3.

Abstract:
Massive scale, in terms of both data availability and computation, enables important breakthroughs in key application areas of deep learning such as natural language processing and computer vision. There is emerging evidence that scale may be a key ingredient in scientific deep learning, but the importance of physical priors in scientific domains makes the strategies and benefits of scaling uncertain. Here we investigate neural-scaling behaviour in large chemical models by varying model and dataset sizes over many orders of magnitude, studying models with over one billion parameters, p
31

Hao, Zichun, Raghav Kansal, Javier Duarte, and Nadezda Chernyavskaya. "Lorentz group equivariant autoencoders." European Physical Journal C 83, no. 6 (2023). http://dx.doi.org/10.1140/epjc/s10052-023-11633-5.

Abstract:
There has been significant work recently in developing machine learning (ML) models in high energy physics (HEP) for tasks such as classification, simulation, and anomaly detection. Often these models are adapted from those designed for datasets in computer vision or natural language processing, which lack inductive biases suited to HEP data, such as equivariance to its inherent symmetries. Such biases have been shown to make models more performant and interpretable, and reduce the amount of training data needed. To that end, we develop the Lorentz group autoencoder (LGAE), an autoenco
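As a reminder of the symmetry such models respect, the small check below verifies numerically that the Minkowski inner product of two four-momenta is unchanged by a Lorentz boost; this invariance is what a Lorentz-group-equivariant architecture is constructed to preserve. The snippet is purely illustrative and unrelated to the LGAE code itself.

```python
import numpy as np

def boost_z(beta):
    """Lorentz boost along z acting on four-vectors (E, px, py, pz)."""
    g = 1.0 / np.sqrt(1.0 - beta ** 2)
    return np.array([[g, 0, 0, -g * beta],
                     [0, 1, 0, 0],
                     [0, 0, 1, 0],
                     [-g * beta, 0, 0, g]])

def minkowski(p, q):
    """Inner product with metric diag(+1, -1, -1, -1)."""
    eta = np.diag([1.0, -1.0, -1.0, -1.0])
    return p @ eta @ q

p = np.array([5.0, 1.0, 2.0, 3.0])   # toy constituent four-momentum
q = np.array([4.0, 0.5, 1.0, 2.0])
L = boost_z(0.6)
# The invariant is preserved under the boost (up to float rounding).
assert np.isclose(minkowski(p, q), minkowski(L @ p, L @ q))
```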
32

Roche, Rahmatullah, Bernard Moussad, Md Hossain Shuvo, Sumit Tarafder, and Debswapna Bhattacharya. "EquiPNAS: improved protein–nucleic acid binding site prediction using protein-language-model-informed equivariant deep graph neural networks." Nucleic Acids Research, January 28, 2024. http://dx.doi.org/10.1093/nar/gkae039.

Abstract:
Protein language models (pLMs) trained on a large corpus of protein sequences have shown unprecedented scalability and broad generalizability in a wide range of predictive modeling tasks, but their power has not yet been harnessed for predicting protein–nucleic acid binding sites, critical for characterizing the interactions between proteins and nucleic acids. Here, we present EquiPNAS, a new pLM-informed E(3) equivariant deep graph neural network framework for improved protein–nucleic acid binding site prediction. By combining the strengths of pLM and symmetry-aware deep graph learni
33

Pakornchote, Teerachote, Annop Ektarawong, and Thiparat Chotibut. "StrainTensorNet: Predicting crystal structure elastic properties using SE(3)-equivariant graph neural networks." Physical Review Research 5, no. 4 (2023). http://dx.doi.org/10.1103/physrevresearch.5.043198.

34

Koker, Teddy, Keegan Quigley, Eric Taw, Kevin Tibbetts, and Lin Li. "Higher-order equivariant neural networks for charge density prediction in materials." npj Computational Materials 10, no. 1 (2024). http://dx.doi.org/10.1038/s41524-024-01343-1.

Abstract:
The calculation of electron density distribution using density functional theory (DFT) in materials and molecules is central to the study of their quantum and macro-scale properties, yet accurate and efficient calculation remains a long-standing challenge. We introduce ChargE3Net, an E(3)-equivariant graph neural network for predicting electron density in atomic systems. ChargE3Net enables the learning of higher-order equivariant features to achieve high predictive accuracy and model expressivity. We show that ChargE3Net exceeds the performance of prior work on diverse sets of molecule
35

Sheriff, Killian, Yifan Cao, and Rodrigo Freitas. "Chemical-motif characterization of short-range order with E(3)-equivariant graph neural networks." npj Computational Materials 10, no. 1 (2024). http://dx.doi.org/10.1038/s41524-024-01393-5.

36

Wang, Yanli, and Jianlin Cheng. "Reconstructing 3D chromosome structures from single-cell Hi-C data with SO(3)-equivariant graph neural networks." NAR Genomics and Bioinformatics 7, no. 1 (2025). https://doi.org/10.1093/nargab/lqaf027.

Abstract:
The spatial conformation of chromosomes and genomes of single cells is relevant to cellular function and useful for elucidating the mechanism underlying gene expression and genome methylation. The chromosomal contacts (i.e. chromosomal regions in spatial proximity) entailing the three-dimensional (3D) structure of the genome of a single cell can be obtained by single-cell chromosome conformation capture techniques, such as single-cell Hi-C (ScHi-C). However, due to the sparsity of chromosomal contacts in ScHi-C data, it is still challenging for traditional 3D conformation optimization
37

Hinz, Florian, Amr Mahmoud, and Markus Lill. "Prediction of Molecular Field Points using SE(3)-Transformer Model." Machine Learning: Science and Technology, July 11, 2023. http://dx.doi.org/10.1088/2632-2153/ace67b.

Abstract:
Due to their computational efficiency, 2D fingerprints are typically used in similarity-based high-content screening. The interaction of a ligand with its target protein, however, relies on its physicochemical interactions in 3D space. Thus, ligands with different 2D scaffolds can bind to the same protein if these ligands share similar interaction patterns. Molecular fields can represent those interaction profiles. For efficiency, the extrema of those molecular fields, named field points, are used to quantify the ligand similarity in 3D. The calculation of field points involves the evaluation of
38

Ma, Yuxing, Hongyu Yu, Yang Zhong, Shiyou Chen, Xingao Gong, and Hongjun Xiang. "Transferable machine learning approach for predicting electronic structures of charged defects." Applied Physics Letters 126, no. 4 (2025). https://doi.org/10.1063/5.0242683.

Abstract:
The study of electronic properties of charged defects plays a crucial role in advancing our understanding of how defects influence conductivity, magnetism, and optical behavior in various materials. However, despite its significance, research on large-scale defective systems has been hindered by the high computational cost associated with density functional theory (DFT). In this study, we propose HamGNN-Q, an E(3) equivariant graph neural network framework capable of accurately predicting DFT Hamiltonian matrices for diverse point defects with varying charges, utilizing a unified set of networ
39

Li, Fenglei, Qiaoyu Hu, Yongqi Zhou, Hao Yang, and Fang Bai. "DiffPROTACs is a deep learning-based generator for proteolysis targeting chimeras." Briefings in Bioinformatics 25, no. 5 (2024). http://dx.doi.org/10.1093/bib/bbae358.

Abstract:
PROteolysis TArgeting Chimeras (PROTACs) has recently emerged as a promising technology. However, the design of rational PROTACs, especially the linker component, remains challenging due to the absence of structure–activity relationships and experimental data. Leveraging the structural characteristics of PROTACs, fragment-based drug design (FBDD) provides a feasible approach for PROTAC research. Concurrently, artificial intelligence–generated content has attracted considerable attention, with diffusion models and Transformers emerging as indispensable tools in this field. In response,
40

Kim, Jihoo, Yoonho Jeong, Won June Kim, Eok Kyun Lee, and Insung S. Choi. "MolNet_Equi: A Chemically Intuitive, Rotation‐Equivariant Graph Neural Network." Chemistry – An Asian Journal, November 12, 2023. http://dx.doi.org/10.1002/asia.202300684.

Abstract:
Although deep‐learning (DL) models suggest unprecedented prediction capabilities in tackling various chemical problems, their demonstrated tasks have so far been limited to the scalar properties including the magnitude of vectorial properties, such as molecular dipole moments. A rotation‐equivariant MolNet_Equi model, proposed in this paper, understands and recognizes the molecular rotation in the 3D Euclidean space, and exhibits the ability to predict directional dipole moments in the rotation‐sensitive mode, as well as showing superior performance for the prediction of scalar properties. Thr
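To make the notion of a rotation-equivariant vector output concrete, consider the toy example below (a hypothetical illustration, not the MolNet_Equi model): a dipole-like vector computed from point charges rotates exactly with the molecular coordinates, so predicting and then rotating gives the same result as rotating and then predicting.

```python
import numpy as np

def toy_dipole(charges, coords):
    """Dipole-like vector from point charges: sum_i q_i * (r_i - centroid).

    Any model with this functional form is rotation-equivariant by
    construction: rotating the coordinates rotates the output vector.
    """
    centered = coords - coords.mean(axis=0)
    return charges @ centered           # shape (3,)

rng = np.random.default_rng(0)
q = rng.normal(size=5)                  # toy partial charges
r = rng.normal(size=(5, 3))             # toy atomic coordinates

# A random orthogonal matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

# Equivariance: predict-then-rotate equals rotate-then-predict.
assert np.allclose(Q @ toy_dipole(q, r), toy_dipole(q, r @ Q.T))
```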
41

Cremer, Julian. "Equivariant Graph Neural Networks for Toxicity Prediction." February 8, 2023. https://doi.org/10.26434/chemrxiv-2023-9kb55.

Abstract:
Predictive modeling for toxicity is a crucial step in the drug discovery pipeline. It can help filter out molecules with a high probability of failing in the early stages of de novo drug design. Thus, several machine learning (ML) models have been developed to predict the toxicity of molecules by combining classical ML techniques or deep neural networks with well-known molecular representations such as fingerprints or 2D graphs. But the more natural, accurate representation of molecules is expected to be defined in physical 3D space like in ab initio methods. Recent studies successfully used e
42

Bihani, Vaibhav, Sajid Mannan, Utkarsh Pratiush, et al. "EGraFFBench: Evaluation of Equivariant Graph Neural Network Force Fields for Atomistic Simulations." Digital Discovery, 2024. http://dx.doi.org/10.1039/d4dd00027g.

Abstract:
Equivariant graph neural network force fields (EGraFFs) have shown great promise in modelling complex interactions in atomic systems by exploiting the graphs’ inherent symmetries. Recent works have led to a...
43

Chatterjee, Suman, Sergio Sánchez Cruz, Robert Schöfbeck, and Dennis Schwarz. "Rotation-equivariant graph neural network for learning hadronic SMEFT effects." Physical Review D 109, no. 7 (2024). http://dx.doi.org/10.1103/physrevd.109.076012.

44

Jiang, Chi, Yi Zhang, Yang Liu, and Jing Peng. "Tensor improve equivariant graph neural network for molecular dynamics prediction." Computational Biology and Chemistry, March 2024, 108053. http://dx.doi.org/10.1016/j.compbiolchem.2024.108053.

45

Jahin, Md Abrar, Md Akmol Masud, Md Wahiduzzaman Suva, M. F. Mridha, and Nilanjan Dey. "Lorentz-Equivariant Quantum Graph Neural Network for High-Energy Physics." IEEE Transactions on Artificial Intelligence, 2025, 1–11. https://doi.org/10.1109/tai.2025.3554461.

46

Wen, Mingjian, Matthew K. Horton, Jason M. Munro, Patrick Huck, and Kristin A. Persson. "An equivariant graph neural network for the elasticity tensors of all seven crystal systems." Digital Discovery, 2024. http://dx.doi.org/10.1039/d3dd00233k.

Abstract:
An equivariant graph neural network model enables the rapid and accurate prediction of complete fourth-rank elasticity tensors of inorganic materials, facilitating the discovery of materials with exceptional mechanical properties.
47

Shen, Guanghao, Ziqi Zhang, Zhaohong Deng, et al. "ASCE-PPIS: A Protein-Protein Interaction Sites Predictor Based on Equivariant Graph Neural Network with Fusion of Structure-Aware Pooling and Graph Collapse." Bioinformatics, July 24, 2025. https://doi.org/10.1093/bioinformatics/btaf423.

Abstract:
Motivation: Identifying protein-protein interaction sites constitutes a crucial step in understanding disease mechanisms and drug development. As experimental methods for PPIS identification are expensive and time-consuming, numerous computational screening approaches have been developed, among which graph neural network based methods have achieved remarkable progress in recent years. However, existing methods lack the utilization of interactions between amino acid molecules and fail to address the dense characteristics of protein graphs. Results: We propose ASCE-PPIS, an equivariant gr
48

Yi, Yiqiang, Xu Wan, Kangfei Zhao, Le Ou-Yang, and Peilin Zhao. "Equivariant Line Graph Neural Network for Protein-Ligand Binding Affinity Prediction." IEEE Journal of Biomedical and Health Informatics, 2024, 1–13. http://dx.doi.org/10.1109/jbhi.2024.3383245.

49

Dong, Luqi, Xuanlin Zhang, Ziduo Yang, Lei Shen, and Yunhao Lu. "Accurate piezoelectric tensor prediction with equivariant attention tensor graph neural network." npj Computational Materials 11, no. 1 (2025). https://doi.org/10.1038/s41524-025-01546-0.

50

Chen, Chen, Xiao Chen, Alex Morehead, Tianqi Wu, and Jianlin Cheng. "3D-equivariant graph neural networks for protein model quality assessment." Bioinformatics, January 13, 2023. http://dx.doi.org/10.1093/bioinformatics/btad030.

Abstract:
Motivation: Quality assessment of predicted protein tertiary structure models plays an important role in ranking and using them. With the recent development of deep learning end-to-end protein structure prediction techniques for generating highly confident tertiary structures for most proteins, it is important to explore corresponding quality assessment strategies to evaluate and select the structural models predicted by them since these models have better quality and different properties than the models predicted by traditional tertiary structure prediction methods. Results: We develop