
Journal articles on the topic 'Physics Informed Neural Networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Physics Informed Neural Networks.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Trahan, Corey, Mark Loveland, and Samuel Dent. "Quantum Physics-Informed Neural Networks." Entropy 26, no. 8 (July 30, 2024): 649. http://dx.doi.org/10.3390/e26080649.

Abstract:
In this study, the PennyLane quantum device simulator was used to investigate quantum and hybrid quantum/classical physics-informed neural networks (PINNs) for solving both transient and steady-state, 1D and 2D partial differential equations. The comparative expressibility of the purely quantum, hybrid, and classical neural networks is discussed, and hybrid configurations are explored. The results show that (1) for some applications, quantum PINNs can obtain comparable accuracy with fewer neural network parameters than classical PINNs, and (2) adding quantum nodes to classical PINNs can increase model accuracy with fewer total network parameters for noiseless models.
2

Hofmann, Tobias, Jacob Hamar, Marcel Rogge, Christoph Zoerr, Simon Erhard, and Jan Philipp Schmidt. "Physics-Informed Neural Networks for State of Health Estimation in Lithium-Ion Batteries." Journal of The Electrochemical Society 170, no. 9 (September 1, 2023): 090524. http://dx.doi.org/10.1149/1945-7111/acf0ef.

Abstract:
One of the most challenging tasks of modern battery management systems is accurate state of health estimation. While physico-chemical models are accurate, they have a high computational cost. Neural networks lack physical interpretability but are efficient. Physics-informed neural networks tackle the aforementioned shortcomings by combining the efficiency of neural networks with the accuracy of physico-chemical models. A physics-informed neural network is developed and evaluated against three different datasets: a pseudo-two-dimensional Newman model generates data at various state of health points, and this dataset is fused with experimental data from laboratory measurements and vehicle field data to train a neural network that exploits correlations from internal modeled states to the measurable state of health. The resulting physics-informed neural network performs best on the synthetic dataset, achieving a root mean squared error below 2% when estimating the state of health. The root mean squared error stays within 3% for laboratory test data, with the lowest error observed for constant-current discharge samples. The physics-informed neural network outperforms several other purely data-driven methods and proves its advantage. The inclusion of physico-chemical information from simulation increases accuracy and further enables broader application ranges.
3

Pang, Guofei, Lu Lu, and George Em Karniadakis. "fPINNs: Fractional Physics-Informed Neural Networks." SIAM Journal on Scientific Computing 41, no. 4 (January 2019): A2603–A2626. http://dx.doi.org/10.1137/18m1229845.

4

Song, Yanjie, He Wang, He Yang, Maria Luisa Taccari, and Xiaohui Chen. "Loss-attentional physics-informed neural networks." Journal of Computational Physics 501 (March 2024): 112781. http://dx.doi.org/10.1016/j.jcp.2024.112781.

5

Rojas, Sergio, Paweł Maczuga, Judit Muñoz-Matute, David Pardo, and Maciej Paszyński. "Robust Variational Physics-Informed Neural Networks." Computer Methods in Applied Mechanics and Engineering 425 (May 2024): 116904. http://dx.doi.org/10.1016/j.cma.2024.116904.

6

Henkes, Alexander, Henning Wessels, and Rolf Mahnken. "Physics informed neural networks for continuum micromechanics." Computer Methods in Applied Mechanics and Engineering 393 (April 2022): 114790. http://dx.doi.org/10.1016/j.cma.2022.114790.

7

Chen, Haotian, Enno Kätelhön, and Richard G. Compton. "Predicting Voltammetry Using Physics-Informed Neural Networks." Journal of Physical Chemistry Letters 13, no. 2 (January 10, 2022): 536–43. http://dx.doi.org/10.1021/acs.jpclett.1c04054.

8

Lee, Sang-Min. "Physics-Informed Neural Networks and its Applications." Journal of the Korea Academia-Industrial cooperation Society 23, no. 12 (December 31, 2022): 755–60. http://dx.doi.org/10.5762/kais.2022.23.12.755.

9

Son, Hwijae, Jin Woo Jang, Woo Jin Han, and Hyung Ju Hwang. "Sobolev training for physics-informed neural networks." Communications in Mathematical Sciences 21, no. 6 (2023): 1679–705. http://dx.doi.org/10.4310/cms.2023.v21.n6.a11.

10

Omar, Sara Ibrahim, Chen Keasar, Ariel J. Ben-Sasson, and Eldad Haber. "Protein Design Using Physics Informed Neural Networks." Biomolecules 13, no. 3 (March 1, 2023): 457. http://dx.doi.org/10.3390/biom13030457.

Abstract:
The inverse protein folding problem, also known as protein sequence design, seeks to predict an amino acid sequence that folds into a specific structure and performs a specific function. Recent advancements in machine learning techniques have been successful in generating functional sequences, outperforming previous energy function-based methods. However, these machine learning methods are limited in their interoperability and robustness, especially when designing proteins that must function under non-ambient conditions, such as high temperature, extreme pH, or in various ionic solvents. To address this issue, we propose a new Physics-Informed Neural Networks (PINNs)-based protein sequence design approach. Our approach combines all-atom molecular dynamics simulations, a PINNs MD surrogate model, and a relaxation of binary programming to solve the protein design task while optimizing both energy and the structural stability of proteins. We demonstrate the effectiveness of our design framework in designing proteins that can function under non-ambient conditions.
11

Coscia, Dario, Anna Ivagnes, Nicola Demo, and Gianluigi Rozza. "Physics-Informed Neural networks for Advanced modeling." Journal of Open Source Software 8, no. 87 (July 19, 2023): 5352. http://dx.doi.org/10.21105/joss.05352.

12

Yang, Jianchuan, Xuanqi Liu, Yu Diao, Xi Chen, and Haikuo Hu. "Adaptive task decomposition physics-informed neural networks." Computer Methods in Applied Mechanics and Engineering 418 (January 2024): 116561. http://dx.doi.org/10.1016/j.cma.2023.116561.

13

Hanna, John M., José V. Aguado, Sebastien Comas-Cardona, Ramzi Askri, and Domenico Borzacchiello. "Sensitivity analysis using Physics-informed neural networks." Engineering Applications of Artificial Intelligence 135 (September 2024): 108764. http://dx.doi.org/10.1016/j.engappai.2024.108764.

14

Kenzhebek, Y., T. S. Imankulov, and D. Zh Akhmed-Zaki. "PREDICTION OF OIL PRODUCTION USING PHYSICS-INFORMED NEURAL NETWORKS." BULLETIN Series of Physics & Mathematical Sciences 76, no. 4 (December 15, 2021): 45–50. http://dx.doi.org/10.51889/2021-4.1728-7901.06.

Abstract:
In recent years, modern information technologies have been actively used in various industries. The oil industry is no exception, since high-performance computing technologies, artificial intelligence algorithms, and methods of collecting, processing, and storing information are actively used to solve the problems of increasing oil recovery. Deep learning has made remarkable strides in a variety of applications, but its use for solving partial differential equations has only recently emerged. In particular, traditional numerical methods can be replaced with a neural network that approximates the solution to a partial differential equation. Physics-Informed Neural Networks (PINNs) embed partial differential equations into the neural network loss function using automatic differentiation. A numerical algorithm and a PINN have been developed for solving the one-dimensional pressure equation from the Buckley–Leverett mathematical model. The results of the numerical solution and of the PINN prediction for the pressure equation are obtained.
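The mechanism this abstract describes, embedding the PDE residual in the training loss, can be sketched in a few lines of numpy. This is a hedged toy illustration, not the authors' code: a sine ansatz u(x) = Σ a_k sin(kπx), which satisfies zero boundary conditions by construction, stands in for a neural network; analytic derivatives stand in for automatic differentiation; and the model problem is u''(x) = -π² sin(πx) rather than the Buckley–Leverett pressure equation.

```python
import numpy as np

# Collocation grid on (0, 1) and the source term of the toy problem
# u''(x) = -pi^2 sin(pi x), u(0) = u(1) = 0, whose exact solution is sin(pi x).
N, K = 64, 3
x = (np.arange(N) + 0.5) / N
f = -np.pi**2 * np.sin(np.pi * x)

# Sine ansatz u(x) = sum_k a_k sin(k pi x): the boundary conditions hold by
# construction, so the loss needs only the interior PDE residual.
k = np.arange(1, K + 1)
basis = np.sin(np.outer(k, np.pi * x))        # shape (K, N)
d2 = -(k * np.pi) ** 2                        # u'' multiplies mode k by -(k pi)^2

a = np.zeros(K)                               # trainable "network parameters"
lr = 1e-4
for _ in range(2000):
    residual = (a * d2) @ basis - f           # PDE residual at collocation points
    grad = 2.0 * d2 * (basis @ residual) / N  # gradient of the mean squared residual
    a -= lr * grad

u = a @ basis
print(np.abs(u - np.sin(np.pi * x)).max())    # near zero after training
```

Gradient descent drives the residual loss to zero and recovers a ≈ (1, 0, 0), i.e. u(x) ≈ sin(πx); a real PINN replaces the fixed basis with a network and the analytic derivatives with autodiff.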
15

Oluwasakin, Ebenezer O., and Abdul Q. M. Khaliq. "Optimizing Physics-Informed Neural Network in Dynamic System Simulation and Learning of Parameters." Algorithms 16, no. 12 (November 28, 2023): 547. http://dx.doi.org/10.3390/a16120547.

Abstract:
Artificial neural networks have changed many fields by giving scientists a strong way to model complex phenomena. They are also becoming increasingly useful for solving various difficult scientific problems. Still, researchers continue to seek faster and more accurate ways to simulate dynamic systems. This research explores the transformative capabilities of physics-informed neural networks, a specialized subset of artificial neural networks, in modeling complex dynamical systems with enhanced speed and accuracy. These networks incorporate known physical laws into the learning process, ensuring predictions remain consistent with fundamental principles, which is crucial when dealing with scientific phenomena. This study focuses on optimizing the application of this specialized network for simultaneous system dynamics simulations and learning time-varying parameters, particularly when the number of unknowns in the system matches the number of undetermined parameters. Additionally, we explore scenarios with a mismatch between parameters and equations, optimizing network architecture to enhance convergence speed, computational efficiency, and accuracy in learning the time-varying parameter. Our approach enhances the algorithm’s performance and accuracy, ensuring optimal use of computational resources and yielding more precise results. Extensive experiments are conducted on four different dynamical systems: first-order irreversible chain reactions, biomass transfer, the Brusselator model, and the Lotka-Volterra model, using synthetically generated data to validate our approach. Additionally, we apply our method to the susceptible-infected-recovered model, utilizing real-world COVID-19 data to learn the time-varying parameters of the pandemic’s spread.
A comprehensive comparison between the performance of our approach and fully connected deep neural networks is presented, evaluating both accuracy and computational efficiency in parameter identification and system dynamics capture. The results demonstrate that the physics-informed neural networks outperform fully connected deep neural networks in performance, especially with increased network depth, making them ideal for real-time complex system modeling. This underscores the physics-informed neural network’s effectiveness in scientific modeling in scenarios with balanced unknowns and parameters. Furthermore, it provides a fast, accurate, and efficient alternative for analyzing dynamic systems.
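The parameter-identification task described above can be illustrated with a deliberately tiny example. This sketch is an assumption-laden stand-in for the paper's method: the unknown is a single constant rate k in u' = -k u, synthetic data replaces a trained network's output, and central finite differences replace automatic differentiation; a real PINN would learn the network weights and k jointly from the combined data and physics losses.

```python
import numpy as np

# Synthetic observations of u(t) = exp(-2 t); the "unknown" rate is k = 2.
h = 0.01
t = np.arange(0.0, 1.0 + h / 2, h)
u = np.exp(-2.0 * t)

# Central finite differences approximate du/dt at interior points
# (a PINN would obtain this derivative by automatic differentiation).
du = (u[2:] - u[:-2]) / (2 * h)
ui = u[1:-1]

# Least-squares fit of the physics residual du/dt + k*u = 0 for k.
k_hat = -(du @ ui) / (ui @ ui)
print(k_hat)   # close to 2
```

The estimate carries only the O(h²) bias of the finite-difference stencil, so k_hat ≈ 2.0001 here; with noisy data the same least-squares residual fit becomes the physics-loss term of an inverse PINN.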
16

Rodríguez, Alexander, Jiaming Cui, Naren Ramakrishnan, Bijaya Adhikari, and B. Aditya Prakash. "EINNs: Epidemiologically-Informed Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 12 (June 26, 2023): 14453–60. http://dx.doi.org/10.1609/aaai.v37i12.26690.

Abstract:
We introduce EINNs, a framework crafted for epidemic forecasting that builds upon the theoretical grounds provided by mechanistic models as well as the data-driven expressibility afforded by AI models, and their capabilities to ingest heterogeneous information. Although neural forecasting models have been successful in multiple tasks, predictions well-correlated with epidemic trends and long-term predictions remain open challenges. Epidemiological ODE models contain mechanisms that can guide us in these two tasks; however, they have limited capability of ingesting data sources and modeling composite signals. Thus, we propose to leverage work in physics-informed neural networks to learn latent epidemic dynamics and transfer relevant knowledge to another neural network which ingests multiple data sources and has more appropriate inductive bias. In contrast with previous work, we do not assume the observability of complete dynamics and do not need to numerically solve the ODE equations during training. Our thorough experiments on all US states and HHS regions for COVID-19 and influenza forecasting showcase the clear benefits of our approach in both short-term and long-term forecasting as well as in learning the mechanistic dynamics over other non-trivial alternatives.
17

Karakonstantis, Xenofon, and Efren Fernandez-Grande. "Advancing sound field analysis with physics-informed neural networks." Journal of the Acoustical Society of America 154, no. 4_supplement (October 1, 2023): A98. http://dx.doi.org/10.1121/10.0022920.

Abstract:
This work introduces a method that employs physics-informed neural networks to reconstruct sound fields in diverse rooms, including both typical acoustically damped meeting rooms and spaces of cultural significance, such as concert halls or theatres. The neural network is trained using a limited set of room impulse responses, integrating the expressive capacity of neural networks with the fundamental physics of sound propagation governed by the wave equation. Consequently, the network accurately represents sound fields within an aperture without requiring extensive measurements, regardless of the complexity of the sound field. Notably, our approach extends beyond sound pressure estimation and includes valuable vectorial quantities, such as particle velocity and intensity, resembling classical holography methods. Experimental results confirm the efficacy of the proposed approach, underscoring its reconstruction accuracy and computational efficiency. Moreover, by enabling the acquisition of sound field quantities in the time domain, which were previously challenging to obtain from measurements, our method opens up new frontiers for the analysis and comprehension of sound propagation phenomena in rooms.
18

Antonion, Klapa, Xiao Wang, Maziar Raissi, and Laurn Joshie. "Machine Learning Through Physics–Informed Neural Networks: Progress and Challenges." Academic Journal of Science and Technology 9, no. 1 (January 20, 2024): 46–49. http://dx.doi.org/10.54097/b1d21816.

Abstract:
Physics-Informed Neural Networks (PINNs) represent a groundbreaking approach wherein neural networks (NNs) integrate model equations, such as Partial Differential Equations (PDEs), within their architecture. This innovation has become instrumental in solving diverse problem sets including PDEs, fractional equations, integral-differential equations, and stochastic PDEs. It's a versatile multi-task learning framework that tasks NNs with fitting observed data while simultaneously minimizing PDE residuals. This paper delves into the landscape of PINNs, aiming to delineate their inherent strengths and weaknesses. Beyond exploring the fundamental characteristics of these networks, this review endeavors to encompass a wider spectrum of collocation-based physics-informed neural networks, extending beyond the core PINN model. Variants like physics-constrained neural networks (PCNN), variational hp-VPINN, and conservative PINN (CPINN) constitute pivotal aspects of this exploration. The study accentuates a predominant focus in research on tailoring PINNs through diverse strategies: adapting activation functions, refining gradient optimization techniques, innovating neural network structures, and enhancing loss function architectures. Despite the extensive applicability demonstrated by PINNs, surpassing classical numerical methods like Finite Element Method (FEM) in certain contexts, the review highlights ongoing opportunities for advancement. Notably, there are persisting theoretical challenges that demand resolution, ensuring the continued evolution and refinement of this revolutionary approach.
19

Hooshyar, Saman, and Arash Elahi. "Sequencing Initial Conditions in Physics-Informed Neural Networks." Journal of Chemistry and Environment 3, no. 1 (March 26, 2024): 98–108. http://dx.doi.org/10.56946/jce.v3i1.345.

Abstract:
The scientific machine learning (SciML) field has introduced a new class of models called physics-informed neural networks (PINNs). These models incorporate domain-specific knowledge as soft constraints on a loss function and use machine learning techniques to train the model. Although PINN models have shown promising results for simple problems, they are prone to failure when a moderate level of complexity is added to the problems. We demonstrate that the existing baseline models, in particular PINN and evolutionary sampling (Evo), are unable to capture the solution to differential equations with convection, reaction, and diffusion operators when the imposed initial condition is non-trivial. We then propose a promising solution to address these types of failure modes. This approach couples curriculum learning with the baseline models, where the network first trains on PDEs with simple initial conditions and is progressively exposed to more complex initial conditions. Our results show that we can reduce the error by 1–2 orders of magnitude with our proposed method compared to regular PINN and Evo.
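The curriculum idea, training on simple initial conditions first and warm-starting on harder ones, can be sketched with a toy fitting problem. Hedged assumptions, none from the paper itself: a fixed Fourier ansatz stands in for the PINN, the "task" is plain least-squares fitting of progressively more oscillatory initial conditions rather than solving a convection-reaction-diffusion PDE, and the curriculum is simply the reuse of the previous stage's coefficients as the next stage's starting point.

```python
import numpy as np

N, K = 128, 6
x = (np.arange(N) + 0.5) / N
basis = np.sin(np.outer(np.arange(1, K + 1), np.pi * x))  # (K, N)

def fit(target, a0, steps=200, lr=0.5):
    """Gradient descent on the data loss mean((a @ basis - target)^2)."""
    a = a0.copy()
    for _ in range(steps):
        err = a @ basis - target
        a -= lr * 2.0 * (basis @ err) / N   # gradient of the mean squared error
    return a

# Curriculum: stage s adds the next, more oscillatory mode sin(s*pi*x)/s,
# and each stage warm-starts from the coefficients of the previous one.
a = np.zeros(K)
for s in range(1, 4):
    target = sum(np.sin(j * np.pi * x) / j for j in range(1, s + 1))
    a = fit(target, a)
print(np.abs(a @ basis - target).max())   # near zero after the final stage
```

Because each stage only has to learn the newly added mode, the warm-started optimization stays in an easy regime, which is the intuition behind sequencing initial conditions.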
20

Pu, Ruilong, and Xinlong Feng. "Physics-Informed Neural Networks for Solving Coupled Stokes–Darcy Equation." Entropy 24, no. 8 (August 11, 2022): 1106. http://dx.doi.org/10.3390/e24081106.

Abstract:
In this paper, a grid-free deep learning method based on a physics-informed neural network is proposed for solving coupled Stokes–Darcy equations with Beavers–Joseph–Saffman interface conditions. This method has the advantage of avoiding grid generation and can greatly reduce the amount of computation when solving complex problems. Although the original physics-informed neural network algorithm has been used to solve many differential equations, we find that the direct use of physics-informed neural networks to solve coupled Stokes–Darcy equations does not provide accurate solutions in some cases, such as stiff terms arising from small parameters and interface discontinuity problems. In order to improve the approximation ability of a physics-informed neural network, we propose a loss-function weighting strategy, a parallel network structure strategy, and a local adaptive activation function strategy. In addition, the physics-informed neural network with the added strategies provides inspiration for solving other, more complicated problems of multi-physics-field coupling. Finally, the effectiveness of the proposed strategies is verified by numerical experiments.
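The loss-weighting idea can be illustrated on the simplest possible boundary-value problem. This is a hedged sketch, not the paper's algorithm: the model is a cubic polynomial rather than a network, the problem is u'' = 2 with u(0) = u(1) = 0 (exact solution x² - x), and the weighted sum of boundary and residual terms is minimized in one shot with a least-squares solve; in a real PINN the same weighted objective would be minimized by gradient-based training.

```python
import numpy as np

# Ansatz u(x) = a0 + a1 x + a2 x^2 + a3 x^3 for u'' = 2, u(0) = u(1) = 0.
N = 32
x = (np.arange(N) + 0.5) / N

w_bc, w_res = 10.0, 1.0   # weights on the boundary and PDE-residual loss terms

# Each row is one weighted equation in the unknowns (a0, a1, a2, a3).
rows, rhs = [], []
rows.append(np.sqrt(w_bc) * np.array([1.0, 0.0, 0.0, 0.0])); rhs.append(0.0)  # u(0) = 0
rows.append(np.sqrt(w_bc) * np.array([1.0, 1.0, 1.0, 1.0])); rhs.append(0.0)  # u(1) = 0
for xi in x:                                                                  # u''(xi) = 2
    rows.append(np.sqrt(w_res / N) * np.array([0.0, 0.0, 2.0, 6.0 * xi]))
    rhs.append(np.sqrt(w_res / N) * 2.0)

a, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
print(a)   # close to (0, -1, 1, 0), i.e. u(x) = x^2 - x
```

Scaling each equation by the square root of its weight makes the least-squares objective exactly the weighted loss w_bc·(boundary misfit) + w_res·(mean residual²); adjusting w_bc relative to w_res is the knob the weighting strategy turns.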
21

Zhai, Hanfeng, Quan Zhou, and Guohui Hu. "Predicting micro-bubble dynamics with semi-physics-informed deep learning." AIP Advances 12, no. 3 (March 1, 2022): 035153. http://dx.doi.org/10.1063/5.0079602.

Abstract:
Utilizing physical information to improve the performance of conventional neural networks has recently become a promising research direction in scientific computing. For multiphase flows, neural network training would require significant computational resources due to the large gradients near the interface between the two fluids. Based on the idea of physics-informed neural networks (PINNs), a modified deep learning framework, BubbleNet, is proposed in the present study to overcome this difficulty. A deep neural network (DNN) with separate sub-nets is adopted to predict the physics fields, with the semi-physics-informed part encoding the continuity equation and the pressure Poisson equation for supervision, and a time-discretized normalizer to normalize field data per time step before training. Two bubbly flows, i.e., single bubble flow and multiple bubble flow in a microchannel, are considered to test the algorithm. Conventional computational fluid dynamics software is applied to obtain the training dataset. The traditional DNN and the BubbleNet(s) are utilized to train the neural networks and predict the flow fields for the two bubbly flows. Results indicate that the BubbleNet frameworks successfully predict the physics fields, and the inclusion of the continuity equation significantly improves the performance of the deep NNs. The introduction of the Poisson equation also has a slightly positive effect on the prediction results. The results suggest that constructing semi-PINNs by flexibly incorporating physical information into neural networks will be helpful in the learning of complex flow problems.
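Of the ingredients listed, the per-time-step normalizer is easy to make concrete. A hedged numpy sketch with invented array names and one plausible scheme (zero mean, unit variance per step) that the abstract does not pin down: field snapshots are rescaled independently per time step before training, so early, low-amplitude frames and later, high-amplitude frames contribute comparably to the loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented example data: 10 time steps of a 32x32 scalar field whose
# amplitude grows over time (as a developing bubbly flow might).
fields = rng.normal(size=(10, 32, 32)) * np.arange(1, 11)[:, None, None]

# Normalize each time step separately to zero mean and unit variance.
mean = fields.mean(axis=(1, 2), keepdims=True)
std = fields.std(axis=(1, 2), keepdims=True)
normalized = (fields - mean) / std

# Every frame now has comparable scale, regardless of its raw amplitude.
print(normalized.std(axis=(1, 2)))   # all ~1
```

The per-step statistics (mean, std) must be kept so predictions can be mapped back to physical units after training.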
22

Hassanaly, Malik, Peter J. Weddle, Corey R. Randall, Eric J. Dufek, and Kandler Smith. "Rapid Inverse Parameter Inference Using Physics-Informed Neural Networks." ECS Meeting Abstracts MA2024-01, no. 2 (August 9, 2024): 345. http://dx.doi.org/10.1149/ma2024-012345mtgabs.

Abstract:
As Li-ion batteries become more essential in today's economy, tools need to be developed to accurately and rapidly diagnose a battery's internal state of health. Using a Li-ion battery's (high-rate) voltage response, it is proposed to determine a battery's internal state through Bayesian calibration. However, Bayesian calibration is notoriously slow and requires thousands of model runs. To accelerate parameter inference using Bayesian calibration, a surrogate model is developed to replace the underlying physics-based Li-ion model. Developing a surrogate model for rapid Bayesian calibration analysis is discussed for both the single particle model (SPM) and the pseudo two-dimensional (P2D) model. Surrogate models are constructed using physics-informed neural networks (PINNs) that encode the influence of internal properties on observed voltage responses. In practice, a neural network can be trained by: 1) using simulation results of the physics-based model (i.e., a data-loss approach); 2) using the residuals of the governing equations themselves (i.e., a physics-loss approach); or 3) using a combination of simulation results and governing-equation residuals. In the present work, PINNs are developed using a variety of training losses and neural network architectures. In this analysis, it is shown that a PINN surrogate model can be reliably trained with only a physics-informed loss. However, a coupled data-informed and physics-loss approach produced the most accurate PINNs. The absolute relative errors of the trained PINNs are compared across the different training losses and neural network architectures. After determining a consistent training strategy for both the SPM and P2D PINN surrogate models, the PINNs are extended to determine additional internal state-of-health parameters.
As more and more parameters were introduced, the PINN training suffered from "the curse of dimensionality", which was mitigated by using a hierarchical training approach (where a PINN trained with fewer variable model parameters was used to train a PINN with more variable model parameters). The high-dimensional PINN surrogates are then integrated into Bayesian calibration schemes to identify internal Li-ion battery properties from experimentally measured voltages. Interpreting the high-dimensional parameter posteriors is discussed with respect to model error, parameter prior choices, and experimental errors.
23

Hall, Eric J., Søren Taverniers, Markos A. Katsoulakis, and Daniel M. Tartakovsky. "GINNs: Graph-Informed Neural Networks for multiscale physics." Journal of Computational Physics 433 (May 2021): 110192. http://dx.doi.org/10.1016/j.jcp.2021.110192.

24

Mishra, Siddhartha, and Roberto Molinaro. "Physics informed neural networks for simulating radiative transfer." Journal of Quantitative Spectroscopy and Radiative Transfer 270 (August 2021): 107705. http://dx.doi.org/10.1016/j.jqsrt.2021.107705.

25

Waheed, Umair bin, Ehsan Haghighat, Tariq Alkhalifah, Chao Song, and Qi Hao. "PINNeik: Eikonal solution using physics-informed neural networks." Computers & Geosciences 155 (October 2021): 104833. http://dx.doi.org/10.1016/j.cageo.2021.104833.

26

Song, Chao, and Tariq A. Alkhalifah. "Wavefield Reconstruction Inversion via Physics-Informed Neural Networks." IEEE Transactions on Geoscience and Remote Sensing 60 (2022): 1–12. http://dx.doi.org/10.1109/tgrs.2021.3123122.

27

Shukla, Khemraj, Ameya D. Jagtap, and George Em Karniadakis. "Parallel physics-informed neural networks via domain decomposition." Journal of Computational Physics 447 (December 2021): 110683. http://dx.doi.org/10.1016/j.jcp.2021.110683.

28

Kovacs, Alexander, Lukas Exl, Alexander Kornell, Johann Fischbacher, Markus Hovorka, Markus Gusenbauer, Leoni Breth, et al. "Magnetostatics and micromagnetics with physics informed neural networks." Journal of Magnetism and Magnetic Materials 548 (April 2022): 168951. http://dx.doi.org/10.1016/j.jmmm.2021.168951.

29

Penwarden, Michael, Shandian Zhe, Akil Narayan, and Robert M. Kirby. "Multifidelity modeling for Physics-Informed Neural Networks (PINNs)." Journal of Computational Physics 451 (February 2022): 110844. http://dx.doi.org/10.1016/j.jcp.2021.110844.

30

Bolderman, M., D. Fan, M. Lazar, and H. Butler. "Generalized feedforward control using physics-informed neural networks." IFAC-PapersOnLine 55, no. 16 (2022): 148–53. http://dx.doi.org/10.1016/j.ifacol.2022.09.015.

31

Yang, Yibo, and Paris Perdikaris. "Adversarial uncertainty quantification in physics-informed neural networks." Journal of Computational Physics 394 (October 2019): 136–52. http://dx.doi.org/10.1016/j.jcp.2019.05.027.

32

Mao, Zhiping, Ameya D. Jagtap, and George Em Karniadakis. "Physics-informed neural networks for high-speed flows." Computer Methods in Applied Mechanics and Engineering 360 (March 2020): 112789. http://dx.doi.org/10.1016/j.cma.2019.112789.

33

Jin, Ge, Jian Cheng Wong, Abhishek Gupta, Shipeng Li, and Yew-Soon Ong. "Fourier warm start for physics-informed neural networks." Engineering Applications of Artificial Intelligence 132 (June 2024): 107887. http://dx.doi.org/10.1016/j.engappai.2024.107887.

34

Anagnostopoulos, Sokratis J., Juan Diego Toscano, Nikolaos Stergiopulos, and George Em Karniadakis. "Residual-based attention in physics-informed neural networks." Computer Methods in Applied Mechanics and Engineering 421 (March 2024): 116805. http://dx.doi.org/10.1016/j.cma.2024.116805.

35

Wang, Sifan, Shyam Sankaran, and Paris Perdikaris. "Respecting causality for training physics-informed neural networks." Computer Methods in Applied Mechanics and Engineering 421 (March 2024): 116813. http://dx.doi.org/10.1016/j.cma.2024.116813.

36

Liu, Ziti, Yang Liu, Xunshi Yan, Wen Liu, Shuaiqi Guo, and Chen-an Zhang. "AsPINN: Adaptive symmetry-recomposition physics-informed neural networks." Computer Methods in Applied Mechanics and Engineering 432 (December 2024): 117405. http://dx.doi.org/10.1016/j.cma.2024.117405.

37

Hou, Shubo, Wenchao Wu, and Xiuhong Hao. "Physics-informed neural network for simulating magnetic field of permanent magnet." Journal of Physics: Conference Series 2853, no. 1 (October 1, 2024): 012018. http://dx.doi.org/10.1088/1742-6596/2853/1/012018.

Abstract:
With the rapid development of deep learning, its application to physical field simulation has attracted wide attention and has begun to enable a new paradigm of meshless simulation. In this paper, research based on physics-informed neural networks is carried out to solve partial differential equations related to the physical laws of electromagnetism, and magnetic field simulation is then realized. In this method, the governing equation and the boundary conditions containing physical information are embedded into the neural network loss function as constraints, and the backpropagation of the neural networks is realized based on automatic differentiation to solve the partial differential equations. High-precision simulation of the tile-shaped and rectangular permanent-magnet fields of permanent magnet motors based on physics-informed neural networks is studied, and the error is within 5%. We consider the simulation of the magnetic field in two coordinate systems and realize the joint training of multiple neural networks over multiple sub-domains and different media.
38

Lee, Brandon M., and David R. Dowling. "Training physics-informed neural networks to directly predict acoustic field values in simple environments." Journal of the Acoustical Society of America 152, no. 4 (October 2022): A49. http://dx.doi.org/10.1121/10.0015499.

Abstract:
As acousticians turn to machine learning for solutions to old and new problems, neural networks have become a go-to tool due to their capacity for model representation and quick forward computations. However, these benefits come at the cost of obscurity; it is difficult to determine whether the proficiency of a trained neural network is limited by training effort, training dataset size or scope, or compatibility of the network’s design with the data’s underlying pattern of interest. For neural networks trained to provide solutions to the point-source Helmholtz equation in axisymmetric single-path, two-path, and multi-path (ideal waveguide) environments with constant sound speed, the key limitations are the dataset composition and network design. This study examines the effects on performance and explainability that result from providing physical information (governing equation and boundary conditions) to these neural networks, instead of only acoustic-field solutions generated from well-known analytic solutions. The outcome of using physics-informed neural networks (PINNs) for these simple environments informs their possible extension to more complex, realistic environments. This study emphasizes source frequencies in the hundreds of Hz, depths up to 500 m, and ranges up to 10 km for sound speeds near 1500 m/s. [Work supported by the NDSEG fellowship program.]
APA, Harvard, Vancouver, ISO, and other styles
39

Stenkin, Dmitry, and Vladimir Gorbachenko. "Mathematical Modeling on a Physics-Informed Radial Basis Function Network." Mathematics 12, no. 2 (January 11, 2024): 241. http://dx.doi.org/10.3390/math12020241.

Full text
Abstract:
The article is devoted to approximate methods for solving differential equations. An approach based on neural networks with radial basis functions is presented. Neural network training algorithms adapted to radial basis function networks are proposed, in particular adaptations of the Nesterov and Levenberg-Marquardt algorithms. The effectiveness of the proposed algorithms is demonstrated for solving model problems of function approximation, differential equations, direct and inverse boundary value problems, and modeling processes in piecewise homogeneous media.
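A radial basis function network of the kind this article studies represents the unknown function as a weighted sum of Gaussians centred at chosen points; when trained by interpolation, the weights follow from a single linear solve. The sketch below is a minimal, self-contained illustration in pure Python, with Gaussian elimination standing in for the adapted Nesterov and Levenberg-Marquardt algorithms the article proposes; the centres, shape parameter, and target function sin(x) are arbitrary choices, not from the paper.

```python
import math

# RBF network u(x) = sum_j w_j * exp(-(eps*(x - c_j))^2), fit to sin(x)
# by interpolation at the centres (a direct linear solve stands in for
# the iterative training algorithms discussed in the article).
eps = 2.0                                # Gaussian shape parameter
centres = [0.375 * j for j in range(9)]  # 9 centres on [0, 3]
target = [math.sin(c) for c in centres]

def phi(x, c):
    return math.exp(-((eps * (x - c)) ** 2))

# Interpolation system: A[i][j] = phi(centres[i], centres[j]), A w = target
A = [[phi(xi, cj) for cj in centres] for xi in centres]
b = list(target)
n = len(A)

# Gaussian elimination with partial pivoting, then back substitution
for k in range(n):
    p = max(range(k, n), key=lambda i: abs(A[i][k]))
    A[k], A[p] = A[p], A[k]
    b[k], b[p] = b[p], b[k]
    for i in range(k + 1, n):
        f = A[i][k] / A[k][k]
        for j in range(k, n):
            A[i][j] -= f * A[k][j]
        b[i] -= f * b[k]
w = [0.0] * n
for i in range(n - 1, -1, -1):
    w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]

def u(x):
    return sum(wj * phi(x, cj) for wj, cj in zip(w, centres))

node_err = max(abs(u(c) - math.sin(c)) for c in centres)  # ~0 by construction
mid_err = abs(u(1.6875) - math.sin(1.6875))               # between two centres
```

The interpolation error vanishes at the centres and stays small between them; physics-informed RBF networks replace the interpolation conditions with PDE residual conditions but keep the same parametrisation.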
APA, Harvard, Vancouver, ISO, and other styles
40

Leontiou, Theodoros, Anna Frixou, Marios Charalambides, Efstathios Stiliaris, Costas N. Papanicolas, Sofia Nikolaidou, and Antonis Papadakis. "Three-Dimensional Thermal Tomography with Physics-Informed Neural Networks." Tomography 10, no. 12 (November 30, 2024): 1930–46. https://doi.org/10.3390/tomography10120140.

Full text
Abstract:
Background: Accurate reconstruction of internal temperature fields from surface temperature data is critical for applications such as non-invasive thermal imaging, particularly in scenarios involving small temperature gradients, like those in the human body. Methods: In this study, we employed 3D convolutional neural networks (CNNs) to predict internal temperature fields. The network’s performance was evaluated under both ideal and non-ideal conditions, incorporating noise and background temperature variations. A physics-informed loss function embedding the heat equation was used in conjunction with statistical uncertainty during training to simulate realistic scenarios. Results: The CNN achieved high accuracy for small phantoms (e.g., 10 cm in diameter). However, under non-ideal conditions, the network’s predictive capacity diminished in larger domains, particularly in regions distant from the surface. The introduction of physical constraints into the training process improved the model’s robustness in noisy environments, enabling accurate reconstruction of hot spots in deeper regions where traditional CNNs struggled. Conclusions: Combining deep learning with physical constraints provides a robust framework for non-invasive thermal imaging and other applications requiring high-precision temperature field reconstruction, particularly under non-ideal conditions.
APA, Harvard, Vancouver, ISO, and other styles
41

Wang, Jing, Yubo Li, Anping Wu, Zheng Chen, Jun Huang, Qingfeng Wang, and Feng Liu. "Multi-Step Physics-Informed Deep Operator Neural Network for Directly Solving Partial Differential Equations." Applied Sciences 14, no. 13 (June 25, 2024): 5490. http://dx.doi.org/10.3390/app14135490.

Full text
Abstract:
This paper establishes a method for solving partial differential equations using a multi-step physics-informed deep operator neural network. The network is trained by embedding physics-informed constraints. Different from traditional neural networks for solving partial differential equations, the proposed method uses a deep neural operator network to indirectly construct the mapping between the variable functions and solution functions. This approach makes full use of the hidden information between the variable functions and independent variables. It simplifies the process by which the model captures highly complex and nonlinear relationships, thereby making network learning easier and enhancing the extraction of information about the independent variables in partial differential systems. In terms of solving partial differential equations, we verify that the multi-step physics-informed deep operator neural network markedly improves the solution accuracy compared with a traditional physics-informed deep neural operator network, especially when the problem involves complex physical phenomena with large gradient changes.
APA, Harvard, Vancouver, ISO, and other styles
42

Karakonstantis, Xenofon, Diego Caviedes-Nozal, Antoine Richard, and Efren Fernandez-Grande. "Room impulse response reconstruction with physics-informed deep learning." Journal of the Acoustical Society of America 155, no. 2 (February 1, 2024): 1048–59. http://dx.doi.org/10.1121/10.0024750.

Full text
Abstract:
A method is presented for estimating and reconstructing the sound field within a room using physics-informed neural networks. By incorporating a limited set of experimental room impulse responses as training data, this approach combines neural network processing capabilities with the underlying physics of sound propagation, as articulated by the wave equation. The network's ability to estimate particle velocity and intensity, in addition to sound pressure, demonstrates its capacity to represent the flow of acoustic energy and completely characterise the sound field with only a few measurements. Additionally, an investigation into the potential of this network as a tool for improving acoustic simulations is conducted. This is due to its proficiency in offering grid-free sound field mappings with minimal inference time. Furthermore, a study is carried out which encompasses comparative analyses against current approaches for sound field reconstruction. Specifically, the proposed approach is evaluated against both data-driven techniques and elementary wave-based regression methods. The results demonstrate that the physics-informed neural network stands out when reconstructing the early part of the room impulse response, while simultaneously allowing for complete sound field characterisation in the time domain.
APA, Harvard, Vancouver, ISO, and other styles
43

Schmid, Johannes. "Physics-informed neural networks for solving the Helmholtz equation." INTER-NOISE and NOISE-CON Congress and Conference Proceedings 267, no. 1 (November 5, 2023): 265–68. http://dx.doi.org/10.3397/no_2023_0049.

Full text
Abstract:
Discretization-based methods like the finite element method have proven to be effective for solving the Helmholtz equation in computational acoustics. However, it is very challenging to incorporate measured data into the model or infer model input parameters based on observed response data. Machine learning approaches have shown promising potential in data-driven modeling. In practical applications, purely supervised approaches suffer from poor generalization and physical interpretability. Physics-informed neural networks (PINNs) incorporate prior knowledge of the underlying partial differential equation by including the residual into the loss function of an artificial neural network. Training the neural network minimizes the residual of both the differential equation and the boundary conditions and learns a solution that satisfies the corresponding boundary value problem. In this contribution, PINNs are applied to solve the Helmholtz equation within a two-dimensional acoustic duct and mixed boundary conditions are considered. The results show that PINNs are able to solve the Helmholtz equation very accurately and provide a promising data-driven method for physics-based surrogate modeling.
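The core mechanic this abstract describes, penalising the PDE residual at collocation points, can be checked in isolation: the snippet below evaluates the 1D Helmholtz residual u'' + k²u for the known solution u(x) = sin(kx), using central finite differences where a PINN would use automatic differentiation. The wavenumber, step size, and evaluation points are arbitrary illustrations, not values from the contribution.

```python
import math

k = 2.0                         # illustrative wavenumber
u = lambda x: math.sin(k * x)   # exact solution of u'' + k^2 * u = 0
h = 1e-4                        # finite-difference step

def helmholtz_residual(x):
    # central difference for u''; a PINN obtains this term via autodiff
    upp = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    return upp + k**2 * u(x)

# The residual is (numerically) zero at every collocation point, which is
# exactly the condition a PINN loss drives toward during training.
worst = max(abs(helmholtz_residual(0.1 * i)) for i in range(1, 11))
```

During PINN training this residual (plus boundary-condition terms) is squared and averaged to form the loss; here it merely verifies a candidate solution.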
APA, Harvard, Vancouver, ISO, and other styles
44

Schmid, Johannes D., Philipp Bauerschmidt, Caglar Gurbuz, and Steffen Marburg. "Physics-informed neural networks for characterization of structural dynamic boundary conditions." Journal of the Acoustical Society of America 154, no. 4_supplement (October 1, 2023): A99. http://dx.doi.org/10.1121/10.0022923.

Full text
Abstract:
Structural dynamics simulations are often faced with challenges arising from unknown boundary conditions, leading to considerable prediction uncertainties. Direct measurement of these boundary conditions can be impractical for certain mounting scenarios, such as joints or screw connections. In addition, conventional inverse methods face limitations in integrating measured data and solving inverse problems when the forward model is computationally expensive. In this study, we explore the potential of physics-informed neural networks that incorporate the residual of a partial differential equation into the loss function of a neural network to ensure physically consistent predictions. We train the neural network using noisy boundary displacement data of a structure from a finite element reference solution. The network learns to predict the displacement field within the structure while satisfying the Navier–Lamé equations in the frequency domain. Our results show that physics-informed neural networks accurately predict the displacement field within a three-dimensional structure using only boundary training data. Additionally, differentiating the trained network allows precise characterization of previously unknown boundary conditions and facilitates the assessment of non-measurable quantities, such as the stress tensor.
APA, Harvard, Vancouver, ISO, and other styles
45

Farea, Amer, Olli Yli-Harja, and Frank Emmert-Streib. "Understanding Physics-Informed Neural Networks: Techniques, Applications, Trends, and Challenges." AI 5, no. 3 (August 29, 2024): 1534–57. http://dx.doi.org/10.3390/ai5030074.

Full text
Abstract:
Physics-informed neural networks (PINNs) represent a significant advancement at the intersection of machine learning and physical sciences, offering a powerful framework for solving complex problems governed by physical laws. This survey provides a comprehensive review of the current state of research on PINNs, highlighting their unique methodologies, applications, challenges, and future directions. We begin by introducing the fundamental concepts underlying neural networks and the motivation for integrating physics-based constraints. We then explore various PINN architectures and techniques for incorporating physical laws into neural network training, including approaches to solving partial differential equations (PDEs) and ordinary differential equations (ODEs). Additionally, we discuss the primary challenges faced in developing and applying PINNs, such as computational complexity, data scarcity, and the integration of complex physical laws. Finally, we identify promising future research directions. Overall, this survey seeks to provide a foundational understanding of PINNs within this rapidly evolving field.
APA, Harvard, Vancouver, ISO, and other styles
46

Kovacs, Alexander, Lukas Exl, Alexander Kornell, Johann Fischbacher, Markus Hovorka, Markus Gusenbauer, Leoni Breth, et al. "Conditional physics informed neural networks." Communications in Nonlinear Science and Numerical Simulation, September 2021, 106041. http://dx.doi.org/10.1016/j.cnsns.2021.106041.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Berrone, S., C. Canuto, M. Pintore, and N. Sukumar. "Enforcing Dirichlet boundary conditions in physics-informed neural networks and variational physics-informed neural networks." Heliyon, August 2023, e18820. http://dx.doi.org/10.1016/j.heliyon.2023.e18820.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

McClenny, Levi, and Ulisses Braga-Neto. "Self-Adaptive Physics-Informed Neural Networks." SSRN Electronic Journal, 2022. http://dx.doi.org/10.2139/ssrn.4086448.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

McClenny, Levi D., and Ulisses Braga-Neto. "Self-adaptive physics-informed neural networks." Journal of Computational Physics, November 2022, 111722. http://dx.doi.org/10.1016/j.jcp.2022.111722.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Dourado, Arinan, and Felipe A. C. Viana. "Physics-Informed Neural Networks for Corrosion-Fatigue Prognosis." Annual Conference of the PHM Society 11, no. 1 (September 22, 2019). http://dx.doi.org/10.36001/phmconf.2019.v11i1.814.

Full text
Abstract:
In this paper, we present a novel physics-informed neural network modeling approach for corrosion-fatigue. The hybrid approach is designed to merge physics-informed and data-driven layers within deep neural networks. The result is a cumulative damage model where the physics-informed layers are used to model the relatively well understood physics (crack growth through Paris law) and the data-driven layers account for the hard-to-model effects (bias in damage accumulation due to corrosion). A numerical experiment is used to present the main features of the proposed physics-informed recurrent neural network for damage accumulation. The test problem consists of predicting corrosion-fatigue of an Al 2024-T3 alloy used in aircraft wing panels. Besides cyclic loading, the panels are also subjected to saline corrosion. The physics-informed neural network is trained using full observation of inputs (far-field loads, stress ratio and a corrosivity index – defined per airport) and very limited observation of outputs (crack length at inspection for only a small portion of the fleet). Results show that the physics-informed neural network is able to learn the correction to the original fatigue model due to corrosion, and predictions are accurate enough for ranking damage in different airplanes in the fleet (which can be used to prioritize inspection).
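The "well understood physics" layer of the hybrid model described above is a Paris-law recurrence: crack length is propagated cycle by cycle, and the data-driven layers then correct this baseline for corrosion. A minimal sketch of that recurrence follows; the material constants, stress range, and geometry factor are illustrative textbook-style values, not the ones used in the paper.

```python
import math

# Cumulative damage via Paris' law: da/dN = C * (dK)^m,
# with stress intensity range dK = dS * sqrt(pi * a) (geometry factor ~1).
# Constants below are illustrative, not values from the cited work.
C = 1e-11    # crack growth coefficient, m/cycle per (MPa*sqrt(m))^m
m = 3.0      # Paris exponent
dS = 100.0   # far-field stress range, MPa
a = 0.001    # initial crack length, m

history = [a]
for _ in range(100_000):                 # load cycles
    dK = dS * math.sqrt(math.pi * a)     # MPa*sqrt(m)
    a += C * dK**m                       # per-cycle crack growth
    history.append(a)
```

In the paper's recurrent architecture, each loop iteration corresponds to one cell step, and a trained multilayer perceptron adds a corrosion-dependent correction to the per-cycle increment.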
APA, Harvard, Vancouver, ISO, and other styles