Journal articles on the topic 'Neural fields equations'

Consult the top 50 journal articles for your research on the topic 'Neural fields equations.'

You can also download the full text of each publication as a PDF and read its abstract online whenever one is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Veltz, Romain, and Olivier Faugeras. "A Center Manifold Result for Delayed Neural Fields Equations." SIAM Journal on Mathematical Analysis 45, no. 3 (2013): 1527–62. http://dx.doi.org/10.1137/110856162.

2

Belhe, Yash, Michaël Gharbi, Matthew Fisher, Iliyan Georgiev, Ravi Ramamoorthi, and Tzu-Mao Li. "Discontinuity-Aware 2D Neural Fields." ACM Transactions on Graphics 42, no. 6 (2023): 1–11. http://dx.doi.org/10.1145/3618379.

Abstract:
Neural image representations offer the possibility of high fidelity, compact storage, and resolution-independent accuracy, providing an attractive alternative to traditional pixel- and grid-based representations. However, coordinate neural networks fail to capture discontinuities present in the image and tend to blur across them; we aim to address this challenge. In many cases, such as rendered images, vector graphics, diffusion curves, or solutions to partial differential equations, the locations of the discontinuities are known. We take those locations as input, represented as linear, quadra
3

Scheinker, Alexander, and Reeju Pokharel. "Physics-constrained 3D convolutional neural networks for electrodynamics." APL Machine Learning 1, no. 2 (2023): 026109. http://dx.doi.org/10.1063/5.0132433.

Abstract:
We present a physics-constrained neural network (PCNN) approach to solving Maxwell’s equations for the electromagnetic fields of intense relativistic charged particle beams. We create a 3D convolutional PCNN to map time-varying current and charge densities J(r, t) and ρ(r, t) to vector and scalar potentials A(r, t) and φ(r, t), from which we generate electromagnetic fields according to Maxwell’s equations: B = ∇ × A and E = −∇φ − ∂A/∂t. Our PCNNs satisfy hard constraints, such as ∇ · B = 0, by construction. Soft constraints push A and φ toward satisfying the Lorenz gauge.
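The hard-constraint idea in the abstract above — obtaining B from a learned potential A so that ∇ · B = 0 holds by construction — can be checked numerically. The sketch below (an illustration under assumed grid shapes, not the authors' PCNN) forms B = ∇ × A from an arbitrary discrete vector potential with central differences and confirms that the discrete divergence vanishes to round-off, because the per-axis difference operators commute:

```python
import numpy as np

def grad(F, h, axis):
    # central differences in the interior, one-sided at the boundaries
    return np.gradient(F, h, axis=axis)

def curl(Ax, Ay, Az, h):
    # B = curl(A) on a uniform grid indexed [x, y, z]
    Bx = grad(Az, h, 1) - grad(Ay, h, 2)
    By = grad(Ax, h, 2) - grad(Az, h, 0)
    Bz = grad(Ay, h, 0) - grad(Ax, h, 1)
    return Bx, By, Bz

def div(Fx, Fy, Fz, h):
    return grad(Fx, h, 0) + grad(Fy, h, 1) + grad(Fz, h, 2)

rng = np.random.default_rng(0)
h = 0.1
Ax, Ay, Az = (rng.standard_normal((16, 16, 16)) for _ in range(3))
Bx, By, Bz = curl(Ax, Ay, Az, h)
max_div = float(np.max(np.abs(div(Bx, By, Bz, h))))
print(max_div)  # at round-off level: per-axis difference operators commute
```

Any analytic prior that can be written as a derivative of a learned potential inherits this kind of exactness, which is why such constraints are "hard" rather than penalty terms.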
4

Nicks, Rachel, Abigail Cocks, Daniele Avitabile, Alan Johnston, and Stephen Coombes. "Understanding Sensory Induced Hallucinations: From Neural Fields to Amplitude Equations." SIAM Journal on Applied Dynamical Systems 20, no. 4 (2021): 1683–714. http://dx.doi.org/10.1137/20m1366885.

5

Veltz, Romain, and Olivier Faugeras. "ERRATUM: A Center Manifold Result for Delayed Neural Fields Equations." SIAM Journal on Mathematical Analysis 47, no. 2 (2015): 1665–70. http://dx.doi.org/10.1137/140962279.

6

Bressloff, Paul C., and Zachary P. Kilpatrick. "Nonlinear Langevin Equations for Wandering Patterns in Stochastic Neural Fields." SIAM Journal on Applied Dynamical Systems 14, no. 1 (2015): 305–34. http://dx.doi.org/10.1137/140990371.

7

Sim, Fabio M., Eka Budiarto, and Rusman Rusyadi. "Comparison and Analysis of Neural Solver Methods for Differential Equations in Physical Systems." ELKHA 13, no. 2 (2021): 134. http://dx.doi.org/10.26418/elkha.v13i2.49097.

Abstract:
Differential equations are ubiquitous in many fields of study, yet not all equations, whether ordinary or partial, can be solved analytically. Traditional numerical methods such as time-stepping schemes have been devised to approximate these solutions. With the advent of modern deep learning, neural networks have become a viable alternative to traditional numerical methods. By reformulating the problem as an optimisation task, neural networks can be trained in a semi-supervised learning fashion to approximate nonlinear solutions. In this paper, neural solvers are implemented in TensorFlow for
8

Dong, Chenghao. "Solving Differential Equations with Physics-Informed Neural Networks." Theoretical and Natural Science 87, no. 1 (2025): 137–46. https://doi.org/10.54254/2753-8818/2025.20346.

Abstract:
Solving differential equations is an extensive topic in various fields, such as fluid mechanics and mathematical finance. The recent resurgence in deep neural networks has opened up a brand new track for numerically solving these equations, with the potential to better deal with nonlinear problems and overcome the curse of dimensionality. The Physics-Informed Neural Network (PINN) is one of the fundamental attempts to solve differential equations using deep learning techniques. This paper aims to briefly review the application of PINNs and their variants in solving differential equations throu
9

Itoh, Makoto, and Leon O. Chua. "Image Processing and Self-Organizing CNN." International Journal of Bifurcation and Chaos 15, no. 09 (2005): 2939–58. http://dx.doi.org/10.1142/s0218127405013794.

Abstract:
CNN templates for image processing and pattern formation are derived from neural field equations, advection equations and reaction–diffusion equations by discretizing spatial integrals and derivatives. Many useful CNN templates are derived by this approach. Furthermore, self-organization is investigated from the viewpoint of divergence of vector fields.
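The derivation route in the abstract above — turning a neural field equation into a discrete system by replacing the spatial integral with a sum — can be sketched in a few lines. The kernel, firing-rate function, and parameters below are illustrative assumptions, not those of the paper: a forward-Euler integration of an Amari-type field du/dt = −u + ∫ w(x−y) f(u(y)) dy, with the integral approximated by a discrete convolution.

```python
import numpy as np

L_dom, n, dt = 10.0, 201, 0.05
x = np.linspace(-L_dom / 2, L_dom / 2, n)
dx = x[1] - x[0]

# assumed lateral-connectivity kernel ("Mexican hat") and sigmoidal firing rate
w = np.exp(-x**2) - 0.5 * np.exp(-x**2 / 4)
f = lambda u: 1.0 / (1.0 + np.exp(-10.0 * (u - 0.3)))

u = np.exp(-x**2)  # localized initial bump
for _ in range(400):
    # forward-Euler step; dx * convolve approximates the spatial integral
    u = u + dt * (-u + dx * np.convolve(f(u), w, mode="same"))

print(float(u.min()), float(u.max()))
```

Once discretized this way, each grid point is an ODE coupled to its neighbours through the kernel, which is exactly the structure of a cellular neural network template.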
10

Park, Yongsung, Seunghyun Yoon, Peter Gerstoft, and Woojae Seong. "Physics-informed neural network-based predictions of ocean acoustic pressure fields." Journal of the Acoustical Society of America 155, no. 3_Supplement (2024): A44. http://dx.doi.org/10.1121/10.0026740.

Abstract:
Physics-informed neural network (PINN) trains the network using sampled data and encodes the underlying physical laws governing the dataset, such as partial differential equations (PDEs). A trained PINN can predict data at locations beyond the sampled data positions. The ocean acoustic pressure field satisfies PDEs, Helmholtz equations. We present a method utilizing PINN for predicting the underwater acoustic pressure field. Our approach trains the network by fitting sampled data, embedding PDEs, and enforcing pressure-release surface boundary conditions. We demonstrate our approach under vari
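The PDE constraint described above can be made concrete with a tiny residual check. Assuming a 2D plane wave p(x, z) = sin(kx·x + kz·z) (an illustrative field, not the authors' data), the Helmholtz residual ∇²p + k²p that a PINN would penalize is evaluated here with central finite differences and comes out near zero:

```python
import numpy as np

h = 1e-3                      # finite-difference step (assumed)
kx, kz = 2.0, 3.0             # illustrative wavenumber components
k2 = kx**2 + kz**2
p = lambda x, z: np.sin(kx * x + kz * z)

x0, z0 = 0.4, -0.7            # arbitrary interior point
lap = (p(x0 + h, z0) - 2 * p(x0, z0) + p(x0 - h, z0)) / h**2 \
    + (p(x0, z0 + h) - 2 * p(x0, z0) + p(x0, z0 - h)) / h**2
residual = lap + k2 * p(x0, z0)
print(abs(residual))          # near zero: the plane wave satisfies Helmholtz
```

A PINN evaluates the same residual with automatic differentiation instead of finite differences and adds its square to the training loss at sampled collocation points.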
11

Wennekers, Thomas. "Dynamic Approximation of Spatiotemporal Receptive Fields in Nonlinear Neural Field Models." Neural Computation 14, no. 8 (2002): 1801–25. http://dx.doi.org/10.1162/089976602760128027.

Abstract:
This article presents an approximation method to reduce the spatiotemporal behavior of localized activation peaks (also called “bumps”) in nonlinear neural field equations to a set of coupled ordinary differential equations (ODEs) for only the amplitudes and tuning widths of these peaks. This enables a simplified analysis of steady-state receptive fields and their stability, as well as spatiotemporal point spread functions and dynamic tuning properties. A lowest-order approximation for peak amplitudes alone shows that much of the well-studied behavior of small neural systems (e.g., the Wilson-
12

Mentzer, Katherine L., and J. Luc Peterson. "Neural network surrogate models for equations of state." Physics of Plasmas 30, no. 3 (2023): 032704. http://dx.doi.org/10.1063/5.0126708.

Abstract:
Equation of state (EOS) data provide necessary information for accurate multiphysics modeling, which is essential for fields such as inertial confinement fusion. Here, we suggest a neural network surrogate model of energy and entropy and use thermodynamic relationships to derive other necessary thermodynamic EOS quantities. We incorporate phase information into the model by training a phase classifier and using phase-specific regression models, which improves the model prediction accuracy. Our model predicts energy values to 1% relative error and entropy to 3.5% relative error in a log-transfo
13

Atallah, Samia. "The Numerical Methods of Fractional Differential Equations." مجلة جامعة بني وليد للعلوم الإنسانية والتطبيقية 8, no. 4 (2023): 496–512. http://dx.doi.org/10.58916/jhas.v8i4.44.

Abstract:
Differential equations with non-integer-order derivatives have been shown to be suitable models for a variety of physical phenomena in several fields, including diffusion processes, damping laws, fluid mechanics, and neural networks. In this study, I discuss two numerical methods, Diethelm's method and the Adams-Bashforth-Moulton method, for solving fractional ordinary differential equations (ODEs) with initial conditions.
14

Galaburdin, A. V. "Application of Neural Networks for Solving Elliptic Equations in Domains with Complex Geometries." Computational Mathematics and Information Technologies 9, no. 2 (2025): 44–51. https://doi.org/10.23947/2587-8999-2025-9-2-44-51.

Abstract:
Introduction. Differential equations are often used in modelling across various fields of science and engineering. Recently, neural networks have been increasingly applied to solve differential equations. This paper proposes an original method for constructing a neural network to solve elliptic differential equations. The method is used for solving boundary value problems in domains with complex geometric shapes. Materials and Methods. A method is proposed for constructing a neural network designed to solve partial differential equations of the elliptic type. By applying a transformation of the
15

Soumaya, Nouna, Nouna Assia, Mansouri Mohamed, Tammouch Ilyas, and Achchab Boujamaa. "Two-dimensional Klein-Gordon and Sine-Gordon numerical solutions based on deep neural network." IAES International Journal of Artificial Intelligence (IJ-AI) 14, no. 2 (2025): 1548–60. https://doi.org/10.11591/ijai.v14.i2.pp1548-1560.

Abstract:
Due to the well-known dimensionality curse, developing effective numerical techniques to resolve partial differential equations has proved a complex problem. We propose a deep learning technique for solving these problems. Feedforward neural networks (FNNs) are used to approximate a partial differential equation with more robust and weaker boundary and initial conditions. The framework, called PyDEns, can handle calculation fields that are not regular. Numerical experiments on two-dimensional Sine-Gordon and Klein-Gordon systems show the provided frameworks to be sufficiently accurate.
16

Chu, Mengyu, Lingjie Liu, Quan Zheng, et al. "Physics informed neural fields for smoke reconstruction with sparse data." ACM Transactions on Graphics 41, no. 4 (2022): 1–14. http://dx.doi.org/10.1145/3528223.3530169.

Abstract:
High-fidelity reconstruction of dynamic fluids from sparse multiview RGB videos remains a formidable challenge, due to the complexity of the underlying physics as well as the severe occlusion and complex lighting in the captured data. Existing solutions either assume knowledge of obstacles and lighting, or only focus on simple fluid scenes without obstacles or complex lighting, and thus are unsuitable for real-world scenes with unknown lighting conditions or arbitrary obstacles. We present the first method to reconstruct dynamic fluid phenomena by leveraging the governing physics (i.e., Navier-
17

Guo, Yanan, Xiaoqun Cao, Bainian Liu, and Mei Gao. "Solving Partial Differential Equations Using Deep Learning and Physical Constraints." Applied Sciences 10, no. 17 (2020): 5917. http://dx.doi.org/10.3390/app10175917.

Abstract:
The various studies of partial differential equations (PDEs) are hot topics of mathematical research. Among them, solving PDEs is a very important and difficult task. Since many partial differential equations do not have analytical solutions, numerical methods are widely used to solve PDEs. Although numerical methods have been widely used with good performance, researchers are still searching for new methods for solving partial differential equations. In recent years, deep learning has achieved great success in many fields, such as image classification and natural language processing. Studies
18

Ren, Zijie. "Advancements of Exploiting Convolutional Neural Networks for Solving Differential Equations." Applied and Computational Engineering 94, no. 1 (2024): 190–96. http://dx.doi.org/10.54254/2755-2721/94/2024melb0070.

Abstract:
Solving Partial Differential Equations (PDEs) is essential across various fields, including physics, mathematics, and engineering. This study explores innovative methods for solving PDEs using Convolutional Neural Networks (CNNs). Traditionally, solving PDEs through numerical methods like finite difference or spectral approaches is computationally intensive, particularly when addressing high-dimensional problems and complex boundary conditions. CNNs, with their ability to handle spatial data efficiently, offer a promising alternative. This research evaluates different neural network architectu
19

Burlakov, Evgenii, Anna Oleynik, and Arcady Ponosov. "Travelling Waves in Neural Fields with Continuous and Discontinuous Neuronal Activation." Mathematics 13, no. 5 (2025): 701. https://doi.org/10.3390/math13050701.

Abstract:
The main object of our study is travelling waves in vast neuronal ensembles modelled using neural field equations. We obtained conditions that guarantee the existence of travelling wave solutions and their continuous dependence under the transition from sigmoidal neuronal activation functions to the Heaviside activation function. We, thus, filled the gap between the continuous and the discontinuous approaches to the formalization of the neuronal activation process in studies of travelling waves. We provided conditions for admissibility to operate with simple closed-form expressions for travell
20

Raissi, Maziar, Alireza Yazdani, and George Em Karniadakis. "Hidden fluid mechanics: Learning velocity and pressure fields from flow visualizations." Science 367, no. 6481 (2020): 1026–30. http://dx.doi.org/10.1126/science.aaw4741.

Abstract:
For centuries, flow visualization has been the art of making fluid motion visible in physical and biological systems. Although such flow patterns can be, in principle, described by the Navier-Stokes equations, extracting the velocity and pressure fields directly from the images is challenging. We addressed this problem by developing hidden fluid mechanics (HFM), a physics-informed deep-learning framework capable of encoding the Navier-Stokes equations into the neural networks while being agnostic to the geometry or the initial and boundary conditions. We demonstrate HFM for several physical an
21

Hou, Shubo, Wenchao Wu, and Xiuhong Hao. "Physics-informed neural network for simulating magnetic field of permanent magnet." Journal of Physics: Conference Series 2853, no. 1 (2024): 012018. http://dx.doi.org/10.1088/1742-6596/2853/1/012018.

Abstract:
With the rapid development of deep learning, its application to physical field simulation has attracted wide attention, and it has begun to lead a new paradigm of meshless simulation. In this paper, research based on physics-informed neural networks is carried out to solve partial differential equations related to the physical laws of electromagnetism, and the magnetic field simulation is realized. In this method, the governing equation and the boundary conditions containing physical information are embedded into the neural network loss function as constraints, and the backpropagation of n
22

Kwessi, Eddy. "A Consistent Estimator of Nontrivial Stationary Solutions of Dynamic Neural Fields." Stats 4, no. 1 (2021): 122–37. http://dx.doi.org/10.3390/stats4010010.

Abstract:
Dynamics of neural fields are tools used in neurosciences to understand the activities generated by large ensembles of neurons. They are also used in network analysis and neuroinformatics, in particular to model a continuum of neural networks. They are mathematical models that describe the average behavior of these congregations of neurons, which are often in large numbers, even in small cortexes of the brain. Therefore, changes in average activity (potential, connectivity, firing rate, etc.) are described using systems of partial differential equations. In their continuous or discrete forms, thes
23

Pang, Xue, Jian Wang, Faliang Yin, and Jun Yao. "Construction of elliptic stochastic partial differential equations solver in groundwater flow with convolutional neural networks." Journal of Physics: Conference Series 2083, no. 4 (2021): 042064. http://dx.doi.org/10.1088/1742-6596/2083/4/042064.

Abstract:
Elliptic stochastic partial differential equations (SPDEs) play an indispensable role in mathematics, engineering and other fields, and their solution methods emerge endlessly with the progress of science and technology. In this paper, we make use of convolutional neural networks (CNNs), which are widely used in machine learning, to construct a solver for SPDEs. The SPDEs with Neumann boundary conditions are considered, and two CNNs are employed. One is used to deal with the essential equation, and the other satisfies the boundary conditions. With the help of the length factor, t
24

Bäker, M., T. Kalkreuter, G. Mack, and M. Speh. "Neural Multigrid Methods for Gauge Theories and Other Disordered Systems." International Journal of Modern Physics C 04, no. 02 (1993): 239–47. http://dx.doi.org/10.1142/s0129183193000252.

Abstract:
We present evidence that multigrid works for wave equations in disordered systems, e.g. in the presence of gauge fields, no matter how strong the disorder, but one needs to introduce a "neural computations" point of view into large scale simulations: First, the system must learn how to do the simulations efficiently, then do the simulation (fast). The method can also be used to provide smooth interpolation kernels which are needed in multigrid Monte Carlo updates.
25

Di Carlo, D., D. Heitz, and T. Corpetti. "Post Processing Sparse And Instantaneous 2D Velocity Fields Using Physics-Informed Neural Networks." Proceedings of the International Symposium on the Application of Laser and Imaging Techniques to Fluid Mechanics 20 (July 11, 2022): 1–11. http://dx.doi.org/10.55037/lxlaser.20th.183.

Abstract:
This work tackles the problem of resolving high-resolution velocity fields from a set of sparse off-grid observations. We follow the framework of Physics-Informed Neural Networks, where simple Multi-Layer Perceptrons (MLPs) are trained to solve partial differential equations (PDEs). In contrast with other state-of-the-art methods based on Convolutional Neural Networks, these models can be applied to super-resolve sparse Lagrangian velocity measurements. Moreover, such a framework can be easily extended to output divergence-free quantities and offer simple implementation of prior physical as regula
26

Peng, Liangrong, and Liu Hong. "Recent Advances in Conservation–Dissipation Formalism for Irreversible Processes." Entropy 23, no. 11 (2021): 1447. http://dx.doi.org/10.3390/e23111447.

Abstract:
The main purpose of this review is to summarize the recent advances of the Conservation–Dissipation Formalism (CDF), a new way for constructing both thermodynamically compatible and mathematically stable and well-posed models for irreversible processes. The contents include but are not restricted to the CDF’s physical motivations, mathematical foundations, formulations of several classical models in mathematical physics from master equations and Fokker–Planck equations to Boltzmann equations and quasi-linear Maxwell equations, as well as novel applications in the fields of non-Fourier heat con
27

Li, Zhenyu. "A Review of Physics-Informed Neural Networks." Applied and Computational Engineering 133, no. 1 (2025): 165–73. https://doi.org/10.54254/2755-2721/2025.20636.

Abstract:
This article presents Physics-Informed Neural Networks (PINNs), which integrate physical laws into neural network training to model complex systems governed by partial differential equations (PDEs). PINNs enhance data efficiency, allowing for accurate predictions with less training data, and have applications in fields such as biomedical engineering, geophysics, and material science. Despite their advantages, PINNs face challenges like learning high-frequency components and computational overhead. Proposed solutions include causality constraints and improved boundary condition handling. A nume
28

Aqil, Marco, Selen Atasoy, Morten L. Kringelbach, and Rikkert Hindriks. "Graph neural fields: A framework for spatiotemporal dynamical models on the human connectome." PLOS Computational Biology 17, no. 1 (2021): e1008310. http://dx.doi.org/10.1371/journal.pcbi.1008310.

Abstract:
Tools from the field of graph signal processing, in particular the graph Laplacian operator, have recently been successfully applied to the investigation of structure-function relationships in the human brain. The eigenvectors of the human connectome graph Laplacian, dubbed “connectome harmonics”, have been shown to relate to the functionally relevant resting-state networks. Whole-brain modelling of brain activity combines structural connectivity with local dynamical models to provide insight into the large-scale functional organization of the human brain. In this study, we employ the graph La
29

Hu, Beichao, and Dwayne McDaniel. "Applying Physics-Informed Neural Networks to Solve Navier–Stokes Equations for Laminar Flow around a Particle." Mathematical and Computational Applications 28, no. 5 (2023): 102. http://dx.doi.org/10.3390/mca28050102.

Abstract:
In recent years, Physics-Informed Neural Networks (PINNs) have drawn great interest among researchers as a tool to solve computational physics problems. Unlike conventional neural networks, which are black-box models that “blindly” establish a correlation between input and output variables using a large quantity of labeled data, PINNs directly embed physical laws (primarily partial differential equations) within the loss function of neural networks. By minimizing the loss function, this approach allows the output variables to automatically satisfy physical equations without the need for labele
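The loss-minimization idea these PINN abstracts describe can be reduced to a minimal, dependency-free sketch. Here the "network" is just a degree-4 polynomial (an illustrative stand-in, with assumed collocation points and weighting), so minimizing the squared ODE residual of u' + u = 0 plus a boundary penalty enforcing u(0) = 1 becomes a single least-squares solve; the result approximates the exact solution e^(−x):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 50)        # collocation points (assumed)
K = 5                                # degree-4 polynomial => 5 coefficients
Phi = np.stack([x**k for k in range(K)], axis=1)                 # u(x_i)
dPhi = np.stack([k * x**(k - 1) if k else np.zeros_like(x)
                 for k in range(K)], axis=1)                     # u'(x_i)

lam = 10.0                           # boundary-condition weight (assumed)
A = np.vstack([dPhi + Phi,           # physics rows: ODE residual u' + u ~ 0
               lam * np.eye(1, K)])  # data row: u(0) = w_0 ~ 1
b = np.concatenate([np.zeros_like(x), [lam]])
w, *_ = np.linalg.lstsq(A, b, rcond=None)

u1 = float(Phi[-1] @ w)              # prediction at x = 1
print(u1, float(np.exp(-1.0)))       # close to the exact value e^(-1)
```

A real PINN replaces the polynomial with a neural network and the least-squares solve with gradient descent, but the loss has the same two-part structure: PDE residual at collocation points plus boundary/data penalties.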
30

Shinde, Rajwardhan, Onkar Dherange, Rahul Gavhane, Hemant Koul, and Nilam Patil. "Handwritten Mathematical Equation Solver." International Journal of Engineering Applied Sciences and Technology 6, no. 10 (2022): 146–49. http://dx.doi.org/10.33564/ijeast.2022.v06i10.018.

Abstract:
With recent developments in artificial intelligence and deep learning, every major field that uses computers is trying to ease its work with deep learning methods. Deep learning is used in a wide range of fields due to its diverse range of applications, such as health, sports, robotics, and education. In deep learning, a convolutional neural network (CNN) is used in image classification, pattern recognition, text classification, face recognition, live monitoring systems, handwriting recognition, digit recognition, etc. In this paper, we propose a system for educati
31

Liu, Xiangdong, and Yu Gu. "Study of Pricing of High-Dimensional Financial Derivatives Based on Deep Learning." Mathematics 11, no. 12 (2023): 2658. http://dx.doi.org/10.3390/math11122658.

Abstract:
Many problems in the fields of finance and actuarial science can be transformed into the problem of solving backward stochastic differential equations (BSDE) and partial differential equations (PDEs) with jumps, which are often difficult to solve in high-dimensional cases. To solve this problem, this paper applies the deep learning algorithm to solve a class of high-dimensional nonlinear partial differential equations with jump terms and their corresponding backward stochastic differential equations (BSDEs) with jump terms. Using the nonlinear Feynman-Kac formula, the problem of solving this k
32

Chen, Yuxuan, Ce Wang, Yuan Hui, Nirav Vasant Shah, and Mark Spivack. "Surface Profile Recovery from Electromagnetic Fields with Physics-Informed Neural Networks." Remote Sensing 16, no. 22 (2024): 4124. http://dx.doi.org/10.3390/rs16224124.

Abstract:
Physics-informed neural networks (PINN) have shown their potential in solving both direct and inverse problems of partial differential equations. In this paper, we introduce a PINN-based deep learning approach to reconstruct one-dimensional rough surfaces from field data illuminated by an electromagnetic incident wave. In the proposed algorithm, the rough surface is approximated by a neural network, with which the spatial derivatives of surface function can be obtained via automatic differentiation, and then the scattered field can be calculated using the method of moments. The neural network
33

Jeon, Mingyu, Hyun-Jin Jeong, Yong-Jae Moon, Jihye Kang, and Kanya Kusano. "Real-time Extrapolation of Nonlinear Force-free Fields from Photospheric Vector Magnetic Fields Using a Physics-informed Neural Operator." Astrophysical Journal Supplement Series 277, no. 2 (2025): 54. https://doi.org/10.3847/1538-4365/adbaea.

Abstract:
In this study, we develop a physics-informed neural operator (PINO) model that learns the solution operator from 2D photospheric vector magnetic fields to 3D nonlinear force-free fields (NLFFFs). We train our PINO model using physics loss from NLFFF partial differential equations, as well as data loss from target NLFFFs. We validate our method using an analytical NLFFF model. We then train and evaluate our PINO model with 2327 numerical NLFFFs of 211 active regions from the Institute for Space-Earth Environmental Research database. The results show that our trained PINO model can gene
34

Yang, Zhou, Yuwang Xu, Jionglin Jing, et al. "Investigation of Physics-Informed Neural Networks to Reconstruct a Flow Field with High Resolution." Journal of Marine Science and Engineering 11, no. 11 (2023): 2045. http://dx.doi.org/10.3390/jmse11112045.

Abstract:
Particle image velocimetry (PIV) is a widely used experimental technique in ocean engineering, for instance, to study the vortex fields near marine risers and the wake fields behind wind turbines or ship propellers. However, the flow fields measured using PIV in water tanks or wind tunnels always have low resolution; hence, it is difficult to accurately reveal the mechanics behind the complex phenomena sometimes observed. In this paper, physics-informed neural networks (PINNs), which introduce the Navier–Stokes equations or the continuity equation into the loss function during training to reco
35

Jain, Kirti Kumar, Sarla Raigar, Harsha Tavse, and Manoj Sharma. "Leveraging Artificial Intelligence for the Solution of Differential Equations: A Novel Approach." International Scientific Journal of Engineering and Management 04, no. 03 (2025): 1–6. https://doi.org/10.55041/isjem02355.

Abstract:
Differential equations are fundamental to the modeling of dynamical systems in various scientific fields, yet solving them analytically remains challenging in many cases. This paper explores the application of artificial intelligence (AI) methods, particularly machine learning algorithms, to solve complex differential equations. We propose a new approach using deep learning models such as neural networks to approximate solutions to both ordinary and partial differential equations without the need for closed-form analytical solutions. The models are trained using datasets generated from known s
36

Sharma, Nishchal. "Deep Learning for Solving Partial Differential Equations: A Review of Literature." International Journal for Research in Applied Science and Engineering Technology 12, no. 10 (2024): 588–91. http://dx.doi.org/10.22214/ijraset.2024.64623.

Abstract:
Partial Differential Equations (PDEs) are fundamental in modeling various phenomena in physics, engineering, and finance. Traditional numerical methods for solving PDEs, such as finite element and finite difference methods, often face limitations when applied to high-dimensional and complex systems. In recent years, deep learning has emerged as a promising alternative for approximating solutions to PDEs, offering potential improvements in both efficiency and scalability. This paper provides a comprehensive review of the literature on deep learning-based methods for solving PDEs, focusing on ke
37

Ta, Hoa, Shi Wen Wong, Nathan McClanahan, Jung-Han Kimn, and Kaiqun Fu. "Exploration on Physics-Informed Neural Networks on Partial Differential Equations (Student Abstract)." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 13 (2023): 16344–45. http://dx.doi.org/10.1609/aaai.v37i13.27032.

Abstract:
Data-driven related solutions are dominating various scientific fields with the assistance of machine learning and data analytics. Finding effective solutions has long been discussed in the area of machine learning. The recent decade has witnessed the promising performance of the Physics-Informed Neural Networks (PINN) in bridging the gap between real-world scientific problems and machine learning models. In this paper, we explore the behavior of PINN in a particular range of different diffusion coefficients under specific boundary conditions. In addition, different initial conditions of parti
38

Alkhezi, Yousuf, Yousuf Almubarak, and Ahmad Shafee. "Neural-network-based approximations for investigating a Pantograph delay differential equation with application in Algebra." International Journal of Mathematics and Computer Science 20, no. 1 (2024): 195–209. http://dx.doi.org/10.69793/ijmcs/01.2025/ahmad.

Abstract:
Delay differential equations (DDE) have applications in many different scientific disciplines and show up in mathematical models of processes that evolve through time, where the rate of evolution is conditional on both the present and past states of the process. They also have significant applications in Algebra, where they are used to solve problems involving sequences and series, as well as in the analysis of algorithms and computational methods. Extensive new studies in fields as disparate as biology, economics, and physics all point to the importance of DDEs. When models based on ordinary
APA, Harvard, Vancouver, ISO, and other styles
39

Liu, Ming, Si-Qi Zhang, and Hong Li. "Dynamic Analysis of Shortcuts to Adiabaticity Based on Physical Information Neural Network." Acta Physica Sinica 74, no. 11 (2025): 0. https://doi.org/10.7498/aps.74.20250147.

Full text
Abstract:
This paper proposes a quantum shortcuts-to-adiabaticity scheme based on physics-informed neural networks. Compared with traditional shortcuts-to-adiabaticity techniques, our approach innovatively integrates machine learning methodologies by employing parameterized physics-informed neural networks to solve parameterized differential equations. The neural network serves as an approximating function for quantum adiabatic evolution processes, while incorporating parameter-dependent differential equations and various physical constraints as components of the loss function. Through network trainin
APA, Harvard, Vancouver, ISO, and other styles
40

Atalay, Volkan, and Erol Gelenbe. "Parallel Algorithm for Colour Texture Generation Using the Random Neural Network Model." International Journal of Pattern Recognition and Artificial Intelligence 06, no. 02n03 (1992): 437–46. http://dx.doi.org/10.1142/s0218001492000266.

Full text
Abstract:
We propose a parallel algorithm for the generation of colour textures based upon the non-linear equations of the "multiple class random neural network model". A neuron is used to obtain the texture value of each pixel in the bit-map plane. Each neuron interacts with its immediate planar neighbours in order to obtain the texture for the whole plane. A model which uses at most 4(C² + C) parameters for the whole network, where C is the number of colours, is proposed. Numerical iterations of the non-linear field equations of the neural network model, starting with a randomly generated image, are s
APA, Harvard, Vancouver, ISO, and other styles
41

Schaback, Robert, and Holger Wendland. "Kernel techniques: From machine learning to meshless methods." Acta Numerica 15 (May 2006): 543–639. http://dx.doi.org/10.1017/s0962492906270016.

Full text
Abstract:
Kernels are valuable tools in various fields of numerical analysis, including approximation, interpolation, meshless methods for solving partial differential equations, neural networks, and machine learning. This contribution explains why and how kernels are applied in these disciplines. It uncovers the links between them, in so far as they are related to kernel techniques. It addresses non-expert readers and focuses on practical guidelines for using kernels in applications.
APA, Harvard, Vancouver, ISO, and other styles
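As an illustrative aside (not drawn from the survey above), the kernel interpolation that Schaback and Wendland discuss can be sketched in a few lines. The Gaussian shape parameter, node count, and target function below are arbitrary choices for the sketch:

```python
import numpy as np

# Minimal sketch of kernel (radial basis function) interpolation:
# fit coefficients c so that sum_j c_j * K(x_i, x_j) = f(x_i) at the nodes,
# then evaluate the interpolant anywhere.

def gaussian_kernel(x, y, eps=7.0):
    """K(x, y) = exp(-(eps * |x - y|)^2), evaluated pairwise."""
    return np.exp(-((eps * (x[:, None] - y[None, :])) ** 2))

def kernel_interpolate(x_data, f_data, x_query, eps=7.0):
    """Solve the symmetric kernel system K c = f, then evaluate at x_query."""
    K = gaussian_kernel(x_data, x_data, eps)
    coeffs = np.linalg.solve(K, f_data)
    return gaussian_kernel(x_query, x_data, eps) @ coeffs

x = np.linspace(0.0, 1.0, 9)        # interpolation nodes
f = np.sin(2.0 * np.pi * x)         # samples of the target function
xq = np.linspace(0.0, 1.0, 101)     # evaluation grid
approx = kernel_interpolate(x, f, xq)

# By construction, the interpolant reproduces the data at the nodes.
print(np.max(np.abs(kernel_interpolate(x, f, x) - f)) < 1e-8)  # True
```

The same kernel-system structure underlies meshless collocation methods for PDEs; only the right-hand side and the operator applied to the kernel change.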
42

Baazeem, Amani S., Muhammad Shoaib Arif, and Kamaleldin Abodayeh. "An Efficient and Accurate Approach to Electrical Boundary Layer Nanofluid Flow Simulation: A Use of Artificial Intelligence." Processes 11, no. 9 (2023): 2736. http://dx.doi.org/10.3390/pr11092736.

Full text
Abstract:
Engineering and technological research groups are becoming interested in neural network techniques to improve productivity, business strategies, and societal development. In this paper, an explicit numerical scheme is given for both linear and nonlinear differential equations. The scheme is accurate to second order, and its consistency and stability are guaranteed. With Levenberg–Marquardt backpropagation, the effect of including an induced magnetic field in a mathematical model for electrical boundary layer nanofluid flow on a flat plate is quantitatively investigated using a
APA, Harvard, Vancouver, ISO, and other styles
43

Chen, Simin, Zhixiang Liu, Wenbo Zhang, and Jinkun Yang. "A Hard-Constraint Wide-Body Physics-Informed Neural Network Model for Solving Multiple Cases in Forward Problems for Partial Differential Equations." Applied Sciences 14, no. 1 (2023): 189. http://dx.doi.org/10.3390/app14010189.

Full text
Abstract:
In the fields of physics and engineering, it is crucial to understand phase transition dynamics. This field involves fundamental partial differential equations (PDEs) such as the Allen–Cahn, Burgers, and two-dimensional (2D) wave equations. In alloys, the evolution of the phase transition interface is described by the Allen–Cahn equation. Vibrational and wave phenomena during phase transitions are modeled using the Burgers and 2D wave equations. The combination of these equations gives comprehensive information about the dynamic behavior during a phase transition. Numerical modeling methods su
APA, Harvard, Vancouver, ISO, and other styles
44

Williams, Kyle, Stephen Rudin, Daniel Bednarek, et al. "226 Advancing Neurovascular Diagnostics for Abnormal Hemodynamic Conditions Through AI-Driven Physics-informed Neural Networks." Neurosurgery 70, Supplement_1 (2024): 61. http://dx.doi.org/10.1227/neu.0000000000002809_226.

Full text
Abstract:
INTRODUCTION: Many studies have explored the application of machine learning and neural networks in extracting critical diagnostic information from structural and functional medical imaging. While these methods show potential for improving efficiency, concerns arise when interpreting subtle imaging features, especially when training data is limited. Physics-informed neural networks (PINNs) address these issues by incorporating governing equations from physical and mechanical models into the analysis. METHODS: We examined the use of a PINN that enforces the convection equation, relating contras
APA, Harvard, Vancouver, ISO, and other styles
45

Touboul, Jonathan. "Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions." Physica D: Nonlinear Phenomena 241, no. 15 (2012): 1223–44. http://dx.doi.org/10.1016/j.physd.2012.03.010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Atalay, Volkan, Erol Gelenbe, and Nese Yalabik. "The Random Neural Network Model for Texture Generation." International Journal of Pattern Recognition and Artificial Intelligence 06, no. 01 (1992): 131–41. http://dx.doi.org/10.1142/s0218001492000072.

Full text
Abstract:
The generation of artificial textures is a useful function in image synthesis systems. The purpose of this paper is to describe the use of the random neural network (RN) model developed by Gelenbe to generate various textures having different characteristics. An eight-parameter model, based on a choice of the local interaction parameters between neighbouring neurons in the plane, is proposed. Numerical iterations of the field equations of the neural network model, starting with a randomly generated gray-level image, are shown to produce textures having different desirable features such as granu
APA, Harvard, Vancouver, ISO, and other styles
47

Jakeer, Shaik, Seethi Reddy Reddisekhar Reddy, Sathishkumar Veerappampalayam Easwaramoorthy, Hayath Thameem Basha, and Jaehyuk Cho. "Exploring the Influence of Induced Magnetic Fields and Double-Diffusive Convection on Carreau Nanofluid Flow through Diverse Geometries: A Comparative Study Using Numerical and ANN Approaches." Mathematics 11, no. 17 (2023): 3687. http://dx.doi.org/10.3390/math11173687.

Full text
Abstract:
This investigation explores the significance of induced magnetic fields and double-diffusive convection in the radiative flow of Carreau nanofluid through three distinct geometries. To simplify the fluid transport equations, appropriate self-similarity variables were employed, converting them into ordinary differential equations. These equations were subsequently solved using the Runge–Kutta–Fehlberg (RKF) method. Through graphs and tables, the study demonstrates how various dynamic factors influence the fluid's transport characteristics. Additiona
APA, Harvard, Vancouver, ISO, and other styles
48

Ara, Asmat, Oyoon Abdul Razzaq, and Najeeb Alam Khan. "A Single Layer Functional Link Artificial Neural Network based on Chebyshev Polynomials for Neural Evaluations of Nonlinear Nth Order Fuzzy Differential Equations." Annals of West University of Timisoara - Mathematics and Computer Science 56, no. 1 (2018): 3–22. http://dx.doi.org/10.2478/awutm-2018-0001.

Full text
Abstract:
Bearing in mind the considerable importance of fuzzy differential equations (FDEs) in different fields of science and engineering, in this paper, nonlinear nth-order FDEs are approximated heuristically. The analysis is carried out using a Chebyshev neural network (ChNN), which is a type of single-layer functional link artificial neural network (FLANN). Besides, an explication of generalized Hukuhara differentiability (gH-differentiability) is also added for the nth-order differentiability of fuzzy-valued functions. Moreover, a general formulation of the structure of the ChNN for the governin
APA, Harvard, Vancouver, ISO, and other styles
49

Pioch, Fabian, Jan Hauke Harmening, Andreas Maximilian Müller, Franz-Josef Peitzmann, Dieter Schramm, and Ould el Moctar. "Turbulence Modeling for Physics-Informed Neural Networks: Comparison of Different RANS Models for the Backward-Facing Step Flow." Fluids 8, no. 2 (2023): 43. http://dx.doi.org/10.3390/fluids8020043.

Full text
Abstract:
Physics-informed neural networks (PINNs) can be used to predict flow fields with a minimum of simulated or measured training data. As most technical flows are turbulent, PINNs based on the Reynolds-averaged Navier–Stokes (RANS) equations incorporating a turbulence model are needed. Several studies have demonstrated the capability of PINNs to solve the Navier–Stokes equations for laminar flows. However, little work has been published concerning the application of PINNs to solve the RANS equations for turbulent flows. This study applied a RANS-based PINN approach to a backward-facing step flow at a Rey
APA, Harvard, Vancouver, ISO, and other styles
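Several entries above concern physics-informed neural networks. As a generic illustration (not the method of any one paper here), the core PINN idea of penalizing a PDE residual at collocation points can be sketched with a fixed candidate function in place of a trainable network; the equation, points, and candidates below are arbitrary choices for the sketch:

```python
import numpy as np

# PDE: u''(x) + pi^2 * u(x) = 0 on (0, 1), solved exactly by u(x) = sin(pi*x).
# A PINN would minimize this residual loss over network weights; here we just
# evaluate it for two fixed candidate functions to show the loss behaves as
# expected.

def pde_residual_loss(u, x, h=1e-4):
    """Mean squared residual of u'' + pi^2 u, with u'' by central differences."""
    u_xx = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    residual = u_xx + np.pi**2 * u(x)
    return float(np.mean(residual**2))

x = np.linspace(0.1, 0.9, 50)        # interior collocation points
good = lambda t: np.sin(np.pi * t)   # exact solution -> near-zero loss
bad = lambda t: t                    # u'' = 0, so residual = pi^2 * t

print(pde_residual_loss(good, x) < 1e-4)  # True
print(pde_residual_loss(bad, x) > 1.0)    # True
```

In an actual PINN, `good` would be replaced by a neural network evaluated by automatic differentiation, and boundary-condition terms would be added to the loss.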
50

Borghi, Giacomo, Elisa Iacomini, Mathias Oster, and Chiara Segala. "Mini-Workshop: High-Dimensional Control Problems and Mean-Field Equations with Applications in Machine Learning." Oberwolfach Reports 21, no. 4 (2025): 3211–54. https://doi.org/10.4171/owr/2024/56.

Full text
Abstract:
High-dimensional control problems and mean-field equations have attracted increased interest in recent years, and novel numerical tools tackling the curse of dimensionality have been developed. These optimization tasks are strongly related to learning problems such as data-driven optimal control and the training of deep neural networks. As a consequence, there is great potential to employ control-theoretic techniques in machine learning. The Mini-Workshop was devoted to discussing possible synergies among the various tools developed in these fields.
APA, Harvard, Vancouver, ISO, and other styles