Journal articles on the topic 'Hopfield Neural Networks'

Consult the top 50 journal articles for your research on the topic 'Hopfield Neural Networks.'


1

Kobayashi, Masaki. "Storage Capacities of Twin-Multistate Quaternion Hopfield Neural Networks." Computational Intelligence and Neuroscience 2018 (November 1, 2018): 1–5. http://dx.doi.org/10.1155/2018/1275290.

Abstract:
A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model and can store multilevel information, such as image data. Storage capacity is an important problem of Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by the 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.
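The crosstalk analysis this abstract builds on is easiest to see in the classical bipolar Hopfield network (a simplification for illustration; the twin-multistate quaternion model is more involved). Under Hebbian storage, each neuron's local field splits into a signal term plus a crosstalk term, and recall stays stable while the crosstalk is small relative to the signal. A minimal sketch, with arbitrary illustrative sizes:

```python
import numpy as np

def hebbian_weights(patterns):
    """W = (1/N) * sum_p x_p x_p^T, with zero self-feedback on the diagonal."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def crosstalk(patterns, w, mu):
    """Local field minus the signal term: the noise that causes recall errors."""
    x = patterns[mu]
    signal = (1 - 1 / patterns.shape[1]) * x   # contribution of pattern mu itself
    return w @ x - signal

rng = np.random.default_rng(0)
n, p = 300, 5                       # load p/n far below the ~0.138 capacity limit
patterns = rng.choice([-1.0, 1.0], size=(p, n))
w = hebbian_weights(patterns)

# At this low load, sign(Wx) == x: every stored pattern is a fixed point.
stable = all(np.array_equal(np.sign(w @ x), x) for x in patterns)
print(stable)
```

As the number of stored patterns grows, the variance of the crosstalk term grows with p/n, which is exactly what limits storage capacity in this family of models.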
2

Wilson, Robert C. "Parallel Hopfield Networks." Neural Computation 21, no. 3 (2009): 831–50. http://dx.doi.org/10.1162/neco.2008.03-07-496.

Abstract:
We introduce a novel type of neural network, termed the parallel Hopfield network, that can simultaneously effect the dynamics of many different, independent Hopfield networks in parallel in the same piece of neural hardware. Numerically we find that under certain conditions, each Hopfield subnetwork has a finite memory capacity approaching that of the equivalent isolated attractor network, while a simple signal-to-noise analysis sheds qualitative, and some quantitative, insight into the workings (and failures) of the system.
3

Kobayashi, Masaki. "Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks." Neural Computation 32, no. 11 (2020): 2237–48. http://dx.doi.org/10.1162/neco_a_01320.

Abstract:
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters require large memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks, but their architectures are much more complicated than that of a CHNN and should be simplified. In this work, the number of weight parameters is reduced by the bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.
4

Ismailov, Mirxalil, Davron Ziyadullaev, Dilnoz Muhamediyeva, Rano Gazieva, Aksulu Dzholdasbaeva, and Sharofiddin Aynaqulov. "Intelligent algorithms of construction of public transport routes." E3S Web of Conferences 365 (2023): 01002. http://dx.doi.org/10.1051/e3sconf/202336501002.

Abstract:
In present-day public transport planning systems, finding a feasible route in minimum time is a relevant problem. The aim of this work is to develop intelligent algorithms for constructing public transport routes, implement them in software, and carry out a computational experiment. The research method is the theory of neural networks. The paper considers Hopfield neural networks and proposes a recurrent neural network. In Hopfield neural networks, however, the chance of solving this optimization problem decreases as the matrix size increases. To address this, a recurrent neural network represented by a differential equation is proposed. As a result, the number of iterative computations can be reduced by a factor of n² compared with the Hopfield network.
5

Feild, William B., and Jainendra K. Navlakha. "On Hopfield neural networks." Neural Networks 1 (January 1988): 21. http://dx.doi.org/10.1016/0893-6080(88)90063-9.

6

Kobayashi, M. "Hyperbolic Hopfield Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 24, no. 2 (2013): 335–41. http://dx.doi.org/10.1109/tnnls.2012.2230450.

7

Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks." Journal of Physics A: Mathematical and General 36, no. 9 (2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

8

Lai, Xiao Feng, Xing Yin, and Hai Yang Zou. "Stability Criterion of Discrete Hopfield Neural Networks with Multiple Delays in Parallel Mode: Linear Matrix Inequality." Advanced Materials Research 225-226 (April 2011): 1270–73. http://dx.doi.org/10.4028/www.scientific.net/amr.225-226.1270.

Abstract:
In this paper, discrete Hopfield neural networks with multiple delays are introduced, and a stability criterion for discrete Hopfield neural networks with multiple delays is established via a linear matrix inequality. This provides a theoretical basis for the application of discrete Hopfield neural networks with multiple delays.
9

Fan, Tongke. "Research on Optimal CDMA Multiuser Detection Based on Stochastic Hopfield Neural Network." Recent Patents on Computer Science 12, no. 3 (2019): 233–40. http://dx.doi.org/10.2174/2213275912666181210103742.

Abstract:
Background: Most common multi-user detection techniques suffer from heavy computation and slow operation. Hopfield neural networks offer high-speed search and parallel processing, but suffer from convergence to local minima. Objective: The stochastic Hopfield neural network avoids local convergence by introducing noise into the state variables and thereby achieves optimal detection. Methods: Based on a study of the CDMA communication model, this paper formulates and models the multi-user detection problem. A new stochastic Hopfield neural network is then obtained by introducing a stochastic disturbance into the traditional Hopfield neural network. Finally, CDMA multi-user detection is simulated. Conclusion: The results show that introducing a stochastic disturbance into the Hopfield neural network helps it escape local minima, thus reaching the global minimum and improving the performance of the neural network.
10

YAN, JUN-JUH, TEH-LU LIAO, JUI-SHENG LIN, and CHAO-JUNG CHENG. "SYNCHRONIZATION CONTROL OF NEURAL NETWORKS SUBJECT TO TIME-VARYING DELAYS AND INPUT NONLINEARITY." International Journal of Bifurcation and Chaos 16, no. 12 (2006): 3643–54. http://dx.doi.org/10.1142/s0218127406017038.

Abstract:
This paper investigates the synchronization problem for a particular class of neural networks subject to time-varying delays and input nonlinearity. Using the variable structure control technique, a memoryless decentralized control law is established which guarantees exponential synchronization even when input nonlinearity is present. The proposed controller is suitable for application in delayed cellular neural networks and Hopfield neural networks with no restriction on the derivative of the time-varying delays. A two-dimensional cellular neural network and a four-dimensional Hopfield neural network, both with time-varying delays, are presented as illustrative examples to demonstrate the effectiveness of the proposed synchronization scheme.
11

SUKHASWAMI, M. B., P. SEETHARAMULU, and ARUN K. PUJARI. "RECOGNITION OF TELUGU CHARACTERS USING NEURAL NETWORKS." International Journal of Neural Systems 06, no. 03 (1995): 317–57. http://dx.doi.org/10.1142/s0129065795000238.

Abstract:
The aim of the present work is to recognize printed and handwritten Telugu characters using artificial neural networks (ANNs). Earlier work on recognition of Telugu characters has been done using conventional pattern recognition techniques. We make an initial attempt here of using neural networks for recognition with the aim of improving upon earlier methods which do not perform effectively in the presence of noise and distortion in the characters. The Hopfield model of neural network working as an associative memory is chosen for recognition purposes initially. Due to limitation in the capacity of the Hopfield neural network, we propose a new scheme named here as the Multiple Neural Network Associative Memory (MNNAM). The limitation in storage capacity has been overcome by combining multiple neural networks which work in parallel. It is also demonstrated that the Hopfield network is suitable for recognizing noisy printed characters as well as handwritten characters written by different “hands” in a variety of styles. Detailed experiments have been carried out using several learning strategies and results are reported. It is shown here that satisfactory recognition is possible using the proposed strategy. A detailed preprocessing scheme of the Telugu characters from digitized documents is also described.
12

Singh, Kirti, Suju M. George, and P. Rambabu. "Use of a System of Recurrent Neural Networks for Solving the Cell Placement Problem in VLSI Design." International Journal on Artificial Intelligence Tools 06, no. 01 (1997): 15–35. http://dx.doi.org/10.1142/s0218213097000037.

Abstract:
Cell placement in VLSI design is an NP-complete problem. In this paper, we have tried to solve the standard cell placement problem using the Hopfield neural network model. Furthermore, a new system of coupled recurrent neural networks, which was designed to eliminate the drawbacks of the Hopfield neural network, is introduced. The performance of Hopfield networks with discrete and graded neurons is also investigated. The energy function corresponding to the chosen representation is given and the weight matrix and the inputs needed for the network are also computed in this work. Several different combinations of parameters were examined to find the optimum set of parameters which results in better and faster convergence. To show the effectiveness of our methods, cell placement problems up to 30 cells are considered and the results are compared with the results obtained by a genetic algorithm. Our results show that a system of coupled neural networks could be an efficient way to overcome the limitation of recurrent neural networks which consider only bilinear forms of the energy function.
13

Wang, Li‐Xin, and Jerry M. Mendel. "Adaptive minimum prediction‐error deconvolution and source wavelet estimation using Hopfield neural networks." GEOPHYSICS 57, no. 5 (1992): 670–79. http://dx.doi.org/10.1190/1.1443281.

Abstract:
The massively parallel processing advantage of artificial neural networks makes them suitable for hardware implementations; therefore, using artificial neural networks for seismic signal processing problems has the potential of greatly speeding up seismic signal processing. A commonly used artificial neural network—Hopfield neural network—is used to implement a new adaptive minimum prediction‐error deconvolution (AMPED) procedure which decomposes deconvolution and wavelet estimation into three subprocesses: reflectivity location detection, reflectivity magnitude estimation, and source wavelet extraction. A random reflectivity model is not required. The basic idea of the approach is to relate the cost functions of the deconvolution and wavelet estimation problem with the energy functions of these Hopfield neural networks so that when these neural networks reach their stable states, for which the energy functions are locally minimized, the outputs of the networks give the solution to the deconvolution and wavelet estimation problem. Three Hopfield neural networks are constructed to implement the three subprocesses, respectively, and they are then connected in an iterative way to implement the entire deconvolution and wavelet estimation procedure. This approach is applied to synthetic and real seismic traces, and the results show that: (1) the Hopfield neural networks converge to their stable states in only one to four iterations; hence, this approach gives a solution to the deconvolution and wavelet estimation problem very quickly; (2) this approach works impressively well in the cases of low signal‐to‐noise ratio and nonminimum phase wavelets; and (3) this approach can treat backscatter either as noise or as useful signal.
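The core recipe this abstract describes, writing the cost to be minimized in the form of a Hopfield energy E(x) = -½ xᵀWx - bᵀx so that the network's stable states solve the optimization, can be sketched in miniature. The weights, sizes, and cost below are arbitrary illustrative choices, not the paper's deconvolution networks:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
W = rng.normal(size=(n, n))
W = (W + W.T) / 2          # symmetric weights
np.fill_diagonal(W, 0.0)   # zero self-feedback: both conditions needed for descent
b = rng.normal(size=n)

def energy(x):
    """Hopfield energy E(x) = -1/2 x^T W x - b^T x."""
    return -0.5 * x @ W @ x - b @ x

x = rng.choice([-1.0, 1.0], size=n)
energies = [energy(x)]
for _ in range(5):                      # a few asynchronous sweeps
    for i in range(n):
        # x_i <- sign(local field); each single-neuron update cannot raise E
        x[i] = 1.0 if W[i] @ x + b[i] >= 0 else -1.0
        energies.append(energy(x))

print(all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:])))
```

With symmetric weights and zero diagonal, each accepted flip changes the energy by a non-positive amount, so the network settles into a local minimum of E, which is the mechanism the paper exploits for deconvolution and wavelet estimation.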
14

Savin, Daniela, Sabah Alkass, and Paul Fazio. "Construction resource leveling using neural networks." Canadian Journal of Civil Engineering 23, no. 4 (1996): 917–25. http://dx.doi.org/10.1139/l96-898.

Abstract:
A neural network model for construction resource leveling is developed and discussed. The model is derived by mapping an augmented Lagrangian multiplier optimization formulation of a resource leveling problem onto a discrete-time Hopfield net. The resulting neural network model consists of two main blocks. Specifically, it consists of a discrete-time Hopfield neural network block, and a control block for the adjustment of Lagrange multipliers in the augmented Lagrangian multiplier optimization, and for the computation of the new set of weights of the neural network block. An experimental verification of the proposed artificial neural network model is also provided. Key words: neural networks in construction, resource leveling, construction management, project management.
15

KOBAYASHI, Masaki. "Global Hyperbolic Hopfield Neural Networks." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E99.A, no. 12 (2016): 2511–16. http://dx.doi.org/10.1587/transfun.e99.a.2511.

16

Leblebicioğlu, Kemal, Ugur Halici, and Okay Çelebi. "Infinite dimensional Hopfield neural networks." Nonlinear Analysis: Theory, Methods & Applications 47, no. 9 (2001): 5807–13. http://dx.doi.org/10.1016/s0362-546x(01)00710-6.

17

Kobayashi, Masaki. "Symmetric quaternionic Hopfield neural networks." Neurocomputing 240 (May 2017): 110–14. http://dx.doi.org/10.1016/j.neucom.2017.02.044.

18

Kobayashi, Masaki. "Diagonal rotor Hopfield neural networks." Neurocomputing 415 (November 2020): 40–47. http://dx.doi.org/10.1016/j.neucom.2020.07.041.

19

Kobayashi, Masaki. "Dual-numbered Hopfield neural networks." IEEJ Transactions on Electrical and Electronic Engineering 13, no. 2 (2017): 280–84. http://dx.doi.org/10.1002/tee.22524.

20

Zhang, Yu, Bin Chen, Lan Li, Yaoqun Xu, Sifan Wei, and Yu Wang. "The Effect of Blue Noise on the Optimization Ability of Hopfield Neural Network." Applied Sciences 13, no. 10 (2023): 6028. http://dx.doi.org/10.3390/app13106028.

Abstract:
Noise is ubiquitous in real-world environments. At present, most scholars apply only Gaussian white noise to neural networks and regard white noise merely as a tool to optimize the network model. This is far from sufficient, because noise not only affects the optimization ability of the Hopfield neural network but can also better fit actual usage scenarios. To address this gap, a method is proposed for combining the neural network with colored noise according to the signal-to-noise ratio. Taking blue noise as an example, the anti-interference ability of the Hopfield neural network with respect to colored noise is studied. The results show that for a Hopfield neural network driven by blue noise, adjusting the network step size, excitation function and signal-to-noise ratio not only provides ideas for adding colored noise to neural networks but also gives the network model better optimization-seeking ability. The research results offer some reference for improving the practical application of neural networks in noisy environments.
21

Qiu, Rong, Yujiao Dong, Xin Jiang, and Guangyi Wang. "Two-Neuron Based Memristive Hopfield Neural Network with Synaptic Crosstalk." Electronics 11, no. 19 (2022): 3034. http://dx.doi.org/10.3390/electronics11193034.

Abstract:
Synaptic crosstalk is an important biological phenomenon that widely exists in neural networks. The crosstalk can influence the ability of neurons to control the synaptic weights, thereby causing rich dynamics of neural networks. Based on the crosstalk between synapses, this paper presents a novel two-neuron based memristive Hopfield neural network with a hyperbolic memristor emulating synaptic crosstalk. The dynamics of the neural networks with varying memristive parameters and crosstalk weights are analyzed via the phase portraits, time-domain waveforms, bifurcation diagrams, and basin of attraction. Complex phenomena, especially coexisting dynamics, chaos and transient chaos emerge in the neural network. Finally, the circuit simulation results verify the effectiveness of theoretical analyses and mathematical simulation and further illustrate the feasibility of the two-neuron based memristive Hopfield neural network hardware.
22

Boykov, Ilya, Vladimir Roudnev, and Alla Boykova. "Approximate Methods for Solving Problems of Mathematical Physics on Neural Hopfield Networks." Mathematics 10, no. 13 (2022): 2207. http://dx.doi.org/10.3390/math10132207.

Abstract:
A Hopfield neural network is described by a system of nonlinear ordinary differential equations. We develop a broad range of numerical schemes that are applicable for a wide range of computational problems. We review here our study on an approximate solution of the Fredholm integral equation, and linear and nonlinear singular and hypersingular integral equations, using a continuous method for solving operator equations. This method assumes that the original system is associated with a Cauchy problem for systems of ordinary differential equations on Hopfield neural networks. We present sufficient conditions for the Hopfield networks’ stability defined via coefficients of systems of differential equations.
23

Kobayashi, Masaki. "Noise Robust Projection Rule for Klein Hopfield Neural Networks." Neural Computation 33, no. 6 (2021): 1698–716. http://dx.doi.org/10.1162/neco_a_01385.

Abstract:
Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. The CHNNs and QHNNs have weak noise tolerance by the inherent property of rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving rotational invariance. However, the KHNNs have another disadvantage of self-feedback, a major factor of deterioration in noise tolerance. In this work, the stability conditions of KHNNs are extended. Moreover, the projection rule for KHNNs is modified using the extended conditions. The proposed projection rule improves the noise tolerance by a reduction in self-feedback. Computer simulations support that the proposed projection rule improves the noise tolerance of KHNNs.
24

ISOKAWA, TEIJIRO, HARUHIKO NISHIMURA, NAOTAKE KAMIURA, and NOBUYUKI MATSUI. "ASSOCIATIVE MEMORY IN QUATERNIONIC HOPFIELD NEURAL NETWORK." International Journal of Neural Systems 18, no. 02 (2008): 135–45. http://dx.doi.org/10.1142/s0129065708001440.

Abstract:
Associative memory networks based on quaternionic Hopfield neural network are investigated in this paper. These networks are composed of quaternionic neurons, and input, output, threshold, and connection weights are represented in quaternions, which is a class of hypercomplex number systems. The energy function of the network and the Hebbian rule for embedding patterns are introduced. The stable states and their basins are explored for the networks with three neurons and four neurons. It is clarified that there exist at most 16 stable states, called multiplet components, as the degenerated stored patterns, and each of these states has its basin in the quaternionic networks.
25

Huang, Xia, Zhen Wang, and Yuxia Li. "Nonlinear Dynamics and Chaos in Fractional-Order Hopfield Neural Networks with Delay." Advances in Mathematical Physics 2013 (2013): 1–9. http://dx.doi.org/10.1155/2013/657245.

Abstract:
A fractional-order two-neuron Hopfield neural network with delay is proposed based on the classic well-known Hopfield neural networks, and further, the complex dynamical behaviors of such a network are investigated. A great variety of interesting dynamical phenomena, including single-periodic, multiple-periodic, and chaotic motions, are found to exist. The existence of chaotic attractors is verified by the bifurcation diagram and phase portraits as well.
26

Chen, Jing, Xiaodi Li, and Dequan Wang. "Asymptotic Stability and Exponential Stability of Impulsive Delayed Hopfield Neural Networks." Abstract and Applied Analysis 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/638496.

Abstract:
A criterion for the uniform asymptotic stability of the equilibrium point of impulsive delayed Hopfield neural networks is presented by using Lyapunov functions and linear matrix inequality approach. The criterion is a less restrictive version of a recent result. By means of constructing the extended impulsive Halanay inequality, we also analyze the exponential stability of impulsive delayed Hopfield neural networks. Some new sufficient conditions ensuring exponential stability of the equilibrium point of impulsive delayed Hopfield neural networks are obtained. An example showing the effectiveness of the present criterion is given.
27

Kobayashi, Masaki. "Fast Recall for Complex-Valued Hopfield Neural Networks with Projection Rules." Computational Intelligence and Neuroscience 2017 (2017): 1–6. http://dx.doi.org/10.1155/2017/4894278.

Abstract:
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, and CHNNs are available for the storage of multilevel data, such as gray-scale images. The CHNNs are often trapped into the local minima, and their noise tolerance is low. Lee improved the noise tolerance of the CHNNs by detecting and exiting the local minima. In the present work, we propose a new recall algorithm that eliminates the local minima. We show that our proposed recall algorithm not only accelerated the recall but also improved the noise tolerance through computer simulations.
28

Cojocaru, Andreea V., and Stefan Balint. "Neuro-Psychological Interpretations of Mathematical Results Reported in Case of Discrete-Time Hopfield Neural Networks." Journal of Genetic Engineering and Biotechnology Research 7, no. 1 (2025): 01–09. https://doi.org/10.33140/jgebr.07.01.05.

Abstract:
In this paper, a discrete-time Hopfield neural network is used as a mathematical description of the electrical phenomena (voltage states) appearing in the nervous system. The equilibrium states of the discrete-time Hopfield neural network are interpreted as equilibria of the nervous system. An equilibrium state whose steady state is locally exponentially stable is interpreted as a robust equilibrium of the nervous system, because after a small perturbation of the equilibrium steady state the network recovers the equilibrium. A path of equilibrium states whose steady states are locally exponentially stable is interpreted as a path of robust equilibria of the nervous system. This suggests a way, in healthcare, to transfer the nervous system gradually from a pathological robust equilibrium to a non-pathological robust equilibrium. For illustration, a computed transfer path is presented.
29

Kobayashi, Masaki. "Hyperbolic-Valued Hopfield Neural Networks in Synchronous Mode." Neural Computation 32, no. 9 (2020): 1685–96. http://dx.doi.org/10.1162/neco_a_01303.

Abstract:
For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If they were to converge in synchronous mode, recall would be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we provide stability conditions for hyperbolic Hopfield neural networks (HHNNs) in synchronous mode instead of CHNNs. HHNNs provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection rule, and HHNNs with a projection rule converge in synchronous mode. By computer simulations, we find that the projection rule for HHNNs in synchronous mode maintains a high noise tolerance.
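The synchronous/asynchronous distinction in this abstract is easy to see in the smallest classical case: with symmetric weights and zero self-feedback, asynchronous updates monotonically decrease the energy and converge, while synchronous updates may enter a period-2 cycle. A minimal two-neuron illustration (ordinary real-valued network, not the hyperbolic model of the paper):

```python
import numpy as np

W = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # symmetric, zero diagonal

def energy(x):
    return -0.5 * x @ W @ x

def sync_step(x):
    """Update all neurons at once."""
    return np.sign(W @ x)

def async_step(x, i):
    """Update only neuron i."""
    x = x.copy()
    x[i] = np.sign(W @ x)[i]
    return x

x = np.array([1.0, -1.0])
# Synchronous mode oscillates between (1,-1) and (-1,1): a period-2 cycle.
a = sync_step(x)              # (-1, 1)
b = sync_step(a)              # (1, -1) again
print(np.array_equal(b, x))   # True: no convergence in synchronous mode

# One asynchronous update reaches the fixed point (-1,-1) with lower energy.
y = async_step(x, 0)
print(energy(y) < energy(x))            # True: energy strictly decreases
print(np.array_equal(async_step(y, 1), y))  # True: (-1,-1) is a fixed point
```

This is why synchronous-mode stability conditions, such as those the paper establishes for hyperbolic-valued networks, need a separate proof: the asynchronous energy argument alone does not rule out such cycles.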
30

Sun, Yanxia, Zenghui Wang, and Barend Jacobus van Wyk. "Chaotic Hopfield Neural Network Swarm Optimization and Its Application." Journal of Applied Mathematics 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/873670.

Abstract:
A new neural network based optimization algorithm is proposed. The presented model is a discrete-time, continuous-state Hopfield neural network and the states of the model are updated synchronously. The proposed algorithm combines the advantages of traditional PSO, chaos and Hopfield neural networks: particles learn from their own experience and the experiences of surrounding particles, their search behavior is ergodic, and convergence of the swarm is guaranteed. The effectiveness of the proposed approach is demonstrated using simulations and typical optimization problems.
31

Pérez-Vicente, Conrad J. "Nobel prize in physics 2024: ideas that transcend physics." Europhysics News 56, no. 1 (2025): 13–14. https://doi.org/10.1051/epn/2025105.

Abstract:
The Hopfield model, a recurrent artificial neural network introduced in 1982, has had profound implications in computational neuroscience, synchronization phenomena, and artificial intelligence (AI). In computational neuroscience, it provides a framework for understanding associative memory, attractor dynamics, and error correction in biological neural systems. The model’s energy minimization properties have also been explored in synchronization studies, particularly in coupled oscillatory systems and network stability analysis. In AI, Hopfield networks have influenced optimization methods, constraint satisfaction problems, and modern deep learning architectures. This paper reviews the fundamental aspects of the Hopfield model and discusses its lasting impact across these domains.
32

Pei, Yangjun, Chao Liu, and Qi Han. "Stability of Delayed Hopfield Neural Networks with Variable-Time Impulses." Mathematical Problems in Engineering 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/154036.

Abstract:
In this paper the globally exponential stability criteria of delayed Hopfield neural networks with variable-time impulses are established. The proposed criteria can also be applied in Hopfield neural networks with fixed-time impulses. A numerical example is presented to illustrate the effectiveness of our theoretical results.
33

Peng, Hao Yue, and Guo Hao Zhao. "Resource Industry Operation Study: Cooperation, Constraint and Sustainability." Advanced Materials Research 524-527 (May 2012): 2971–76. http://dx.doi.org/10.4028/www.scientific.net/amr.524-527.2971.

Abstract:
Human history is a history of utilizing natural resources. With economic development, security risk management in the natural resources industry is increasingly becoming an urgent international issue. Viewed from a systems perspective, resources industry risk control is a complex system, full of energy flows that can be measured by entropy. The Hopfield neural network is an important neural network model, and its use places additional systematic directionality constraints on such risk control. The objective function and constraints of resources industry security risk control, expressed in terms of negative entropy, are optimized by the Hopfield neural network energy function. As an effective first attempt, some conclusions about reducing resources industry security risk are then drawn.
34

Jia, Zhifu, and Cunlin Li. "Almost Sure Exponential Stability of Uncertain Stochastic Hopfield Neural Networks Based on Subadditive Measures." Mathematics 11, no. 14 (2023): 3110. http://dx.doi.org/10.3390/math11143110.

Abstract:
In this paper, we consider the almost sure exponential stability of uncertain stochastic Hopfield neural networks based on subadditive measures. First, we deduce two corollaries using the Itô–Liu formula. We then introduce the concept of almost sure exponential stability for uncertain stochastic Hopfield neural networks and investigate it using the Lyapunov method, the Liu inequality, the Liu lemma, and the exponential martingale inequality. We prove two sufficient conditions for almost sure exponential stability. Furthermore, we consider stabilization under linear uncertain stochastic perturbation and present several examples. Finally, we state our conclusions.
35

Pu, Yi-Fei, Zhang Yi, and Ji-Liu Zhou. "Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 28, no. 10 (2017): 2319–33. http://dx.doi.org/10.1109/tnnls.2016.2582512.

36

Anzo-Hernández, Andrés, Ernesto Zambrano-Serrano, Miguel Angel Platas-Garza, and Christos Volos. "Dynamic Analysis and FPGA Implementation of Fractional-Order Hopfield Networks with Memristive Synapse." Fractal and Fractional 8, no. 11 (2024): 628. http://dx.doi.org/10.3390/fractalfract8110628.

Full text
Abstract:
Memristors have become important components in artificial synapses due to their ability to emulate the information transmission and memory functions of biological synapses. Unlike their biological counterparts, which adjust synaptic weights, memristor-based artificial synapses operate by altering conductance or resistance, making them useful for enhancing the processing capacity and storage capabilities of neural networks. When integrated into systems like Hopfield neural networks, memristors enable the study of complex dynamic behaviors, such as chaos and multistability. Moreover, fractional calculus is significant for its ability to model memory effects, enabling more accurate simulations of complex systems. Fractional-order Hopfield networks, in particular, exhibit chaotic and multistable behaviors not found in integer-order models. By combining memristors with fractional-order Hopfield neural networks, these systems offer the possibility of investigating different dynamic phenomena in artificial neural networks. This study investigates the dynamical behavior of a fractional-order Hopfield neural network (HNN) incorporating a memristor with a piecewise segment function in one of its synapses, highlighting the impact of fractional-order derivatives and memristive synapses on the stability, robustness, and dynamic complexity of the system. Using a network of four neurons as a case study, it is demonstrated that the memristive fractional-order HNN exhibits multistability, coexisting chaotic attractors, and coexisting limit cycles. Through spectral entropy analysis, the regions in the initial-condition space that display varying degrees of complexity are mapped, highlighting those areas where the chaotic series approach a pseudo-random sequence of numbers. Finally, the proposed fractional-order memristive HNN is implemented on a Field-Programmable Gate Array (FPGA), demonstrating the feasibility of real-time hardware realization.
APA, Harvard, Vancouver, ISO, and other styles
37

Kobayashi, Masaki. "Three-Dimensional Quaternionic Hopfield Neural Networks." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E100.A, no. 7 (2017): 1575–77. http://dx.doi.org/10.1587/transfun.e100.a.1575.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Wang, Shouhong, and Hai Wang. "Password Authentication Using Hopfield Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews) 38, no. 2 (2008): 265–68. http://dx.doi.org/10.1109/tsmcc.2007.913901.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Kobayashi, Masaki. "Multistate Vector Product Hopfield Neural Networks." Neurocomputing 272 (January 2018): 425–31. http://dx.doi.org/10.1016/j.neucom.2017.07.013.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Guan, Zhi-Hong, and Guanrong Chen. "On delayed impulsive Hopfield neural networks." Neural Networks 12, no. 2 (1999): 273–80. http://dx.doi.org/10.1016/s0893-6080(98)00133-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Li, Qingdu, Xiao-Song Yang, and Fangyan Yang. "Hyperchaos in Hopfield-type neural networks." Neurocomputing 67 (August 2005): 275–80. http://dx.doi.org/10.1016/j.neucom.2005.02.009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Kobayashi, Masaki. "Symmetric Complex-Valued Hopfield Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 28, no. 4 (2017): 1011–15. http://dx.doi.org/10.1109/tnnls.2016.2518672.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Kobayashi, Masaki. "$O(2)$-Valued Hopfield Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 30, no. 12 (2019): 3833–38. http://dx.doi.org/10.1109/tnnls.2019.2897994.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Litinskii, L. B. "High-symmetry Hopfield-type neural networks." Theoretical and Mathematical Physics 118, no. 1 (1999): 107–27. http://dx.doi.org/10.1007/bf02557200.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Abaid, Mansour M., and Musbah A. Sharf. "Switching Optimization By Neural Networks." مجلة الجامعة الأسمرية: العلوم التطبيقية 3, no. 1 (2018): 31–38. http://dx.doi.org/10.59743/aujas.v3i1.1616.

Full text
Abstract:
The priority switching problem is a combinatorial optimization problem that searches for the best solution for frame forwarding in a switch. To solve this problem, a modified Hopfield neural network that is able to solve the priority switching problem is introduced.
APA, Harvard, Vancouver, ISO, and other styles
46

Hao, Dao He, and Liang Wu. "Global Stability Properties for Neutral-Type Hopfield Neural Networks." Applied Mechanics and Materials 232 (November 2012): 682–85. http://dx.doi.org/10.4028/www.scientific.net/amm.232.682.

Full text
Abstract:
The global stability properties are discussed for neutral-type Hopfield neural networks with discrete and distributed time-varying delays. Based on Lyapunov functional stability analysis and the linear matrix inequality approach, a new sufficient condition is derived to ensure the global stability of the equilibrium. The criterion improves and extends results in the literature and is less conservative.
APA, Harvard, Vancouver, ISO, and other styles
47

Liao, Xiaofeng, Jun Wang, and Jinde Cao. "Global and Robust Stability of Interval Hopfield Neural Networks with Time-Varying Delays." International Journal of Neural Systems 13, no. 03 (2003): 171–82. http://dx.doi.org/10.1142/s012906570300142x.

Full text
Abstract:
In this paper, we investigate the problem of global and robust stability of a class of interval Hopfield neural networks that have time-varying delays. Some criteria for the global and robust stability of such networks are derived, by means of constructing suitable Lyapunov functionals for the networks. As a by-product, for the conventional Hopfield neural networks with time-varying delays, we also obtain some new criteria for their global and asymptotic stability.
APA, Harvard, Vancouver, ISO, and other styles
48

Jézéquel, C., O. Nérot, and J. Demongeot. ""Dynamical Confinement" in Neural Networks." Journal of Biological Systems 03, no. 04 (1995): 1157–65. http://dx.doi.org/10.1142/s0218339095001040.

Full text
Abstract:
Randomisation of a well-known mathematical model (the Hopfield model for neural networks) is proposed in order to facilitate the study of its asymptotic behavior: we replace the determination of the stability basins for attractors and for stability boundaries by the study of a unique invariant measure, whose distribution function maxima (respectively, percentile contour lines) correspond to the location of the attractors (respectively, the boundaries of their stability basins). We give the name "confinement" to this localization of the mass of the invariant measure. We intend to show here that the study of the confinement is in certain cases easier than the study of the underlying attractors, in particular if the latter are numerous and possess small stability basins. For example, for the first time we calculate the invariant measure in the random Hopfield model, first in a case for which the deterministic version exhibits many attractors, and then in a case of phase transition.
APA, Harvard, Vancouver, ISO, and other styles
49

P, Monurajan, Ruhanbevi A, and Manjula J. "Design of Memristive Hopfield Neural Network using Memristor Bridges." International Journal of Engineering & Technology 7, no. 3.12 (2018): 652. http://dx.doi.org/10.14419/ijet.v7i3.12.16447.

Full text
Abstract:
Artificial neural networks (ANNs) are interconnections of neurons inspired by the biological neural network of the brain. ANNs are claimed to rule the future, spreading into various areas of interest such as optimization, information technology, cryptography, image processing, and even medical diagnosis. There are devices that possess synaptic behaviour; one such device is the memristor. Bridge circuits of memristors can be combined to form neurons, and neurons can be made into a network with appropriate parameters to store data or images. Hopfield neural networks are chosen to store the data in associative memory. Hopfield neural networks are a significant class of ANN: they are recurrent in nature and are generally used as associative memory and for solving optimization problems such as the Travelling Salesman Problem. The paper deals with the construction of a memristive Hopfield neural network using a memristor bridge circuit and its application as associative memory. It also illustrates the experiment with mathematical equations and demonstrates the network's associative-memory behaviour using Matlab.
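The associative-memory use of a Hopfield network described in this abstract can be sketched as follows: bipolar patterns are stored with the Hebbian outer-product rule, and a stored pattern is recovered from a corrupted cue. The patterns and cue below are illustrative, and the memristor-bridge hardware realisation is not modelled.

```python
import numpy as np

# Two orthogonal bipolar patterns to store
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1, -1, -1,  1,  1, -1, -1],
], dtype=float)

n = patterns.shape[1]
W = patterns.T @ patterns / n          # Hebbian outer-product rule
np.fill_diagonal(W, 0)                 # no self-connections

def recall(cue, steps=20):
    # Synchronous sign updates until the state (typically) settles
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1                  # break ties toward +1
    return s

cue = patterns[0].copy()
cue[0] = -cue[0]                       # corrupt one bit of the first pattern
restored = recall(cue)                 # converges back to patterns[0]
```

In a memristive realisation, the entries of `W` would be set by the conductances of the memristor bridges rather than stored as numbers.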
APA, Harvard, Vancouver, ISO, and other styles
50

Sousa, Miguel Angelo de Abreu de, Edson Lemos Horta, Sergio Takeo Kofuji, and Emilio Del-Moral-Hernandez. "Architecture Analysis of an FPGA-Based Hopfield Neural Network." Advances in Artificial Neural Systems 2014 (December 9, 2014): 1–10. http://dx.doi.org/10.1155/2014/602325.

Full text
Abstract:
Interconnections between electronic circuits and neural computation have been a strongly researched topic in the machine learning field in order to meet several practical requirements, including decreasing training and operation times in high-performance applications and reducing cost, size, and energy consumption for autonomous or embedded developments. Field-programmable gate array (FPGA) hardware shows some inherent features typically associated with neural networks, such as parallel processing, modular execution, and dynamic adaptation, and works on different types of FPGA-based neural networks were presented in recent years. This paper aims to address different aspects of architectural analysis of a Hopfield neural network implemented in FPGA, such as maximum operating frequency and chip-area occupancy as functions of the network capacity. The FPGA implementation methodology, which does not employ multipliers in the architecture developed for the Hopfield neural model, is also presented in detail.
APA, Harvard, Vancouver, ISO, and other styles