To see other types of publications on this topic, follow the link: Hopfield Neural Networks.

Journal articles on the topic "Hopfield Neural Networks"

Create an accurate reference in APA, MLA, Chicago, Harvard, and several other styles


Consult the top 50 journal articles for your research on the topic "Hopfield Neural Networks".

Next to each source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Lai, Xiao Feng, Xing Yin, and Hai Yang Zou. "Stability Criterion of Discrete Hopfield Neural Networks with Multiple Delays in Parallel Mode: Linear Matrix Inequality." Advanced Materials Research 225-226 (April 2011): 1270–73. http://dx.doi.org/10.4028/www.scientific.net/amr.225-226.1270.

Abstract:
In this paper, discrete Hopfield neural networks with multiple delays are introduced, and a stability criterion for discrete Hopfield neural networks with multiple delays is studied by the linear matrix inequality approach. It provides a theoretical basis for the application of discrete Hopfield neural networks with multiple delays.
2

Feild, William B., and Jainendra K. Navlakha. "On Hopfield neural networks." Neural Networks 1 (January 1988): 21. http://dx.doi.org/10.1016/0893-6080(88)90063-9.

3

Kobayashi, M. "Hyperbolic Hopfield Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 24, no. 2 (2013): 335–41. http://dx.doi.org/10.1109/tnnls.2012.2230450.

4

Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks." Journal of Physics A: Mathematical and General 36, no. 9 (2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

5

Wilson, Robert C. "Parallel Hopfield Networks." Neural Computation 21, no. 3 (2009): 831–50. http://dx.doi.org/10.1162/neco.2008.03-07-496.

Abstract:
We introduce a novel type of neural network, termed the parallel Hopfield network, that can simultaneously effect the dynamics of many different, independent Hopfield networks in parallel in the same piece of neural hardware. Numerically we find that under certain conditions, each Hopfield subnetwork has a finite memory capacity approaching that of the equivalent isolated attractor network, while a simple signal-to-noise analysis sheds qualitative, and some quantitative, insight into the workings (and failures) of the system.
6

Kobayashi, Masaki. "Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks." Neural Computation 32, no. 11 (2020): 2237–48. http://dx.doi.org/10.1162/neco_a_01320.

Abstract:
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. The weight parameters need a lot of memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks. Since their architectures are much more complicated than that of CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Comp
7

Kobayashi, Masaki. "Storage Capacities of Twin-Multistate Quaternion Hopfield Neural Networks." Computational Intelligence and Neuroscience 2018 (November 1, 2018): 1–5. http://dx.doi.org/10.1155/2018/1275290.

Abstract:
A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model and can store multilevel information, such as image data. Storage capacity is an important problem of Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by the 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.
8

Ismailov, Mirxalil, Davron Ziyadullaev, Dilnoz Muhamediyeva, Rano Gazieva, Aksulu Dzholdasbaeva, and Sharofiddin Aynaqulov. "Intelligent algorithms of construction of public transport routes." E3S Web of Conferences 365 (2023): 01002. http://dx.doi.org/10.1051/e3sconf/202336501002.

Abstract:
Today, in public transport planning systems, it is important to search for a feasible route with minimum travel time. The aim of the work is the development of intelligent algorithms for constructing public transport routes, the development of programs, and the conduct of a computational experiment. The research methods are based on the theory of neural networks. The paper considers Hopfield neural networks and proposes recurrent neural networks. However, in Hopfield neural networks, the chances of solving this optimization problem decrease as the matrix size increases. A recurrent neural network is proposed,
9

Chen, Jing, Xiaodi Li, and Dequan Wang. "Asymptotic Stability and Exponential Stability of Impulsive Delayed Hopfield Neural Networks." Abstract and Applied Analysis 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/638496.

Abstract:
A criterion for the uniform asymptotic stability of the equilibrium point of impulsive delayed Hopfield neural networks is presented by using Lyapunov functions and linear matrix inequality approach. The criterion is a less restrictive version of a recent result. By means of constructing the extended impulsive Halanay inequality, we also analyze the exponential stability of impulsive delayed Hopfield neural networks. Some new sufficient conditions ensuring exponential stability of the equilibrium point of impulsive delayed Hopfield neural networks are obtained. An example showing the effective
10

KOBAYASHI, Masaki. "Global Hyperbolic Hopfield Neural Networks." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E99.A, no. 12 (2016): 2511–16. http://dx.doi.org/10.1587/transfun.e99.a.2511.

11

Leblebicioğlu, Kemal, Ugur Halici, and Okay Çelebi. "Infinite dimensional Hopfield neural networks." Nonlinear Analysis: Theory, Methods & Applications 47, no. 9 (2001): 5807–13. http://dx.doi.org/10.1016/s0362-546x(01)00710-6.

12

Kobayashi, Masaki. "Symmetric quaternionic Hopfield neural networks." Neurocomputing 240 (May 2017): 110–14. http://dx.doi.org/10.1016/j.neucom.2017.02.044.

13

Kobayashi, Masaki. "Diagonal rotor Hopfield neural networks." Neurocomputing 415 (November 2020): 40–47. http://dx.doi.org/10.1016/j.neucom.2020.07.041.

14

Kobayashi, Masaki. "Dual-numbered Hopfield neural networks." IEEJ Transactions on Electrical and Electronic Engineering 13, no. 2 (2017): 280–84. http://dx.doi.org/10.1002/tee.22524.

15

Kobayashi, Masaki. "Noise Robust Projection Rule for Klein Hopfield Neural Networks." Neural Computation 33, no. 6 (2021): 1698–716. http://dx.doi.org/10.1162/neco_a_01385.

Abstract:
Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. The CHNNs and QHNNs have weak noise tolerance by the inherent property of rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving rotational invariance. However, the KHNNs have another disadvantage of self-feedback, a major factor of deterioration in noise tolerance. In this work, the stability conditions of
16

Pei, Yangjun, Chao Liu, and Qi Han. "Stability of Delayed Hopfield Neural Networks with Variable-Time Impulses." Mathematical Problems in Engineering 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/154036.

Abstract:
In this paper the globally exponential stability criteria of delayed Hopfield neural networks with variable-time impulses are established. The proposed criteria can also be applied in Hopfield neural networks with fixed-time impulses. A numerical example is presented to illustrate the effectiveness of our theoretical results.
17

Jia, Zhifu, and Cunlin Li. "Almost Sure Exponential Stability of Uncertain Stochastic Hopfield Neural Networks Based on Subadditive Measures." Mathematics 11, no. 14 (2023): 3110. http://dx.doi.org/10.3390/math11143110.

Abstract:
For this paper, we consider the almost sure exponential stability of uncertain stochastic Hopfield neural networks based on subadditive measures. Firstly, we deduce two corollaries, using the Itô–Liu formula. Then, we introduce the concept of almost sure exponential stability for uncertain stochastic Hopfield neural networks. Next, we investigate the almost sure exponential stability of uncertain stochastic Hopfield neural networks, using the Lyapunov method, Liu inequality, the Liu lemma, and exponential martingale inequality. In addition, we prove two sufficient conditions for almost sure ex
18

Kobayashi, Masaki. "Hyperbolic-Valued Hopfield Neural Networks in Synchronous Mode." Neural Computation 32, no. 9 (2020): 1685–96. http://dx.doi.org/10.1162/neco_a_01303.

Abstract:
For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If they were to converge in synchronous mode, recall would be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we provide stability conditions for hyperbolic Hopfield neural networks (HHNNs) in synchronous mode instead of CHNNs. HHNNs provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection
19

Cojocaru, Andreea V., and Stefan Balint. "Neuro-Psychological Interpretations of Mathematical Results Reported in Case of Discrete-Time Hopfield Neural Networks." Journal of Genetic Engineering and Biotechnology Research 7, no. 1 (2025): 01–09. https://doi.org/10.33140/jgebr.07.01.05.

Abstract:
In this paper, a discrete-time Hopfield neural network is used for the mathematical description of electrical phenomena (voltage states) appearing in the nervous system. The equilibrium states of a discrete-time Hopfield neural network are interpreted as equilibria of the nervous system. An equilibrium state for which the steady state is locally exponentially stable is interpreted as a robust equilibrium of the nervous system, because after a small perturbation of the equilibrium steady state the network recovers the equilibrium. A path of equilibrium states for which the steady states are locall
20

Liao, Xiaofeng, Jun Wang, and Jinde Cao. "Global and Robust Stability of Interval Hopfield Neural Networks with Time-Varying Delays." International Journal of Neural Systems 13, no. 03 (2003): 171–82. http://dx.doi.org/10.1142/s012906570300142x.

Abstract:
In this paper, we investigate the problem of global and robust stability of a class of interval Hopfield neural networks that have time-varying delays. Some criteria for the global and robust stability of such networks are derived, by means of constructing suitable Lyapunov functionals for the networks. As a by-product, for the conventional Hopfield neural networks with time-varying delays, we also obtain some new criteria for their global and asymptotic stability.
21

Wang, Li‐Xin, and Jerry M. Mendel. "Adaptive minimum prediction‐error deconvolution and source wavelet estimation using Hopfield neural networks." GEOPHYSICS 57, no. 5 (1992): 670–79. http://dx.doi.org/10.1190/1.1443281.

Abstract:
The massively parallel processing advantage of artificial neural networks makes them suitable for hardware implementations; therefore, using artificial neural networks for seismic signal processing problems has the potential of greatly speeding up seismic signal processing. A commonly used artificial neural network—Hopfield neural network—is used to implement a new adaptive minimum prediction‐error deconvolution (AMPED) procedure which decomposes deconvolution and wavelet estimation into three subprocesses: reflectivity location detection, reflectivity magnitude estimation, and source wavelet
22

Savin, Daniela, Sabah Alkass, and Paul Fazio. "Construction resource leveling using neural networks." Canadian Journal of Civil Engineering 23, no. 4 (1996): 917–25. http://dx.doi.org/10.1139/l96-898.

Abstract:
A neural network model for construction resource leveling is developed and discussed. The model is derived by mapping an augmented Lagrangian multiplier optimization formulation of a resource leveling problem onto a discrete-time Hopfield net. The resulting neural network model consists of two main blocks. Specifically, it consists of a discrete-time Hopfield neural network block, and a control block for the adjustment of Lagrange multipliers in the augmented Lagrangian multiplier optimization, and for the computation of the new set of weights of the neural network block. An experimental verif
23

YAN, JUN-JUH, TEH-LU LIAO, JUI-SHENG LIN, and CHAO-JUNG CHENG. "SYNCHRONIZATION CONTROL OF NEURAL NETWORKS SUBJECT TO TIME-VARYING DELAYS AND INPUT NONLINEARITY." International Journal of Bifurcation and Chaos 16, no. 12 (2006): 3643–54. http://dx.doi.org/10.1142/s0218127406017038.

Abstract:
This paper investigates the synchronization problem for a particular class of neural networks subject to time-varying delays and input nonlinearity. Using the variable structure control technique, a memoryless decentralized control law is established which guarantees exponential synchronization even when input nonlinearity is present. The proposed controller is suitable for application in delayed cellular neural networks and Hopfield neural networks with no restriction on the derivative of the time-varying delays. A two-dimensional cellular neural network and a four-dimensional Hopfield neural
24

Pu, Yi-Fei, Zhang Yi, and Ji-Liu Zhou. "Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 28, no. 10 (2017): 2319–33. http://dx.doi.org/10.1109/tnnls.2016.2582512.

25

Fan, Tongke. "Research on Optimal CDMA Multiuser Detection Based on Stochastic Hopfield Neural Network." Recent Patents on Computer Science 12, no. 3 (2019): 233–40. http://dx.doi.org/10.2174/2213275912666181210103742.

Abstract:
Background: Most common multi-user detection techniques have the shortcomings of heavy computation and slow operation. Hopfield neural networks offer high-speed searching and parallel processing, but suffer from local convergence problems. Objective: The stochastic Hopfield neural network avoids local convergence by introducing noise into the state variables and thereby achieves optimal detection. Methods: Based on the study of the CDMA communication model, this paper presents and models the multi-user detection problem. Then a new stochastic Hopfiel
26

KOBAYASHI, Masaki. "Three-Dimensional Quaternionic Hopfield Neural Networks." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E100.A, no. 7 (2017): 1575–77. http://dx.doi.org/10.1587/transfun.e100.a.1575.

27

Wang, Shouhong, and Hai Wang. "Password Authentication Using Hopfield Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews) 38, no. 2 (2008): 265–68. http://dx.doi.org/10.1109/tsmcc.2007.913901.

28

Kobayashi, Masaki. "Multistate vector product hopfield neural networks." Neurocomputing 272 (January 2018): 425–31. http://dx.doi.org/10.1016/j.neucom.2017.07.013.

29

Guan, Zhi-Hong, and Guanrong Chen. "On delayed impulsive Hopfield neural networks." Neural Networks 12, no. 2 (1999): 273–80. http://dx.doi.org/10.1016/s0893-6080(98)00133-6.

30

Li, Qingdu, Xiao-Song Yang, and Fangyan Yang. "Hyperchaos in Hopfield-type neural networks." Neurocomputing 67 (August 2005): 275–80. http://dx.doi.org/10.1016/j.neucom.2005.02.009.

31

Kobayashi, Masaki. "Symmetric Complex-Valued Hopfield Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 28, no. 4 (2017): 1011–15. http://dx.doi.org/10.1109/tnnls.2016.2518672.

32

Kobayashi, Masaki. "O(2)-Valued Hopfield Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 30, no. 12 (2019): 3833–38. http://dx.doi.org/10.1109/tnnls.2019.2897994.

33

Litinskii, L. B. "High-symmetry Hopfield-type neural networks." Theoretical and Mathematical Physics 118, no. 1 (1999): 107–27. http://dx.doi.org/10.1007/bf02557200.

34

Abaid, Mansour M., and Musbah A. Sharf. "Switching Optimization By Neural Networks." مجلة الجامعة الأسمرية: العلوم التطبيقية 3, no. 1 (2018): 31–38. http://dx.doi.org/10.59743/aujas.v3i1.1616.

Abstract:
The priority switching problem is a combinatorial optimization problem that searches for the best solution for frame forwarding in a switch. To solve this problem, a modified Hopfield neural network able to solve the priority switching problem is introduced.
35

Hao, Dao He, and Liang Wu. "Global Stability Properties for Neutral-Type Hopfield Neural Networks." Applied Mechanics and Materials 232 (November 2012): 682–85. http://dx.doi.org/10.4028/www.scientific.net/amm.232.682.

Abstract:
The global stability properties were discussed for neutral-type Hopfield neural networks with discrete and distributed time-varying delays. Based on Lyapunov functional stability analysis and the linear matrix inequality approach, a new sufficient condition was derived to ensure the global stability of the equilibrium. The criterion improves and extends results in the literature and is less conservative.
36

SUKHASWAMI, M. B., P. SEETHARAMULU, and ARUN K. PUJARI. "RECOGNITION OF TELUGU CHARACTERS USING NEURAL NETWORKS." International Journal of Neural Systems 06, no. 03 (1995): 317–57. http://dx.doi.org/10.1142/s0129065795000238.

Abstract:
The aim of the present work is to recognize printed and handwritten Telugu characters using artificial neural networks (ANNs). Earlier work on recognition of Telugu characters has been done using conventional pattern recognition techniques. We make an initial attempt here of using neural networks for recognition with the aim of improving upon earlier methods which do not perform effectively in the presence of noise and distortion in the characters. The Hopfield model of neural network working as an associative memory is chosen for recognition purposes initially. Due to limitation in the capaci
37

Boykov, Ilya, Vladimir Roudnev, and Alla Boykova. "Approximate Methods for Solving Problems of Mathematical Physics on Neural Hopfield Networks." Mathematics 10, no. 13 (2022): 2207. http://dx.doi.org/10.3390/math10132207.

Abstract:
A Hopfield neural network is described by a system of nonlinear ordinary differential equations. We develop a broad range of numerical schemes that are applicable for a wide range of computational problems. We review here our study on an approximate solution of the Fredholm integral equation, and linear and nonlinear singular and hypersingular integral equations, using a continuous method for solving operator equations. This method assumes that the original system is associated with a Cauchy problem for systems of ordinary differential equations on Hopfield neural networks. We present sufficie
38

Singh, Kirti, Suju M. George, and P. Rambabu. "Use of a System of Recurrent Neural Networks for Solving the Cell Placement Problem in VLSI Design." International Journal on Artificial Intelligence Tools 06, no. 01 (1997): 15–35. http://dx.doi.org/10.1142/s0218213097000037.

Abstract:
Cell placement in VLSI design is an NP-complete problem. In this paper, we have tried to solve the standard cell placement problem using the Hopfield neural network model. Furthermore, a new system of coupled recurrent neural networks, which was designed to eliminate the drawbacks of the Hopfield neural network, is introduced. The performance of Hopfield networks with discrete and graded neurons is also investigated. The energy function corresponding to the chosen representation is given and the weight matrix and the inputs needed for the network are also computed in this work. Several differe
39

JÉZÉQUEL, C., O. NÉROT, and J. DEMONGEOT. ""DYNAMICAL CONFINEMENT" IN NEURAL NETWORKS." Journal of Biological Systems 03, no. 04 (1995): 1157–65. http://dx.doi.org/10.1142/s0218339095001040.

Abstract:
Randomisation of a well known mathematical model is proposed (i.e. the Hopfield model for neural networks) in order to facilitate the study of its asymptotic behavior: in fact, we replace the determination of the stability basins for attractors and for stability boundaries by the study of a unique invariant measure, whose distribution function maxima (or respectively, percentile contour lines) correspond to the location of the attractors (or respectively, boundaries of their stability basins). We give the name of "confinement" to this localization of the mass of the invariant measure. We inten
40

Gao, Jin, and Lihua Dai. "Anti-periodic synchronization of quaternion-valued high-order Hopfield neural networks with delays." AIMS Mathematics 7, no. 8 (2022): 14051–75. http://dx.doi.org/10.3934/math.2022775.

Abstract:
This paper proposes a class of quaternion-valued high-order Hopfield neural networks with delays. By using the non-decomposition method, non-reduced order method, analytical techniques in uniform convergence functions sequence, and constructing Lyapunov function, we obtain several sufficient conditions for the existence and global exponential synchronization of anti-periodic solutions for delayed quaternion-valued high-order Hopfield neural networks. Finally, an example and its numerical simulations are given to support the proposed approach. Our results play an import
41

ANDRECUT, MIRCEA. "APPLICATIONS OF THE HOPFIELD–LITTLE NEURAL NETWORKS." Modern Physics Letters B 13, no. 12n13 (1999): 431–40. http://dx.doi.org/10.1142/s0217984999000543.

Abstract:
We show that the Hopfield–Little neural networks can be successfully used for solving arbitrary set of linear equations by computing the Moore–Penrose generalized inverse of matrices. Also, an application for character recognition is described.
42

ISOKAWA, TEIJIRO, HARUHIKO NISHIMURA, NAOTAKE KAMIURA, and NOBUYUKI MATSUI. "ASSOCIATIVE MEMORY IN QUATERNIONIC HOPFIELD NEURAL NETWORK." International Journal of Neural Systems 18, no. 02 (2008): 135–45. http://dx.doi.org/10.1142/s0129065708001440.

Abstract:
Associative memory networks based on quaternionic Hopfield neural network are investigated in this paper. These networks are composed of quaternionic neurons, and input, output, threshold, and connection weights are represented in quaternions, which is a class of hypercomplex number systems. The energy function of the network and the Hebbian rule for embedding patterns are introduced. The stable states and their basins are explored for the networks with three neurons and four neurons. It is clarified that there exist at most 16 stable states, called multiplet components, as the degenerated sto
43

Guo, Weiru, and Fang Liu. "A new criterion of asymptotic stability for Hopfield neural networks with time-varying delay." iPolytech Journal 25, no. 6 (2022): 753–61. http://dx.doi.org/10.21285/1814-3520-2021-6-753-761.

Abstract:
The objective of this paper is to analyze the stability of Hopfield neural networks with time-varying delay. For the system to operate in a steady state, it is important to guarantee the stability of Hopfield neural networks with time-varying delay. The Lyapunov-Krasovsky functional method is the main method for investigating the stability of time-delayed systems. On the basis of this method, the stability of Hopfield neural networks with time-varying delay is analysed. It is known that due to such factors as communication time, limited switching speed of various active devices, time delays o
44

Qiu, Rong, Yujiao Dong, Xin Jiang, and Guangyi Wang. "Two-Neuron Based Memristive Hopfield Neural Network with Synaptic Crosstalk." Electronics 11, no. 19 (2022): 3034. http://dx.doi.org/10.3390/electronics11193034.

Abstract:
Synaptic crosstalk is an important biological phenomenon that widely exists in neural networks. The crosstalk can influence the ability of neurons to control the synaptic weights, thereby causing rich dynamics of neural networks. Based on the crosstalk between synapses, this paper presents a novel two-neuron based memristive Hopfield neural network with a hyperbolic memristor emulating synaptic crosstalk. The dynamics of the neural networks with varying memristive parameters and crosstalk weights are analyzed via the phase portraits, time-domain waveforms, bifurcation diagrams, and basin of at
45

RATCHAGIT, KREANGKRI. "ASYMPTOTIC STABILITY OF DELAY-DIFFERENCE SYSTEM OF HOPFIELD NEURAL NETWORKS VIA MATRIX INEQUALITIES AND APPLICATION." International Journal of Neural Systems 17, no. 05 (2007): 425–30. http://dx.doi.org/10.1142/s0129065707001263.

Abstract:
In this paper, we derive a sufficient condition for asymptotic stability of the zero solution of delay-difference system of Hopfield neural networks in terms of certain matrix inequalities by using a discrete version of the Lyapunov second method. The result is applied to obtain new asymptotic stability condition for some class of delay-difference system such as delay-difference system of Hopfield neural networks with multiple delays in terms of certain matrix inequalities. Our results can be well suited for computational purposes.
46

Mahecha-Gómez, Jorge E. "ARTIFICIAL INTELLIGENCE WITH NEURAL NETWORKS NOBEL PRIZES IN PHYSICS AND CHEMISTRY 2024." MOMENTO, no. 70 (January 30, 2025): I–XXI. https://doi.org/10.15446/mo.n70.118564.

Abstract:
John Joseph Hopfield began his career studying excitons in condensed matter physics, but his most important contributions were in the physics of computation and information, including his 1982 work on neural networks. Geoffrey Hinton, known as the “godfather” of artificial intelligence, laid the foundations for deep neural networks and developed the “backpropagation” method in 1986. These advances, along with Hopfield networks and the “Boltzmann machine”, constitute the beginning of artificial intelligence. David Baker is a pioneer in the design and prediction of three-dimensional protein stru
47

Bagheri, F., N. Ghafarnia, and F. Bahrami. "Electrocardiogram (ECG) Signal Modeling and Noise Reduction Using Hopfield Neural Networks." Engineering, Technology & Applied Science Research 3, no. 1 (2013): 345–48. http://dx.doi.org/10.48084/etasr.243.

Abstract:
The Electrocardiogram (ECG) signal is one of the diagnosing approaches to detect heart disease. In this study the Hopfield Neural Network (HNN) is applied and proposed for ECG signal modeling and noise reduction. The Hopfield Neural Network (HNN) is a recurrent neural network that stores the information in a dynamic stable pattern. This algorithm retrieves a pattern stored in memory in response to the presentation of an incomplete or noisy version of that pattern. Computer simulation results show that this method can successfully model the ECG signal and remove high-frequency noise.
48

Huang, Xia, Zhen Wang, and Yuxia Li. "Nonlinear Dynamics and Chaos in Fractional-Order Hopfield Neural Networks with Delay." Advances in Mathematical Physics 2013 (2013): 1–9. http://dx.doi.org/10.1155/2013/657245.

Abstract:
A fractional-order two-neuron Hopfield neural network with delay is proposed based on the classic well-known Hopfield neural networks, and further, the complex dynamical behaviors of such a network are investigated. A great variety of interesting dynamical phenomena, including single-periodic, multiple-periodic, and chaotic motions, are found to exist. The existence of chaotic attractors is verified by the bifurcation diagram and phase portraits as well.
49

Kobayashi, Masaki. "Bicomplex-valued twin-hyperbolic Hopfield neural networks." Neurocomputing 434 (April 2021): 203–10. http://dx.doi.org/10.1016/j.neucom.2020.12.109.

50

Wan, Li, and Qinghua Zhou. "Ultimate Boundedness of Stochastic Hopfield Neural Networks." Journal of Convergence Information Technology 6, no. 8 (2011): 244–48. http://dx.doi.org/10.4156/jcit.vol6.issue8.28.
