
Journal articles on the topic 'Complex-valued neural network'


Consult the top 50 journal articles for your research on the topic 'Complex-valued neural network.'


1

Liao, Bolin, Lin Xiao, Jie Jin, Lei Ding, and Mei Liu. "Novel Complex-Valued Neural Network for Dynamic Complex-Valued Matrix Inversion." Journal of Advanced Computational Intelligence and Intelligent Informatics 20, no. 1 (January 19, 2016): 132–38. http://dx.doi.org/10.20965/jaciii.2016.p0132.

Abstract:
Static matrix inversion has been studied for many years. In this paper, we aim at solving the dynamic complex-valued matrix inverse. Specifically, by combining a conventional gradient neural network with the recently proposed Zhang neural network, a novel complex-valued neural network model is presented and investigated for computing the dynamic complex-valued matrix inverse in real time. A hardware implementation structure is also offered. Moreover, both theoretical analysis and simulation results substantiate the effectiveness and advantages of the proposed recurrent neural network model for dynamic complex-valued matrix inversion.
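The Zhang-neural-network (ZNN) design used in models of this kind can be sketched in a few lines: define a matrix-valued error E(t) = A(t)X(t) − I, impose the exponential decay Ė = −γE, and integrate the resulting dynamics so that X(t) tracks A(t)⁻¹. Below is a minimal NumPy sketch of that idea, not the authors' exact model; the test matrix A(t), the gain γ, and the Euler integration scheme are illustrative assumptions.

```python
import numpy as np

def znn_inverse(A, dA, t0=0.0, t1=5.0, dt=1e-3, gamma=50.0):
    """Euler-simulate the ZNN  X' = -X A'(t) X - gamma * X (A(t) X - I),
    an implementable form that tracks the time-varying inverse A(t)^{-1}."""
    I = np.eye(A(t0).shape[0], dtype=complex)
    X = np.linalg.inv(A(t0))            # start at the true inverse at t0
    t = t0
    while t < t1:
        E = A(t) @ X - I                # matrix-valued tracking error
        X = X + dt * (-X @ dA(t) @ X - gamma * X @ E)
        t += dt
    return X

# illustrative time-varying complex matrix and its derivative
A  = lambda t: np.array([[3 + np.sin(t), 0.5j], [-0.5j, 3 + np.cos(t)]])
dA = lambda t: np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])

X = znn_inverse(A, dA)
residual = np.linalg.norm(A(5.0) @ X - np.eye(2))   # stays small while tracking
```

Because the state starts at the true inverse, the residual ‖A(t)X(t) − I‖ remains small over the whole horizon; from any other start, the −γX(AX − I) feedback term drives the error toward zero exponentially.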
2

Nitta, Tohru. "Learning Transformations with Complex-Valued Neurocomputing." International Journal of Organizational and Collective Intelligence 3, no. 2 (April 2012): 81–116. http://dx.doi.org/10.4018/joci.2012040103.

Abstract:
The ability of the 1-n-1 complex-valued neural network to learn 2D affine transformations has been applied to the estimation of optical flows and the generation of fractal images. The complex-valued neural network has adaptability and generalization ability as its inherent nature. This is the point that most distinguishes the 1-n-1 complex-valued neural network's ability to learn 2D affine transformations from standard techniques for 2D affine transformations such as the Fourier descriptor. It is important to clarify the properties of complex-valued neural networks in order to accelerate their practical application. In this paper, first, the generalization ability of the 1-n-1 complex-valued neural network which has learned complicated rotations on a 2D plane is examined experimentally and analytically. Next, the behavior of the 1-n-1 complex-valued neural network that has learned a transformation on the Steiner circles is demonstrated, and the relationship between the values of the complex-valued weights after training and a linear transformation related to the Steiner circles is clarified via computer simulations. Furthermore, the relationship between the weight values of the 1-n-1 complex-valued neural network that has learned 2D affine transformations and the learning patterns used is elucidated. These research results make it possible to solve complicated problems more simply and efficiently with 1-n-1 complex-valued neural networks. As a matter of fact, an application of the 1-n-1 type complex-valued neural network to an associative memory is presented.
3

NITTA, TOHRU. "THE UNIQUENESS THEOREM FOR COMPLEX-VALUED NEURAL NETWORKS WITH THRESHOLD PARAMETERS AND THE REDUNDANCY OF THE PARAMETERS." International Journal of Neural Systems 18, no. 02 (April 2008): 123–34. http://dx.doi.org/10.1142/s0129065708001439.

Abstract:
This paper proves the uniqueness theorem for 3-layered complex-valued neural networks in which the threshold parameters of the hidden neurons can take nonzero values. That is, if a 3-layered complex-valued neural network is irreducible, the 3-layered complex-valued neural network that approximates a given complex-valued function is uniquely determined up to a finite group of transformations of the learnable parameters of the complex-valued neural network.
4

Dong, Tao, and Tingwen Huang. "Neural Cryptography Based on Complex-Valued Neural Network." IEEE Transactions on Neural Networks and Learning Systems 31, no. 11 (November 2020): 4999–5004. http://dx.doi.org/10.1109/tnnls.2019.2955165.

5

Nitta, Tohru. "Orthogonality of Decision Boundaries in Complex-Valued Neural Networks." Neural Computation 16, no. 1 (January 1, 2004): 73–97. http://dx.doi.org/10.1162/08997660460734001.

Abstract:
This letter presents some results of an analysis of the decision boundaries of complex-valued neural networks whose weights, threshold values, and input and output signals are all complex numbers. The main results may be summarized as follows. (1) A decision boundary of a single complex-valued neuron consists of two hypersurfaces that intersect orthogonally, and divides a decision region into four equal sections. The XOR problem and the detection-of-symmetry problem, which cannot be solved with two-layered real-valued neural networks, can be solved by two-layered complex-valued neural networks with the orthogonal decision boundaries, which reveals the potent computational power of complex-valued neural nets. Furthermore, the fading equalization problem can be successfully solved by the two-layered complex-valued neural network with the highest generalization ability. (2) A decision boundary of a three-layered complex-valued neural network has the orthogonal property as a basic structure, and its two hypersurfaces approach orthogonality as all the net inputs to each hidden neuron grow. In particular, most of the decision boundaries in the three-layered complex-valued neural network intersect orthogonally when the network is trained using the Complex-BP algorithm. As a result, the orthogonality of the decision boundaries improves its generalization ability. (3) The average learning speed of the Complex-BP is several times faster than that of the Real-BP, and the standard deviation of the learning speed of the Complex-BP is smaller than that of the Real-BP. For these reasons, the complex-valued neural network and the related algorithm seem natural for learning complex-valued patterns.
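Result (1) is easy to check numerically: for a single neuron s = wz + θ, the two boundaries Re(s) = 0 and Im(s) = 0 are orthogonal lines, and classifying an input by the quadrant of s solves XOR with one complex-valued neuron. A small sketch under an assumed ±1 input encoding and an arbitrarily chosen weight (both illustrative, not taken from the letter):

```python
import numpy as np

w, theta = np.exp(0.3j), 0.0        # any nonzero complex weight; illustrative choice

def label(x1, x2):
    s = w * (x1 + 1j * x2) + theta
    return int(s.real * s.imag < 0)  # 1 iff s falls in quadrant II or IV

# XOR with +/-1 input encoding: output is 1 exactly when the inputs differ
xor_table = {(-1, -1): 0, (-1, 1): 1, (1, -1): 1, (1, 1): 0}
solves_xor = all(label(a, b) == y for (a, b), y in xor_table.items())

# the two boundary lines Re(w z)=0 and Im(w z)=0 have orthogonal normals
n_re = np.array([w.real, -w.imag])
n_im = np.array([w.imag,  w.real])
orthogonal = abs(float(n_re @ n_im)) < 1e-12
```

The orthogonality holds for every nonzero weight, since the normals (Re w, −Im w) and (Im w, Re w) always have zero dot product.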
6

Li, Sufang, and Mingyan Jiang. "The New Complex-Valued Wavelet Neural Network." TELKOMNIKA (Telecommunication Computing Electronics and Control) 12, no. 3 (September 1, 2014): 613. http://dx.doi.org/10.12928/telkomnika.v12i3.95.

8

Kobayashi, Masaki. "Fast Recall for Complex-Valued Hopfield Neural Networks with Projection Rules." Computational Intelligence and Neuroscience 2017 (2017): 1–6. http://dx.doi.org/10.1155/2017/4894278.

Abstract:
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, and CHNNs are available for the storage of multilevel data, such as gray-scale images. CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting the local minima. In the present work, we propose a new recall algorithm that eliminates the local minima. Through computer simulations, we show that our proposed recall algorithm not only accelerates recall but also improves noise tolerance.
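The multistate machinery behind CHNNs can be sketched compactly: neuron states are K-th roots of unity, the activation snaps a weighted sum to the nearest root, and the classical projection rule W = X(XᴴX)⁻¹Xᴴ makes every stored pattern an exact fixed point. The following is a minimal sketch of that baseline (sizes, seed, and noise model are illustrative assumptions; it is not Lee's method or the recall algorithm proposed in the paper):

```python
import numpy as np

K, N = 8, 16                                    # 8-state neurons, 16 of them
roots = np.exp(2j * np.pi * np.arange(K) / K)
rng = np.random.default_rng(0)

def csign(z):
    """Multistate activation: snap each component to the nearest K-th root."""
    k = np.rint(np.angle(z) / (2 * np.pi / K)).astype(int) % K
    return roots[k]

# projection rule: W = X (X^H X)^{-1} X^H makes stored patterns fixed points
X = roots[rng.integers(0, K, size=(N, 3))]      # three random multistate patterns
W = X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T

fixed_point = np.allclose(csign(W @ X[:, 0]), X[:, 0])

# recall attempt from a corrupted probe (phase-rotate a few neurons);
# convergence back to the stored pattern is typical but not guaranteed
x = X[:, 0].copy()
x[rng.integers(0, N, size=3)] *= np.exp(2j * np.pi / K)
for _ in range(10):
    x = csign(W @ x)
```

The fixed-point property is exact because WX = X by construction and csign leaves roots of unity unchanged.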
9

Zhang, Yunong, Zhan Li, and Kene Li. "Complex-valued Zhang neural network for online complex-valued time-varying matrix inversion." Applied Mathematics and Computation 217, no. 24 (August 2011): 10066–73. http://dx.doi.org/10.1016/j.amc.2011.04.085.

10

Kobayashi, Masaki. "Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks." Neural Computation 32, no. 11 (November 2020): 2237–48. http://dx.doi.org/10.1162/neco_a_01320.

Abstract:
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters need a lot of memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks. Since their architectures are much more complicated than that of a CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks. By computer simulations, we also find that the projection rule for hyperbolic-valued Hopfield neural networks in synchronous mode maintains a high noise tolerance.
11

Amin, Md Faijul, and Kazuyuki Murase. "Single-layered complex-valued neural network for real-valued classification problems." Neurocomputing 72, no. 4-6 (January 2009): 945–55. http://dx.doi.org/10.1016/j.neucom.2008.04.006.

12

Özdemir, Necati, Beyza B. İskender, and Nihal Yılmaz Özgür. "Complex valued neural network with Möbius activation function." Communications in Nonlinear Science and Numerical Simulation 16, no. 12 (December 2011): 4698–703. http://dx.doi.org/10.1016/j.cnsns.2011.03.005.

13

Wang, Xue-Zhong, Yimin Wei, and Predrag S. Stanimirović. "Complex Neural Network Models for Time-Varying Drazin Inverse." Neural Computation 28, no. 12 (December 2016): 2790–824. http://dx.doi.org/10.1162/neco_a_00866.

Abstract:
Two complex Zhang neural network (ZNN) models for computing the Drazin inverse of an arbitrary time-varying complex square matrix are presented. The design of these neural networks is based on corresponding matrix-valued error functions arising from the limit representations of the Drazin inverse. Two types of activation functions, appropriate for handling complex matrices, are exploited to develop each of these networks. Theoretical convergence analysis is presented to show the desirable properties of the proposed complex-valued ZNN models. Numerical results further demonstrate the effectiveness of the proposed models.
14

Xiao, Jin, Yanlin Jia, Xiaoyi Jiang, and Shouyang Wang. "Circular Complex-Valued GMDH-Type Neural Network for Real-Valued Classification Problems." IEEE Transactions on Neural Networks and Learning Systems 31, no. 12 (December 2020): 5285–99. http://dx.doi.org/10.1109/tnnls.2020.2966031.

15

Fiori, Simone. "Nonlinear Complex-Valued Extensions of Hebbian Learning: An Essay." Neural Computation 17, no. 4 (April 1, 2005): 779–838. http://dx.doi.org/10.1162/0899766053429381.

Abstract:
The Hebbian paradigm is perhaps the best-known unsupervised learning theory in connectionism. It has inspired wide research activity in the artificial neural network field because it embodies some interesting properties such as locality and the capability of being applicable to the basic weight-and-sum structure of neuron models. The plain Hebbian principle, however, also presents some inherent theoretical limitations that make it impractical in most cases. Therefore, modifications of the basic Hebbian learning paradigm have been proposed over the past 20 years in order to design profitable signal and data processing algorithms. Such modifications led to the principal component analysis type class of learning rules along with their nonlinear extensions. The aim of this review is primarily to present part of the existing fragmented material in the field of principal component learning within a unified view and contextually to motivate and present extensions of previous works on Hebbian learning to complex-weighted linear neural networks. This work benefits from previous studies on linear signal decomposition by artificial neural networks, nonquadratic component optimization and reconstruction error definition, neural parameters adaptation by constrained optimization of learning criteria of complex-valued arguments, and orthonormality expression via the insertion of topological elements in the networks or by modifying the network learning criterion. In particular, the learning principles considered here and their analysis concern complex-valued principal/minor component/subspace linear/nonlinear rules for complex-weighted neural structures, both feedforward and laterally connected.
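A concrete instance of the nonlinear complex-valued extensions surveyed here is the complex Oja rule for a single linear neuron, w ← w + η(y*x − |y|²w) with y = wᴴx, whose stable fixed points align with the principal eigenvector of the covariance E[xxᴴ]. A sketch on synthetic data, with the planted direction, learning rate, and noise level as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
v = np.array([1, 1j, -1, -1j]) / 2.0     # planted principal direction (unit norm)

def sample():
    # strong complex component along v plus weak isotropic complex noise
    return (2.0 * (rng.standard_normal() + 1j * rng.standard_normal()) * v
            + 0.3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n)))

w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w /= np.linalg.norm(w)
eta = 0.005
for _ in range(5000):
    x = sample()
    y = np.vdot(w, x)                              # neuron output y = w^H x
    w += eta * (np.conj(y) * x - np.abs(y) ** 2 * w)   # complex Oja rule

overlap = abs(np.vdot(w, v))             # approaches 1, up to a phase factor
```

The −|y|²w term plays the self-normalizing role it has in the real Oja rule, so the weight converges to unit norm without an explicit normalization step.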
16

Minin, Alexey, Alois Knoll, and Hans-Georg Zimmermann. "Complex Valued Recurrent Neural Network: From Architecture to Training." Journal of Signal and Information Processing 03, no. 02 (2012): 192–97. http://dx.doi.org/10.4236/jsip.2012.32026.

17

Wang, Xiao, and Haipeng Wang. "Forest Height Mapping Using Complex-Valued Convolutional Neural Network." IEEE Access 7 (2019): 126334–43. http://dx.doi.org/10.1109/access.2019.2938896.

18

Yang, W. H., K. K. Chan, and P. R. Chang. "Complex-valued neural network for direction of arrival estimation." Electronics Letters 30, no. 7 (March 31, 1994): 574–75. http://dx.doi.org/10.1049/el:19940400.

19

Olanrewaju, R. F., O. O. Khalifa, Aisha Abdulla, A. A. Aburas, and A. M. Zeki. "Determining Watermark Embedding Strength using Complex Valued Neural Network." Journal of Applied Sciences 11, no. 16 (August 1, 2011): 2907–15. http://dx.doi.org/10.3923/jas.2011.2907.2915.

20

Huang, Chengdai, Jinde Cao, Min Xiao, Ahmed Alsaedi, and Tasawar Hayat. "Bifurcations in a delayed fractional complex-valued neural network." Applied Mathematics and Computation 292 (January 2017): 210–27. http://dx.doi.org/10.1016/j.amc.2016.07.029.

21

Grasso, Francesco, Antonio Luchetta, and Stefano Manetti. "A Multi-Valued Neuron Based Complex ELM Neural Network." Neural Processing Letters 48, no. 1 (November 6, 2017): 389–401. http://dx.doi.org/10.1007/s11063-017-9745-9.

22

Zhang, Hao, and Xing-yuan Wang. "Complex projective synchronization of complex-valued neural network with structure identification." Journal of the Franklin Institute 354, no. 12 (August 2017): 5011–25. http://dx.doi.org/10.1016/j.jfranklin.2017.05.031.

23

Ding, Lei, Lin Xiao, Kaiqing Zhou, Yonghong Lan, Yongsheng Zhang, and Jichun Li. "An Improved Complex-Valued Recurrent Neural Network Model for Time-Varying Complex-Valued Sylvester Equation." IEEE Access 7 (2019): 19291–302. http://dx.doi.org/10.1109/access.2019.2896983.

24

Xiao, Lin, and Rongbo Lu. "A Fully Complex-Valued Gradient Neural Network for Rapidly Computing Complex-Valued Linear Matrix Equations." Chinese Journal of Electronics 26, no. 6 (November 1, 2017): 1194–97. http://dx.doi.org/10.1049/cje.2017.06.007.

25

Guo, Liang, Guanfeng Song, and Hongsheng Wu. "Complex-Valued Pix2pix—Deep Neural Network for Nonlinear Electromagnetic Inverse Scattering." Electronics 10, no. 6 (March 22, 2021): 752. http://dx.doi.org/10.3390/electronics10060752.

Abstract:
Nonlinear electromagnetic inverse scattering is an imaging technique with quantitative reconstruction and high resolution. Compared with conventional tomography, it takes into account the more realistic interaction between the internal structure of the scene and the electromagnetic waves. However, there are still open issues and challenges due to its inherent strong nonlinearity, ill-posedness and computational cost. To overcome these shortcomings, we apply an image translation network, named Complex-Valued Pix2pix, to the electromagnetic inverse scattering problem. Complex-Valued Pix2pix consists of two parts: a Generator and a Discriminator. The Generator employs a multilayer complex-valued convolutional neural network, while the Discriminator computes the maximum likelihoods between the original value and the reconstructed value for the real and imaginary parts of the complex values, respectively. The results show that Complex-Valued Pix2pix can learn the mapping from the initial contrast to the real contrast in microwave imaging models. Moreover, due to the introduction of the discriminator, Complex-Valued Pix2pix can capture more features of the nonlinearity than a traditional convolutional neural network (CNN) through adversarial training. Therefore, without considering the time cost of training, Complex-Valued Pix2pix may be a more effective way to solve inverse scattering problems than other deep learning methods. The main improvement of this work lies in the realization of a generative adversarial network (GAN) for the electromagnetic inverse scattering problem, adding a discriminator to the traditional CNN method to optimize network training. It has the prospect of outperforming conventional methods in terms of both image quality and computational efficiency.
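The complex-valued convolutions inside such a Generator reduce to four real convolutions via (wr + i·wi) ∗ (xr + i·xi) = (wr∗xr − wi∗xi) + i(wr∗xi + wi∗xr). A one-dimensional NumPy sketch of this identity (the data and kernel here are arbitrary; the paper's network is two-dimensional and learned):

```python
import numpy as np

def complex_conv1d(x, w):
    """Complex convolution assembled from four real convolutions:
    (wr + i wi) * (xr + i xi) = (wr*xr - wi*xi) + i(wr*xi + wi*xr)."""
    xr, xi = x.real, x.imag
    wr, wi = w.real, w.imag
    conv = lambda a, b: np.convolve(a, b, mode='valid')
    return (conv(xr, wr) - conv(xi, wi)) + 1j * (conv(xi, wr) + conv(xr, wi))

rng = np.random.default_rng(2)
x = rng.standard_normal(16) + 1j * rng.standard_normal(16)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)

out = complex_conv1d(x, w)
ref = np.convolve(x, w, mode='valid')   # NumPy handles complex inputs directly
agree = np.allclose(out, ref)
```

This decomposition is how complex-valued layers are typically implemented on top of frameworks that only provide real-valued convolution kernels.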
26

Aibinu, A. M., M. J. E. Salami, and A. A. Shafie. "Determination of Complex-Valued Parametric Model Coefficients Using Artificial Neural Network Technique." Advances in Artificial Neural Systems 2010 (June 30, 2010): 1–11. http://dx.doi.org/10.1155/2010/984381.

Abstract:
A new approach for determining the coefficients of complex-valued autoregressive (CAR) and complex-valued autoregressive moving average (CARMA) models using the complex-valued neural network (CVNN) technique is discussed in this paper. The CAR and complex-valued moving average (CMA) coefficients which constitute a CARMA model are computed simultaneously from the adaptive weights and coefficients of the linear activation functions in a two-layered CVNN. The performance of the proposed technique has been evaluated using simulated complex-valued data (CVD) with three different types of activation functions. The results show that the proposed method can accurately determine the model coefficients provided that the network is properly trained. Furthermore, application of the developed CVNN-based technique to MRI k-space reconstruction results in images with improved resolution.
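The idea of reading model coefficients off the adaptive weights of a linear complex-valued neuron can be sketched for a first-order CAR process trained with complex LMS: after convergence, the single complex weight approximates the generating coefficient. The coefficient, noise level, and learning rate below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
a_true = 0.6 + 0.3j                          # illustrative CAR(1) coefficient
T = 4000
x = np.zeros(T, dtype=complex)
for t in range(1, T):
    noise = 0.5 * (rng.standard_normal() + 1j * rng.standard_normal())
    x[t] = a_true * x[t - 1] + noise         # simulate the CAR(1) process

# one linear complex neuron predicting x[t] from x[t-1], trained by complex LMS
w, eta = 0j, 0.02
for t in range(1, T):
    e = x[t] - w * x[t - 1]                  # prediction error
    w += eta * e * np.conj(x[t - 1])         # gradient step on |e|^2

coeff_error = abs(w - a_true)                # the trained weight estimates a_true
```

The conjugate on the input in the update is what makes this the correct complex-gradient descent on the squared error |e|²; dropping it fails for complex data.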
27

Yang, Bin. "Small-time Scale Network Traffic Prediction Based on Complex-valued Neural Network." IOP Conference Series: Materials Science and Engineering 224 (July 2017): 012044. http://dx.doi.org/10.1088/1757-899x/224/1/012044.

28

Imah, Elly Matul, Atik Wintarti, R. Sulaiman, and Manuharawati Manuharawati. "Automation animal tracker using complex value neural network." MATEC Web of Conferences 197 (2018): 03020. http://dx.doi.org/10.1051/matecconf/201819703020.

Abstract:
Animal tracking is an important phase in animal behavior analysis. It leads to understanding how, when, and why animals use environmental resources, and how, where, and when they interact with each other, with other species, and with their environment. Understanding animal behavior provides the link to population distribution, which is essential for predicting human-caused environmental change and guiding conservation strategies. Tracking and detecting animals is time- and cost-consuming. Machine learning can relieve this burden by detecting animals automatically. The Complex-Valued Neural Network is a method of machine learning that is challenging and interesting to explore. This study applies the Complex-Valued Neural Network (CVNN) to animal tracking, especially to detecting animal species. The experimental results show that CVNN is robust in recognizing animals automatically.
29

Nitta, Tohru. "Redundancy of the parameters of the complex-valued neural network." Neurocomputing 49, no. 1-4 (December 2002): 423–28. http://dx.doi.org/10.1016/s0925-2312(02)00669-0.

30

Savitha, R., S. Suresh, and N. Sundararajan. "Projection-Based Fast Learning Fully Complex-Valued Relaxation Neural Network." IEEE Transactions on Neural Networks and Learning Systems 24, no. 4 (April 2013): 529–41. http://dx.doi.org/10.1109/tnnls.2012.2235460.

31

Gao, Jingkun, Bin Deng, Yuliang Qin, Hongqiang Wang, and Xiang Li. "Enhanced Radar Imaging Using a Complex-Valued Convolutional Neural Network." IEEE Geoscience and Remote Sensing Letters 16, no. 1 (January 2019): 35–39. http://dx.doi.org/10.1109/lgrs.2018.2866567.

32

Yu, Lingjuan, Yuehong Hu, Xiaochun Xie, Yun Lin, and Wen Hong. "Complex-Valued Full Convolutional Neural Network for SAR Target Classification." IEEE Geoscience and Remote Sensing Letters 17, no. 10 (October 2020): 1752–56. http://dx.doi.org/10.1109/lgrs.2019.2953892.

33

Hao, Zhang, Wang Xing-yuan, and Lin Xiao-hui. "Synchronization of complex-valued neural network with sliding mode control." Journal of the Franklin Institute 353, no. 2 (January 2016): 345–58. http://dx.doi.org/10.1016/j.jfranklin.2015.11.014.

34

Özbay, Yüksel, Sadık Kara, Fatma Latifoğlu, Rahime Ceylan, and Murat Ceylan. "Complex-valued wavelet artificial neural network for Doppler signals classifying." Artificial Intelligence in Medicine 40, no. 2 (June 2007): 143–56. http://dx.doi.org/10.1016/j.artmed.2007.02.001.

35

Ceylan, Rahime, Murat Ceylan, Yüksel Özbay, and Sadık Kara. "Fuzzy clustering complex-valued neural network to diagnose cirrhosis disease." Expert Systems with Applications 38, no. 8 (August 2011): 9744–51. http://dx.doi.org/10.1016/j.eswa.2011.02.025.

36

Dong, Tao, Jiaqi Bai, and Lei Yang. "Bifurcation Analysis of Delayed Complex-Valued Neural Network with Diffusions." Neural Processing Letters 50, no. 2 (August 10, 2018): 1019–33. http://dx.doi.org/10.1007/s11063-018-9899-0.

37

Zhang, Chunrui, Zhenzhang Sui, and Hongpeng Li. "Equivariant bifurcation in a coupled complex-valued neural network rings." Chaos, Solitons & Fractals 98 (May 2017): 22–30. http://dx.doi.org/10.1016/j.chaos.2017.03.009.

38

SREE HARI RAO, V., and GARIMELLA RAMA MURTHY. "GLOBAL DYNAMICS OF A CLASS OF COMPLEX VALUED NEURAL NETWORKS." International Journal of Neural Systems 18, no. 02 (April 2008): 165–71. http://dx.doi.org/10.1142/s0129065708001476.

Abstract:
In this paper, the activation dynamics of a complex-valued neural network are studied. Sufficient conditions for the global exponential stability of a unique equilibrium are obtained. Our results show that in the serial mode of operation, the network converges to a stable state.
39

Zhang, Wei, Chuandong Li, and Tingwen Huang. "Global robust stability of complex-valued recurrent neural networks with time-delays and uncertainties." International Journal of Biomathematics 07, no. 02 (March 2014): 1450016. http://dx.doi.org/10.1142/s1793524514500168.

Abstract:
This paper focuses on the existence, uniqueness and global robust stability of the equilibrium point for complex-valued recurrent neural networks with multiple time delays, under parameter uncertainties and with respect to two activation functions. Two sufficient conditions for robust stability of the considered neural networks are presented, established as two new time-independent relationships between the network parameters of the neural system. Finally, three illustrative examples are given to demonstrate the theoretical results.
40

TSUZUKI, Hirofumi, Mauricio KUGLER, Susumu KUROYANAGI, and Akira IWATA. "An Approach for Sound Source Localization by Complex-Valued Neural Network." IEICE Transactions on Information and Systems E96.D, no. 10 (2013): 2257–65. http://dx.doi.org/10.1587/transinf.e96.d.2257.

41

Li, Jin-dong, and Nan-jing Huang. "Asymptotical Stability for a Class of Complex-Valued Projective Neural Network." Journal of Optimization Theory and Applications 177, no. 1 (February 28, 2018): 261–70. http://dx.doi.org/10.1007/s10957-018-1252-2.

42

Hong, Xia, and Sheng Chen. "Modeling of Complex-Valued Wiener Systems Using B-Spline Neural Network." IEEE Transactions on Neural Networks 22, no. 5 (May 2011): 818–25. http://dx.doi.org/10.1109/tnn.2011.2119328.

43

Tan, Xiaofeng, Ming Li, Peng Zhang, Yan Wu, and Wanying Song. "Complex-Valued 3-D Convolutional Neural Network for PolSAR Image Classification." IEEE Geoscience and Remote Sensing Letters 17, no. 6 (June 2020): 1022–26. http://dx.doi.org/10.1109/lgrs.2019.2940387.

44

Rawat, Shubhankar, K. P. S. Rana, and Vineet Kumar. "A novel complex-valued convolutional neural network for medical image denoising." Biomedical Signal Processing and Control 69 (August 2021): 102859. http://dx.doi.org/10.1016/j.bspc.2021.102859.

45

Kobayashi, Masaki. "Storage Capacities of Twin-Multistate Quaternion Hopfield Neural Networks." Computational Intelligence and Neuroscience 2018 (November 1, 2018): 1–5. http://dx.doi.org/10.1155/2018/1275290.

Abstract:
A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model and can store multilevel information, such as image data. Storage capacity is an important problem of Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by the 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.
46

SUKSMONO, ANDRIYAN BAYU, and AKIRA HIROSE. "BEAMFORMING OF ULTRA-WIDEBAND PULSES BY A COMPLEX-VALUED SPATIO-TEMPORAL MULTILAYER NEURAL NETWORK." International Journal of Neural Systems 15, no. 01n02 (February 2005): 85–91. http://dx.doi.org/10.1142/s0129065705000128.

Abstract:
We present a neuro-beamformer of ultra-wideband (UWB) pulses employing a complex-valued spatio-temporal multilayer neural network, where complex-valued backpropagation through time (CV-BPTT) is used as the learning algorithm. The system performance is evaluated with a UWB monocycle pulse. Simulation results on suppressing multiple UWB interferers and steering toward multiple desired UWB pulses demonstrate the applicability of the proposed system.
47

Savitha, R., S. Suresh, and N. Sundararajan. "Metacognitive Learning in a Fully Complex-Valued Radial Basis Function Neural Network." Neural Computation 24, no. 5 (May 2012): 1297–328. http://dx.doi.org/10.1162/neco_a_00254.

Abstract:
Recent studies on human learning reveal that self-regulated learning in a metacognitive framework is the best strategy for efficient learning. As the machine learning algorithms are inspired by the principles of human learning, one needs to incorporate the concept of metacognition to develop efficient machine learning algorithms. In this letter we present a metacognitive learning framework that controls the learning process of a fully complex-valued radial basis function network and is referred to as a metacognitive fully complex-valued radial basis function (Mc-FCRBF) network. Mc-FCRBF has two components: a cognitive component containing the FC-RBF network and a metacognitive component, which regulates the learning process of FC-RBF. In every epoch, when a sample is presented to Mc-FCRBF, the metacognitive component decides what to learn, when to learn, and how to learn based on the knowledge acquired by the FC-RBF network and the new information contained in the sample. The Mc-FCRBF learning algorithm is described in detail, and both its approximation and classification abilities are evaluated using a set of benchmark and practical problems. Performance results indicate the superior approximation and classification performance of Mc-FCRBF compared to existing methods in the literature.
48

Chanthorn, Pharunyou, Grienggrai Rajchakit, Usa Humphries, Pramet Kaewmesri, Ramalingam Sriraman, and Chee Peng Lim. "A Delay-Dividing Approach to Robust Stability of Uncertain Stochastic Complex-Valued Hopfield Delayed Neural Networks." Symmetry 12, no. 5 (April 25, 2020): 683. http://dx.doi.org/10.3390/sym12050683.

Abstract:
In scientific disciplines and other engineering applications, most systems involve uncertainties, because uncertain parameters are unavoidable when modeling physical systems. In view of this, it is important to investigate dynamical systems with uncertain parameters. In the present study, a delay-dividing approach is devised to study the robust stability issue of uncertain neural networks. Specifically, the uncertain stochastic complex-valued Hopfield neural network (USCVHNN) with time delay is investigated. Here, the uncertainties of the system parameters are norm-bounded. Based on the Lyapunov mathematical approach and the homeomorphism principle, sufficient conditions for the global asymptotic stability of the USCVHNN are derived. To perform this derivation, we divide the complex-valued neural network (CVNN) into two parts, namely real and imaginary, using the delay-dividing approach. All the criteria are expressed in terms of linear matrix inequalities (LMIs). Based on two examples, we obtain good theoretical results that ascertain the usefulness of the proposed delay-dividing approach for the USCVHNN model.
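The real/imaginary division underlying the delay-dividing approach rests on a standard identity: the complex linear map z ↦ Wz is equivalent to a real 2n-dimensional map with the block matrix [[Re W, −Im W], [Im W, Re W]]. A quick numerical check of that identity (matrix size and values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
W = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# real 2n-dimensional block form of the complex map z -> W z
Wr, Wi = W.real, W.imag
W_block = np.block([[Wr, -Wi], [Wi, Wr]])
u = np.concatenate([z.real, z.imag])

lhs = W @ z                                   # complex computation
rhs = W_block @ u                             # equivalent real computation
same = np.allclose(np.concatenate([lhs.real, lhs.imag]), rhs)
```

This equivalence is what lets stability criteria for the complex-valued network be stated as LMIs over real matrices of twice the dimension.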
49

Jianping, Deng, N. Sundararajan, and P. Saratchandran. "Complex-Valued Minimal Radial Basis Function Neural Network for Nonlinear System Identification." IFAC Proceedings Volumes 33, no. 15 (June 2000): 781–86. http://dx.doi.org/10.1016/s1474-6670(17)39847-6.

50

Mizote, Kengo, Hiroki Kishikawa, Nobuo Goto, and Shin-ichiro Yanagiya. "Optical Label Routing Processing for BPSK Labels Using Complex-Valued Neural Network." Journal of Lightwave Technology 31, no. 12 (June 2013): 1867–76. http://dx.doi.org/10.1109/jlt.2013.2261051.
