Academic literature on the topic 'Complex-valued neural network'

Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles

Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic 'Complex-valued neural network.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Complex-valued neural network"

1. Liao, Bolin, Lin Xiao, Jie Jin, Lei Ding, and Mei Liu. "Novel Complex-Valued Neural Network for Dynamic Complex-Valued Matrix Inversion." Journal of Advanced Computational Intelligence and Intelligent Informatics 20, no. 1 (January 19, 2016): 132–38. http://dx.doi.org/10.20965/jaciii.2016.p0132.

Abstract:
Static matrix inverse solving has been studied for many years. In this paper, we aim at solving a dynamic complex-valued matrix inverse. Specifically, based on the artful combination of a conventional gradient neural network and the recently-proposed Zhang neural network, a novel complex-valued neural network model is presented and investigated for computing the dynamic complex-valued matrix inverse in real time. A hardware implementation structure is also offered. Moreover, both theoretical analysis and simulation results substantiate the effectiveness and advantages of the proposed recurrent neural network model for dynamic complex-valued matrix inversion.
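The Zhang-neural-network (ZNN) design combined here with a gradient network can be illustrated with a short numerical sketch. This is a minimal illustration of the generic ZNN design formula only, not the paper's model or its hardware structure; the example matrix A(t), the gain gamma, and the Euler integration step are assumptions chosen for the demo.

```python
import numpy as np

# Minimal sketch of a Zhang neural network (ZNN) for time-varying
# complex matrix inversion: drive E(t) = A(t) X(t) - I to zero via
# E'(t) = -gamma * E(t), which yields A X' = -A' X - gamma (A X - I).
# A(t), gamma, and the Euler step are illustrative assumptions.

def A(t):  # example time-varying complex matrix (assumed for the demo)
    return np.array([[2 + 0.5j * np.sin(t), 0.3j],
                     [0.1, 1.5 + 0.2 * np.cos(t)]])

def A_dot(t, h=1e-6):  # numerical time derivative of A(t)
    return (A(t + h) - A(t - h)) / (2 * h)

gamma, dt = 50.0, 1e-3
t, X = 0.0, np.linalg.inv(A(0.0))  # start at the true inverse
for _ in range(2000):
    E = A(t) @ X - np.eye(2)
    X_dot = np.linalg.solve(A(t), -A_dot(t) @ X - gamma * E)
    X, t = X + dt * X_dot, t + dt

print(np.linalg.norm(A(t) @ X - np.eye(2)))  # residual stays small as A drifts
```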

2. Nitta, Tohru. "Learning Transformations with Complex-Valued Neurocomputing." International Journal of Organizational and Collective Intelligence 3, no. 2 (April 2012): 81–116. http://dx.doi.org/10.4018/joci.2012040103.

Abstract:
The ability of the 1-n-1 complex-valued neural network to learn 2D affine transformations has been applied to the estimation of optical flows and the generation of fractal images. The complex-valued neural network has adaptability and generalization ability as inherent properties. This is what most distinguishes the ability of the 1-n-1 complex-valued neural network to learn 2D affine transformations from standard techniques for 2D affine transformations such as the Fourier descriptor. It is important to clarify the properties of complex-valued neural networks in order to accelerate their practical application. In this paper, first, the generalization ability of the 1-n-1 complex-valued neural network which has learned complicated rotations on a 2D plane is examined experimentally and analytically. Next, the behavior of the 1-n-1 complex-valued neural network that has learned a transformation on the Steiner circles is demonstrated, and the relationship between the values of the complex-valued weights after training and a linear transformation related to the Steiner circles is clarified via computer simulations. Furthermore, the relationship between the weight values of the 1-n-1 complex-valued neural network that has learned 2D affine transformations and the learning patterns used is elucidated. These research results make it possible to solve complicated problems more simply and efficiently with 1-n-1 complex-valued neural networks. As a matter of fact, an application of the 1-n-1 type complex-valued neural network to an associative memory is presented.
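As background for the transformations discussed above: a 2D rotation about the origin is multiplication by a unit complex number, so even a single complex weight can learn one by gradient descent. The sketch below illustrates only this basic fact, not the paper's 1-n-1 architecture; the target angle, step size, and training data are assumptions.

```python
import numpy as np

# A 2D rotation is multiplication by a unit complex number: a single
# complex weight w can learn it by gradient descent (illustrative
# sketch only; target angle, step size, and data are assumed).
rng = np.random.default_rng(0)
target = np.exp(1j * np.pi / 3)                      # rotate by 60 degrees
z = rng.normal(size=64) + 1j * rng.normal(size=64)   # training points
t = target * z                                       # rotated targets

w = 1.0 + 0.0j                                       # initial weight
for _ in range(200):
    err = w * z - t
    grad = np.mean(err * np.conj(z))                 # Wirtinger gradient dL/d(conj w)
    w -= 0.1 * grad

print(w, target)  # w converges to exp(i*pi/3)
```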

3. Nitta, Tohru. "The Uniqueness Theorem for Complex-Valued Neural Networks with Threshold Parameters and the Redundancy of the Parameters." International Journal of Neural Systems 18, no. 02 (April 2008): 123–34. http://dx.doi.org/10.1142/s0129065708001439.

Abstract:
This paper proves the uniqueness theorem for 3-layered complex-valued neural networks in which the threshold parameters of the hidden neurons can take nonzero values. That is, if a 3-layered complex-valued neural network is irreducible, then the 3-layered complex-valued neural network that approximates a given complex-valued function is uniquely determined up to a finite group of transformations of the learnable parameters of the complex-valued neural network.

4. Dong, Tao, and Tingwen Huang. "Neural Cryptography Based on Complex-Valued Neural Network." IEEE Transactions on Neural Networks and Learning Systems 31, no. 11 (November 2020): 4999–5004. http://dx.doi.org/10.1109/tnnls.2019.2955165.

5. Nitta, Tohru. "Orthogonality of Decision Boundaries in Complex-Valued Neural Networks." Neural Computation 16, no. 1 (January 1, 2004): 73–97. http://dx.doi.org/10.1162/08997660460734001.

Abstract:
This letter presents some results of an analysis on the decision boundaries of complex-valued neural networks whose weights, threshold values, and input and output signals are all complex numbers. The main results may be summarized as follows. (1) A decision boundary of a single complex-valued neuron consists of two hypersurfaces that intersect orthogonally and divides a decision region into four equal sections. The XOR problem and the detection-of-symmetry problem, which cannot be solved with two-layered real-valued neural networks, can be solved by two-layered complex-valued neural networks with orthogonal decision boundaries, which reveals a potent computational power of complex-valued neural nets. Furthermore, the fading equalization problem can be successfully solved by the two-layered complex-valued neural network with the highest generalization ability. (2) A decision boundary of a three-layered complex-valued neural network has the orthogonal property as a basic structure, and its two hypersurfaces approach orthogonality as all the net inputs to each hidden neuron grow. In particular, most of the decision boundaries in the three-layered complex-valued neural network intersect orthogonally when the network is trained using the Complex-BP algorithm. As a result, the orthogonality of the decision boundaries improves its generalization ability. (3) The average learning speed of the Complex-BP is several times faster than that of the Real-BP, and the standard deviation of the learning speed of the Complex-BP is smaller than that of the Real-BP. It seems that the complex-valued neural network and the related algorithm are natural for learning complex-valued patterns for the above reasons.
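Result (1) is easy to reproduce in a few lines: the zero sets Re(net) = 0 and Im(net) = 0 of a single complex-valued neuron are orthogonal lines that split the input plane into four quadrants, and reading the output off the quadrant solves XOR. The sketch below is not from the paper; the weight and threshold are hand-picked assumptions.

```python
# A single complex-valued neuron solving XOR (illustrative sketch; the
# weight w and threshold theta below are hand-picked, not learned).
# The boundaries Re(net) = 0 and Im(net) = 0 are two orthogonal lines
# that cut the input plane into four quadrants.

w, theta = 1 + 0j, -(0.5 + 0.5j)  # assumed parameters

def xor(x1: int, x2: int) -> int:
    z = complex(x1, x2)           # pack both inputs into one complex number
    net = w * z + theta
    # output 1 when Re(net) and Im(net) land in opposite-sign quadrants
    return int((net.real > 0) != (net.imag > 0))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))    # prints the XOR truth table
```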

6. Li, Sufang, and Mingyan Jiang. "The New Complex-Valued Wavelet Neural Network." TELKOMNIKA (Telecommunication Computing Electronics and Control) 12, no. 3 (September 1, 2014): 613. http://dx.doi.org/10.12928/telkomnika.v12i3.95.

7. Kobayashi, Masaki. "Fast Recall for Complex-Valued Hopfield Neural Networks with Projection Rules." Computational Intelligence and Neuroscience 2017 (2017): 1–6. http://dx.doi.org/10.1155/2017/4894278.

Abstract:
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, and CHNNs are available for the storage of multilevel data, such as gray-scale images. CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting the local minima. In the present work, we propose a new recall algorithm that eliminates the local minima. Through computer simulations, we show that our proposed recall algorithm not only accelerates recall but also improves noise tolerance.
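As background for this abstract, the sketch below shows the standard ingredients it presumes: K-state neurons on the unit circle, a projection-rule weight matrix W = S(S^H S)^{-1} S^H, and asynchronous recall. It is a generic textbook CHNN, not the paper's accelerated recall algorithm; the resolution K, the pattern count, the noise level, and the update schedule are assumptions.

```python
import numpy as np

# Generic complex-valued Hopfield network (CHNN) with the projection
# rule. K-state neurons live on the unit circle; the weight matrix
# W = S (S^H S)^{-1} S^H projects onto the stored patterns.
# K, the patterns, and the update order are assumed for the demo.
rng = np.random.default_rng(1)
K, n, p = 8, 16, 3                                   # states, neurons, patterns
states = np.exp(2j * np.pi * np.arange(K) / K)
S = states[rng.integers(0, K, size=(n, p))]          # stored patterns
W = S @ np.linalg.inv(S.conj().T @ S) @ S.conj().T   # projection rule

def quantize(v):                                     # snap each entry to nearest state
    return states[np.argmax((v[:, None] * states.conj()).real, axis=1)]

x = quantize(S[:, 0] * np.exp(1j * rng.normal(scale=0.3, size=n)))  # noisy probe
for _ in range(10):                                  # asynchronous recall sweeps
    for i in range(n):
        x[i] = quantize(np.array([W[i] @ x]))[0]

print(np.allclose(x, S[:, 0]))                       # usually recovers pattern 0
```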

8. Zhang, Yunong, Zhan Li, and Kene Li. "Complex-Valued Zhang Neural Network for Online Complex-Valued Time-Varying Matrix Inversion." Applied Mathematics and Computation 217, no. 24 (August 2011): 10066–73. http://dx.doi.org/10.1016/j.amc.2011.04.085.

9. Kobayashi, Masaki. "Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks." Neural Computation 32, no. 11 (November 2020): 2237–48. http://dx.doi.org/10.1162/neco_a_01320.

Abstract:
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. The weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks. Since their architectures are much more complicated than that of a CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by the bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks. By computer simulations, we also find that the projection rule for hyperbolic-valued Hopfield neural networks in synchronous mode maintains a high noise tolerance.

Dissertations / Theses on the topic "Complex-valued neural network"

1. Hu, Qiong. "Statistical Parametric Speech Synthesis Based on Sinusoidal Models." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28719.

Abstract:
This study focuses on improving the quality of statistical speech synthesis based on sinusoidal models. Vocoders play a crucial role during the parametrisation and reconstruction process, so we first conduct an experimental comparison of a broad range of leading vocoder types. Although our study shows that for analysis / synthesis, sinusoidal models with complex amplitudes can generate higher-quality speech than source-filter ones, the component sinusoids are correlated with each other, and the number of parameters is high and varies from frame to frame, which constrains their application to statistical speech synthesis. Therefore, we first propose a perceptually based dynamic sinusoidal model (PDM) to decrease and fix the number of components typically used in the standard sinusoidal model. Then, in order to apply the proposed vocoder in an HMM-based speech synthesis system (HTS), two strategies for modelling sinusoidal parameters have been compared. In the first method (DIR parameterisation), features extracted from the fixed- and low-dimensional PDM are statistically modelled directly. In the second method (INT parameterisation), we convert both static amplitude and dynamic slope from all the harmonics of a signal, which we term the Harmonic Dynamic Model (HDM), to intermediate parameters (regularised cepstral coefficients (RDC)) for modelling. Our results show that HDM with intermediate parameters can generate quality comparable to STRAIGHT. As correlations between features in the dynamic model cannot be modelled satisfactorily by a typical HMM-based system with diagonal covariance, we have applied and tested a deep neural network (DNN) for modelling features from these two methods. To fully exploit DNN capabilities, we investigate ways to combine INT and DIR at the level of both DNN modelling and waveform generation. For DNN training, we propose to use multi-task learning to model cepstra (from INT) and log amplitudes (from DIR) as primary and secondary tasks. We conclude from our results that sinusoidal models are indeed highly suited for statistical parametric synthesis. The proposed method outperforms the state-of-the-art STRAIGHT-based equivalent when used in conjunction with DNNs. To further improve the voice quality, phase features generated from the proposed vocoder also need to be parameterised and integrated into statistical modelling. Here, an alternative statistical model referred to as the complex-valued neural network (CVNN), which treats complex coefficients as a whole, is proposed to model complex amplitude explicitly. A complex-valued back-propagation algorithm using a logarithmic minimisation criterion which includes both amplitude and phase errors is used as a learning rule. Three parameterisation methods are studied for mapping text to acoustic features: RDC / real-valued log amplitude, complex-valued amplitude with minimum phase, and complex-valued amplitude with mixed phase. Our results show the potential of using CVNNs for modelling both real and complex-valued acoustic features. Overall, this thesis has established competitive alternative vocoders for speech parametrisation and reconstruction. The utilisation of the proposed vocoders with various acoustic models (HMM / DNN / CVNN) clearly demonstrates that it is compelling to apply them to parametric statistical speech synthesis.
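A logarithmic minimisation criterion of the kind mentioned, penalising amplitude and phase errors jointly, can be sketched as follows. This is a generic form assumed for illustration and is not necessarily the exact criterion used in the thesis.

```python
import numpy as np

# Generic logarithmic complex error (assumed form): d = log(y/t) has
# d.real = log|y| - log|t| (amplitude error) and d.imag = arg(y) - arg(t)
# wrapped to (-pi, pi] (phase error), so |d|^2 penalises both at once.
def log_error(y: complex, t: complex) -> float:
    d = np.log(y / t)
    return float(d.real**2 + d.imag**2)

print(log_error(1.1 * np.exp(0.2j), 1.0 + 0.0j))  # small amplitude and phase error
```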

2. Minin, Alexey. "Modeling of Dynamical Systems with Complex Valued Recurrent Neural Networks." Examiners: Alois Knoll and Mark J. Embrechts; supervisors: Alois Knoll and Hans-Georg Zimmermann. München: Universitätsbibliothek der TU München, 2012. http://d-nb.info/1024963985/34.

3. Wang, Shu-Fan (王書凡). "Monaural Source Separation Based on Complex-Valued Deep Neural Network." Master's thesis, National Central University, Department of Computer Science and Information Engineering (ROC academic year 104), 2016. http://ndltd.ncl.edu.tw/handle/fyvr7y.

Abstract:
Deep neural networks (DNNs) have become a popular means of separating a target source from a mixed signal. Almost all DNN-based methods modify only the magnitude spectrum of the mixture. The phase spectrum is left unchanged, which is inherent in the short-time Fourier transform (STFT) coefficients of the input signal. However, recent studies have revealed that incorporating phase information can improve the perceptual quality of separated sources. Accordingly, in this paper, estimating the STFT coefficients of target sources from an input mixture is regarded as a regression problem. A fully complex-valued deep neural network is developed herein to learn the nonlinear mapping from the complex-valued STFT coefficients of a mixture to those of its sources. The proposed method is applied to speech separation and singing separation.
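Training such a fully complex-valued network rests on complex (Wirtinger) gradients. The sketch below shows the idea for a single complex linear map fitted by least squares; it is a generic illustration under assumed dimensions and data, not the network from the thesis.

```python
import numpy as np

# Wirtinger-gradient descent for a complex linear map W: C^d -> C^m,
# minimising L = mean ||W x - t||^2. The gradient with respect to
# conj(W) is (W x - t) x^H. Dimensions and data are assumed.
rng = np.random.default_rng(0)
d, m, n = 4, 2, 256
X = rng.normal(size=(d, n)) + 1j * rng.normal(size=(d, n))
W_true = rng.normal(size=(m, d)) + 1j * rng.normal(size=(m, d))
T = W_true @ X                          # targets from an assumed true map

W = np.zeros((m, d), dtype=complex)
for _ in range(500):
    E = W @ X - T                       # residuals, shape (m, n)
    grad = E @ X.conj().T / n           # dL/d(conj W), Wirtinger calculus
    W -= 0.1 * grad

print(np.abs(W - W_true).max())         # approaches zero
```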

4. Yu, Kuo (俞果). "Complex-Valued Deep Recurrent Neural Network for Singing Voice Separation." Master's thesis, National Central University, Department of Computer Science and Information Engineering (ROC academic year 105), 2017. http://ndltd.ncl.edu.tw/handle/4waab5.

Abstract:
Deep neural networks (DNNs) have performed impressively in the processing of multimedia signals. Most DNN-based approaches were developed to handle real-valued data; very few have been designed for complex-valued data, despite their being essential for processing various types of multimedia signals. Accordingly, this work presents a complex-valued deep recurrent neural network (C-DRNN) for singing voice separation. The C-DRNN operates in the complex-valued short-time discrete Fourier transform (STFT) domain. A key aspect of the C-DRNN is that its activations and weights are complex-valued. The goal is to reconstruct the singing voice and the background music from a mixed signal. For error back-propagation, CR-calculus is utilized to calculate the complex-valued gradients of the objective function. To reinforce model regularity, two constraints are incorporated into the cost function of the C-DRNN. The first is an additional masking layer that ensures that the sum of the separated sources equals the input mixture. The second is a discriminative term that preserves the mutual difference between the two separated sources. Finally, the proposed method is evaluated using the MIR-1K dataset and a singing voice separation task. Experimental results demonstrate that the proposed method outperforms state-of-the-art DNN-based methods.
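The masking-layer constraint described in the abstract has a standard time-frequency form. A plausible version (an assumption on our part, not necessarily the exact layer from the thesis) rescales the two network outputs bin by bin so that they sum to the mixture:

```python
import numpy as np

# Plausible masking layer for two-source separation (assumed form):
# rescale network outputs y1, y2 bin-by-bin so s1 + s2 == mixture x.
def masking_layer(y1, y2, x, eps=1e-8):
    m1 = np.abs(y1) / (np.abs(y1) + np.abs(y2) + eps)  # soft ratio mask
    s1 = m1 * x
    s2 = x - s1                     # the two estimates sum to the mixture
    return s1, s2
```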

5. Hu, Jin. "Dynamical Analysis of Complex-Valued Recurrent Neural Networks with Time-Delays." Ph.D. thesis, Chinese University of Hong Kong, 2013. http://library.cuhk.edu.hk/record=b5884392. Includes bibliographical references (leaves 140–153); abstracts also in Chinese.

Books on the topic "Complex-valued neural network"

1. Hirose, Akira. Complex-Valued Neural Networks. 2nd ed. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012.

2. Hirose, Akira, ed. Complex-Valued Neural Networks. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.

3. Hirose, Akira. Complex-Valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/978-3-540-33457-6.

4. Hirose, Akira. Complex-Valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-27632-3.

5. SpringerLink (Online service), ed. Complex-Valued Neural Networks with Multi-Valued Neurons. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011.

6. Aizenberg, Igor. Complex-Valued Neural Networks with Multi-Valued Neurons. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20353-4.

7. Suresh, Sundaram. Supervised Learning with Complex-Valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

8. Suresh, Sundaram, Narasimhan Sundararajan, and Ramasamy Savitha. Supervised Learning with Complex-Valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-29491-4.

9. Mandic, Danilo P. Complex Valued Nonlinear Adaptive Filters: Noncircularity, Widely Linear, and Neural Models. Hoboken, NJ: Wiley, 2009.

10. Hirose, Akira. Complex-Valued Neural Networks. Springer, 2012.

Book chapters on the topic "Complex-valued neural network"

1. Wong, Wai Kit, Gin Chong Lee, Chu Kiong Loo, Way Soong Lim, and Raymond Lock. "Quaternionic Fuzzy Neural Network for View-Invariant Color Face Image Recognition." In Complex-Valued Neural Networks, 235–78. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch10.

2. Savitha, Ramasamy, Sundaram Suresh, and Narasimhan Sundararajan. "Meta-Cognitive Complex-Valued Relaxation Network and Its Sequential Learning Algorithm." In Complex-Valued Neural Networks, 153–83. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch7.

3. Manyakov, Nikolay V., Igor Aizenberg, Nikolay Chumerin, and Marc M. Van Hulle. "Multilayer Feedforward Neural Network with Multi-Valued Neurons for Brain-Computer Interfacing." In Complex-Valued Neural Networks, 185–208. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118590072.ch8.

4. Aizenberg, Igor. "Multilayer Feedforward Neural Network Based on Multi-Valued Neurons (MLMVN)." In Complex-Valued Neural Networks with Multi-Valued Neurons, 133–72. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20353-4_4.

5. Zimmermann, Hans-Georg, Alexey Minin, and Victoria Kusherbaeva. "Historical Consistent Complex Valued Recurrent Neural Network." In Lecture Notes in Computer Science, 185–92. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21735-7_23.

6. Ceylan, Murat, Nurettin Çetinkaya, Rahime Ceylan, and Yüksel Özbay. "Comparison of Complex-Valued Neural Network and Fuzzy Clustering Complex-Valued Neural Network for Load-Flow Analysis." In Artificial Intelligence and Neural Networks, 92–99. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11803089_11.

7. Suresh, Sundaram, Narasimhan Sundararajan, and Ramasamy Savitha. "Complex-Valued Self-Regulatory Resource Allocation Network (CSRAN)." In Supervised Learning with Complex-Valued Neural Networks, 135–68. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-29491-4_8.

8. Xiao, Lin, Weiwei Meng, Rongbo Lu, Xi Yang, Bolin Liao, and Lei Ding. "A Fully Complex-Valued Neural Network for Rapid Solution of Complex-Valued Systems of Linear Equations." In Advances in Neural Networks – ISNN 2015, 444–51. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25393-0_49.

9. Mishra, Deepak, Arvind Tolambiya, Amit Shukla, and Prem K. Kalra. "Stability Analysis for Higher Order Complex-Valued Hopfield Neural Network." In Neural Information Processing, 608–15. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11893028_68.

10. Ikeda, Akira, and Yoshikazu Washizawa. "Spontaneous EEG Classification Using Complex Valued Neural Network." In Communications in Computer and Information Science, 495–503. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36808-1_54.

Conference papers on the topic "Complex-valued neural network"

1. Xiao, Jin, Yi Hu, and Shouyang Wang. "Complex-Valued GMDH-Type Neural Network for Real-Valued Classification Problems." In 2013 Sixth International Conference on Business Intelligence and Financial Engineering (BIFE). IEEE, 2013. http://dx.doi.org/10.1109/bife.2013.16.

2. Lekic, Vladimir, and Zdenka Babic. "Neneta: Heterogeneous Computing Complex-Valued Neural Network Framework." In 2017 40th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO). IEEE, 2017. http://dx.doi.org/10.23919/mipro.2017.7973416.

3. Yi, Qian, Lin Xiao, Yongsheng Zhang, Bolin Liao, Lei Ding, and Hua Peng. "Nonlinearly Activated Complex-Valued Gradient Neural Network for Complex Matrix Inversion." In 2018 Ninth International Conference on Intelligent Control and Information Processing (ICICIP). IEEE, 2018. http://dx.doi.org/10.1109/icicip.2018.8606673.

4. Ronghua, Ji, Zhang Shulei, Zheng Lihua, Liu Qiuxia, and Iftikhar Ahmed Saeed. "Prediction of Soil Moisture with Complex-Valued Neural Network." In 2017 29th Chinese Control and Decision Conference (CCDC). IEEE, 2017. http://dx.doi.org/10.1109/ccdc.2017.7978706.

5. Suresh, S., R. Savitha, and N. Sundararajan. "A Fast Learning Fully Complex-Valued Relaxation Network (FCRN)." In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033384.

6. Krcmar, Igor R., Petar S. Maric, and Milorad M. Bozic. "A Class of Neural Adaptive FIR Filters for Complex-Valued Load Prediction." In 2010 10th Symposium on Neural Network Applications in Electrical Engineering (NEUREL 2010). IEEE, 2010. http://dx.doi.org/10.1109/neurel.2010.5644047.

7. Guruler, Huseyin, and Musa Peker. "A Software Tool for Complex-Valued Neural Network: CV-ANN." In 2015 23rd Signal Processing and Communications Applications Conference (SIU). IEEE, 2015. http://dx.doi.org/10.1109/siu.2015.7130272.

8. Amilia, Sindi, Mahmud Dwi Sulistiyo, and Retno Novi Dayawati. "Face Image-Based Gender Recognition Using Complex-Valued Neural Network." In 2015 3rd International Conference on Information and Communication Technology (ICoICT). IEEE, 2015. http://dx.doi.org/10.1109/icoict.2015.7231422.

9. Cevik, Hasan Huseyin, Yunus Emre Acar, and Mehmet Cunkas. "Day Ahead Wind Power Forecasting Using Complex Valued Neural Network." In 2018 International Conference on Smart Energy Systems and Technologies (SEST). IEEE, 2018. http://dx.doi.org/10.1109/sest.2018.8495637.

10. Marseet, Akram, and Ferat Sahin. "Application of Complex-Valued Convolutional Neural Network for Next Generation Wireless Networks." In 2017 IEEE Western New York Image and Signal Processing Workshop (WNYISPW). IEEE, 2017. http://dx.doi.org/10.1109/wnyipw.2017.8356260.