Academic literature on the topic 'Backpropagation Neural Networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference papers, reports, and other scholarly sources on the topic 'Backpropagation Neural Networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Backpropagation Neural Networks"

1. Wythoff, Barry J. "Backpropagation neural networks." Chemometrics and Intelligent Laboratory Systems 18, no. 2 (1993): 115–55. http://dx.doi.org/10.1016/0169-7439(93)80052-j.
2. Harrington, Peter de B. "Temperature-Constrained Backpropagation Neural Networks." Analytical Chemistry 66, no. 6 (1994): 802–7. http://dx.doi.org/10.1021/ac00078a007.
3. Rosenbaum, Robert. "On the relationship between predictive coding and backpropagation." PLOS ONE 17, no. 3 (2022): e0266102. http://dx.doi.org/10.1371/journal.pone.0266102.

Abstract: Artificial neural networks are often interpreted as abstract models of biological neuronal networks, but they are typically trained using the biologically unrealistic backpropagation algorithm and its variants. Predictive coding has been proposed as a potentially more biologically realistic alternative to backpropagation for training neural networks. This manuscript reviews and extends recent work on the mathematical relationship between predictive coding and backpropagation for training feedforward artificial neural networks on supervised learning tasks. Implications of these results for the…
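
For readers weighing this comparison, the standard backpropagation recursion that serves as the reference point can be stated compactly (conventional textbook notation, not reproduced from the paper):

```latex
% Backpropagation through layers l = L, ..., 1, with
% pre-activations z^{(l)} = W^{(l)} a^{(l-1)} + b^{(l)} and activations a^{(l)} = f(z^{(l)}):
\delta^{(L)} = \nabla_{a^{(L)}}\mathcal{L} \odot f'\big(z^{(L)}\big), \qquad
\delta^{(l)} = \big(W^{(l+1)\top}\delta^{(l+1)}\big) \odot f'\big(z^{(l)}\big), \qquad
\frac{\partial \mathcal{L}}{\partial W^{(l)}} = \delta^{(l)}\, a^{(l-1)\top}.
```

Predictive coding replaces this global error recursion with iterative, local prediction-error updates; the paper examines how closely the resulting weight changes match these gradients.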
4. Keller, James M., and Hossein Tahani. "Backpropagation neural networks for fuzzy logic." Information Sciences 62, no. 3 (1992): 205–21. http://dx.doi.org/10.1016/0020-0255(92)90016-2.
5. Cho, Jong Man. "Chromosome classification using backpropagation neural networks." IEEE Engineering in Medicine and Biology Magazine 19, no. 1 (2000): 28–33. http://dx.doi.org/10.1109/51.816241.
6. Li, Mingfeng. "Comprehensive Review of Backpropagation Neural Networks." Academic Journal of Science and Technology 9, no. 1 (2024): 150–54. http://dx.doi.org/10.54097/51y16r47.

Abstract: The Backpropagation Neural Network (BPNN) is a deep learning model inspired by the biological neural network. Introduced in the 1980s, the BPNN quickly became a focal point in neural network research due to its outstanding learning capability and adaptability. The network structure consists of input, hidden, and output layers, and it optimizes weights through the backpropagation algorithm, widely applied in image recognition, speech processing, natural language processing, and more. The mathematical model of neurons describes the relationship between input and output, and the training process…
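
The input-hidden-output structure and backpropagation weight updates summarized in this review are easiest to see in a toy implementation. Below is a minimal sketch (illustrative, not code from the paper): a one-hidden-layer network with sigmoid activations learning XOR by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10_000):
    # Forward pass through the three-layer structure
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule applied to the squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates of weights and biases
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # typically converges toward [[0], [1], [1], [0]]
```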
7. Saha, Sanjit Kumar. "Performance Evaluation of Neural Networks in Road Sign Recognition." International Journal of Advanced Engineering Research and Science 11, no. 2 (2024): 007–12. http://dx.doi.org/10.22161/ijaers.11.2.

Abstract: This paper presents an in-depth study of road sign recognition techniques leveraging neural networks. Road sign recognition stands as a critical component of intelligent transportation systems, contributing to enhanced road safety and efficient traffic management. The paper focuses on exploring various neural network architectures, for example the backpropagation neural network and a hybrid neural network combining two networks (a backpropagation neural network and a bidirectional associative memory), along with training methodologies, dataset considerations, and performance evaluations for a…
8. Kusumoputro, Benyamin, and Li Na. "Discrimination of Two-Mixture Fragrances Odor Using Artificial Odor Recognition System with Ensemble Backpropagation Neural Networks." Applied Mechanics and Materials 303-306 (February 2013): 1514–18. http://dx.doi.org/10.4028/www.scientific.net/amm.303-306.1514.

Abstract: The human sensory test is often used to obtain the sensory quantities of odors; however, fluctuations in results due to the experts' condition can cause discrepancies among panelists. We have developed an artificial odor recognition system using a quartz resonator sensor and backpropagation neural networks as the pattern recognition system in order to eliminate the disadvantages of the human panelist system. The backpropagation neural networks show a high recognition rate for single-component odors, but the rate becomes very low when they are used to discriminate mixed-fragrance odors. In this paper we…
9. Sawitri, Made Nita Dwi, I Wayan Sumarjaya, and Ni Ketut Tari Tastrawati. "Peramalan Menggunakan Metode Backpropagation Neural Network [Forecasting Using the Backpropagation Neural Network Method]." E-Jurnal Matematika 7, no. 3 (2018): 264. http://dx.doi.org/10.24843/mtk.2018.v07.i03.p213.

Abstract: The purpose of this study is to forecast the price of rice in the city of Denpasar in 2017 using the backpropagation neural network method. A backpropagation neural network is an artificial neural network model that finds optimal weight values. Artificial neural networks are information processing systems with certain performance characteristics similar to those of human neural networks. This analysis uses time series data of rice prices in the city of Denpasar from January 2001 until December 2016. The results of this research conclude that the lowest rice price is predicted in July 2017…
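
Forecasting with a BPNN as in this study typically means feeding the network lagged observations. A small sketch of that windowing step, with hypothetical prices and an assumed lag of 3 (the paper's actual lag order and scaling are not stated in the abstract):

```python
import numpy as np

def make_windows(series, lag):
    # Turn a univariate series into (lagged inputs, next value) training pairs
    # for a feedforward network trained by backpropagation.
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

prices = np.array([9.1, 9.3, 9.2, 9.5, 9.8, 9.7])  # hypothetical monthly prices
X, y = make_windows(prices, lag=3)
print(X.shape, y)  # (3, 3) [9.5 9.8 9.7]
```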
10. Sanubary, Iklas. "Brain Tumor Detection Using Backpropagation Neural Networks." Indonesian Journal of Physics and Nuclear Applications 3, no. 3 (2018): 83–88. http://dx.doi.org/10.24246/ijpna.v3i3.83-88.

Abstract: A study of brain tumor detection has been carried out using backpropagation neural networks with Gray Level Co-occurrence Matrix (GLCM) feature extraction. CT-scan images of the brain, consisting of 12 normal and 13 abnormal (tumor) images, are analyzed. The preprocessing stage begins with cropping each image to 256 x 256 pixels, then converting the color images to grayscale and equalizing the histogram to improve image quality. GLCM is used to calculate statistical features determined by 5 parameters, i.e., contrast, correlation, energy, and homogeneity, for…
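
The preprocessing chain described here (grayscale conversion, histogram equalization, GLCM statistics) can be sketched with scikit-image. The distance and angle settings below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
from skimage import exposure
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_image):
    # Equalize the histogram (returns floats in [0, 1]), rescale to 256 levels
    eq = (exposure.equalize_hist(gray_image) * 255).astype(np.uint8)
    # Co-occurrence matrix at distance 1, angle 0 (assumed settings)
    glcm = graycomatrix(eq, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "correlation", "energy", "homogeneity")
    return np.array([graycoprops(glcm, p)[0, 0] for p in props])

# A random stand-in for a cropped 256 x 256 grayscale CT slice; the resulting
# feature vector would feed a small backpropagation classifier.
features = glcm_features(np.random.randint(0, 256, (256, 256), dtype=np.uint8))
print(features)
```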

Dissertations / Theses on the topic "Backpropagation Neural Networks"

1. Hettinger, Christopher James. "Hyperparameters for Dense Neural Networks." BYU ScholarsArchive, 2019. https://scholarsarchive.byu.edu/etd/7531.

Abstract: Neural networks can perform an incredible array of complex tasks, but successfully training a network is difficult because it requires us to minimize a function about which we know very little. In practice, developing a good model requires both intuition and a lot of guess-and-check. In this dissertation, we study a type of fully-connected neural network that improves on standard rectifier networks while retaining their useful properties. We then examine this type of network and its loss function from a probabilistic perspective. This analysis leads to a new rule for parameter initialization…
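
As background to the initialization theme, the standard He/Kaiming rule for rectifier networks is shown below; the dissertation derives its own initialization rule, which this sketch does not reproduce.

```python
import numpy as np

def he_init(n_in, n_out, rng=None):
    # He/Kaiming initialization: variance 2/n_in keeps activation magnitudes
    # roughly stable across the layers of a rectifier (ReLU) network.
    if rng is None:
        rng = np.random.default_rng(0)
    return rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))

W = he_init(784, 256)
print(W.std())  # close to sqrt(2 / 784) ~= 0.0505
```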
2. Bendelac, Shiri. "Enhanced Neural Network Training Using Selective Backpropagation and Forward Propagation." Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/83714.

Abstract: Neural networks are making headlines every day as the tool of the future, powering artificial intelligence programs and supporting technologies never seen before. However, the training of neural networks can take days or even weeks for bigger networks, and requires the use of supercomputers and GPUs in academia and industry in order to achieve state of the art results. This thesis discusses employing selective measures to determine when to backpropagate and forward propagate in order to reduce training time while maintaining classification performance. This thesis tests these new algorithms…
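
The central idea, running the forward pass on every example but backpropagating only the informative ones, can be illustrated generically. The loss-threshold rule below is a hypothetical stand-in for the thesis's actual selection criteria:

```python
import numpy as np

def select_for_backprop(losses, threshold):
    # Keep only examples whose loss still exceeds the threshold; confidently
    # classified examples are skipped, saving backward-pass computation.
    keep = losses > threshold
    return keep, keep.mean()

losses = np.array([0.01, 0.90, 0.40, 0.002])  # per-example forward-pass losses
keep, frac = select_for_backprop(losses, threshold=0.1)
print(keep, frac)  # [False True True False] 0.5 -> half the batch is backpropagated
```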
3. Bonnell, Jeffrey A. "Implementation of a New Sigmoid Function in Backpropagation Neural Networks." Digital Commons @ East Tennessee State University, 2011. https://dc.etsu.edu/etd/1342.

Abstract: This thesis presents the use of a new sigmoid activation function in backpropagation artificial neural networks (ANNs). ANNs using conventional activation functions may generalize poorly when trained on a set which includes quirky, mislabeled, unbalanced, or otherwise complicated data. This new activation function is an attempt to improve generalization and reduce overtraining on mislabeled or irrelevant data by restricting training when inputs to the hidden neurons are sufficiently small. This activation function includes a flattened, low-training region which grows or shrinks during back-propagation…
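
For reference, the conventional logistic sigmoid and the derivative that drives its backpropagation updates are shown below; the thesis modifies this shape with a flattened low-training region, whose exact functional form is not given in the abstract.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative used in backpropagation: s * (1 - s). It peaks at 0.25 for
    # x = 0 and vanishes for large |x|; a "flattened" variant would further
    # suppress it near small inputs.
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid_grad(np.array([-4.0, 0.0, 4.0])))  # [~0.018, 0.25, ~0.018]
```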
4. Civelek, Ferda N. "Temporal Connectionist Expert Systems Using a Temporal Backpropagation Algorithm." Thesis, University of North Texas, 1993. https://digital.library.unt.edu/ark:/67531/metadc278824/.

Abstract: Representing time has been considered a general problem for artificial intelligence research for many years. More recently, the question of representing time has become increasingly important in representing human decision-making processes through connectionist expert systems. Because most human behaviors unfold over time, any attempt to represent expert performance without considering its temporal nature can often lead to incorrect results. A temporal feedforward neural network model that can be applied to a number of neural network application areas, including connectionist expert systems,…
5. Yang, Yini. "Training Neural Networks with Evolutionary Algorithms for Flash Call Verification." Thesis, KTH, School of Electrical Engineering and Computer Science (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-283039.

Abstract: Evolutionary algorithms have achieved great performance across a wide range of optimization problems. In this degree project, the network optimization problem is reformulated and solved in an evolutionary way. A feasible evolutionary framework has been designed and implemented to train neural networks in supervised learning scenarios. Under the structure of evolutionary algorithms, a well-defined fitness function is applied to evaluate network parameters, and a carefully derived form of approximate gradients is used for updating parameters. The performance of the framework has been tested by training…
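
A bare-bones evolution strategy conveys the gradient-free flavor of this approach; the quadratic stand-in for a network's training loss and the simple best-of-offspring selection below are illustrative assumptions, not the project's actual framework:

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # Stand-in for a network's training loss over its weight vector
    return np.sum((w - 3.0) ** 2)

w, sigma, n_offspring = np.zeros(5), 0.5, 20
for _ in range(200):
    # Sample offspring around the current weights and keep the fittest
    offspring = w + sigma * rng.normal(size=(n_offspring, w.size))
    w = min(offspring, key=loss)

print(np.round(w, 2))  # settles near the optimum [3. 3. 3. 3. 3.]
```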
6. Sam, Iat Tong. "Theory of backpropagation type learning of artificial neural networks and its applications." Thesis, University of Macau, 2001. http://umaclib3.umac.mo/record=b1446702.
7. Chen, Jianhua. "Neural Network Applications in Agricultural Economics." UKnowledge, 2005. http://uknowledge.uky.edu/gradschool_diss/228.

Abstract: Neural networks have become very important tools in many areas, including economic research. The objectives of this thesis are to examine the fundamental components, concepts, and theory of neural network methods from an econometric and statistical perspective, with particular focus on econometrically and statistically relevant models. In order to evaluate the relative effectiveness of econometric and neural network methods, two empirical studies are conducted by applying neural network methods in a methodological comparison with traditional econometric models. Both neural networks and econometric…
8. Batbayar, Batsukh. "Improving Time Efficiency of Feedforward Neural Network Learning." RMIT University, Electrical and Computer Engineering, 2009. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20090303.114706.

Abstract: Feedforward neural networks have been widely studied and used in many applications in science and engineering. The training of this type of network is mainly undertaken using the well-known backpropagation-based learning algorithms. One major problem with this type of algorithm is the slow training convergence speed, which hinders their applications. In order to improve the training convergence speed of this type of algorithm, many researchers have developed different improvements and enhancements. However, the slow convergence problem has not been fully addressed. This thesis makes several…
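
One classic enhancement in this line of work is momentum, which damps oscillation and accelerates descent along persistent gradient directions. A minimal sketch of the generic technique (not necessarily the algorithms developed in this thesis):

```python
import numpy as np

def momentum_step(w, v, grad, lr=0.01, beta=0.9):
    # Classical momentum: accumulate a velocity from past gradients,
    # then move the weights along it.
    v = beta * v - lr * grad
    return w + v, v

w, v = np.array([5.0]), np.zeros(1)
for _ in range(100):
    grad = 2 * w  # gradient of the toy loss f(w) = w^2
    w, v = momentum_step(w, v, grad)
print(w)  # approaches the minimum at 0
```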
9. Vieira, Cristiano Ribeiro. "Forecasting financial markets with artificial neural networks." Master's thesis, Instituto Superior de Economia e Gestão, 2013. http://hdl.handle.net/10400.5/6340.

Abstract: [Master's in Financial Mathematics] Artificial Neural Networks are flexible nonlinear mathematical models widely used in forecasting. This work is intended to investigate the support these models can give to financial economists predicting price movements of oil and gas companies listed on stock exchanges. Multilayer Perceptron models with logistic activation functions achieved better results predicting the direction of stock returns than traditional linear regressions, and better performance for companies with lower market capitalization. Furthermore, multilayer perceptrons with eight hidden units…
10. Bastian, Michael R. "Neural Networks and the Natural Gradient." DigitalCommons@USU, 2010. https://digitalcommons.usu.edu/etd/539.

Abstract: Neural network training algorithms have always suffered from the problem of local minima. The advent of natural gradient algorithms promised to overcome this shortcoming by finding better local minima. However, they require additional training parameters and computational overhead. By using a new formulation for the natural gradient, an algorithm is described that uses less memory and processing time than previous algorithms with comparable performance.
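
For context, the generic natural gradient step that this line of work reformulates is shown below (standard notation; the thesis's memory- and time-saving formulation is not reproduced here):

```latex
% Natural gradient descent preconditions the gradient with the inverse
% Fisher information matrix F:
\theta_{t+1} = \theta_t - \eta\, F(\theta_t)^{-1} \nabla_\theta L(\theta_t),
\qquad
F(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[ \nabla_\theta \log p_\theta(x)\, \nabla_\theta \log p_\theta(x)^{\top} \right].
```

Forming and inverting F is the computational overhead the abstract refers to; practical natural gradient methods hinge on avoiding that explicit inversion.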

Books on the topic "Backpropagation Neural Networks"

1. Dhawan, Atam P., and United States National Aeronautics and Space Administration, eds. LVQ and Backpropagation Neural Networks Applied to NASA SSME Data. National Aeronautics and Space Administration, 1993.
2. Gaxiola, Fernando, Patricia Melin, and Fevrier Valdez. New Backpropagation Algorithm with Type-2 Fuzzy Weights for Neural Networks. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-34087-6.
3. Sundararajan, N., and Shou King Foo, eds. Parallel Implementations of Backpropagation Neural Networks on Transputers: A Study of Training Set Parallelism. World Scientific, 1996.
4. Billings, S. A. A Comparison of the Backpropagation and Recursive Prediction Error Algorithms for Training Neural Networks. University of Sheffield, Dept. of Control Engineering, 1990.
5. Menke, Kurt William. Nonlinear Adaptive Control Using Backpropagating Neural Networks. Naval Postgraduate School, 1992.
6. Wellington, Charles H. Backpropagation Neural Network for Noise Cancellation Applied to the NUWES Test Ranges. Naval Postgraduate School, 1991.
7. Adaptive Learning of Polynomial Networks: Genetic Programming, Backpropagation and Bayesian Methods. Springer, 2011.
8. Werbos, Paul J. The Roots of Backpropagation: From Ordered Derivatives to Neural Networks and Political Forecasting. Wiley, 1994.

Book chapters on the topic "Backpropagation Neural Networks"

1. Rojas, Raúl. "The Backpropagation Algorithm." In Neural Networks. Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61068-4_7.
2. Silva, Fernando M., and Luís B. Almeida. "Acceleration techniques for the backpropagation algorithm." In Neural Networks. Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/3-540-52255-7_32.
3. Aggarwal, Charu C. "The Backpropagation Algorithm." In Neural Networks and Deep Learning. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-29642-0_2.
4. Nandy, Abhishek, and Manisha Biswas. "Backpropagation in Unity C#." In Neural Networks in Unity. Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3673-4_4.
5. Buscema, Massimo. "Supervised Artificial Neural Networks: Backpropagation Neural Networks." In Intelligent Data Mining in Law Enforcement Analytics. Springer Netherlands, 2012. http://dx.doi.org/10.1007/978-94-007-4914-6_7.
6. Ye, Jong Chul. "Artificial Neural Networks and Backpropagation." In Geometry of Deep Learning. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-6046-7_6.
7. Bhuyan, Bikram Pratim, Amar Ramdane-Cherif, Thipendra P. Singh, and Ravi Tomar. "Feedforward Neural Networks and Backpropagation." In Studies in Computational Intelligence. Springer Nature Singapore, 2024. https://doi.org/10.1007/978-981-97-8171-3_7.
8. MacKay, David J. C. "Bayesian Methods for Backpropagation Networks." In Models of Neural Networks III. Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-0723-8_6.
9. Du, Ke-Lin, and M. N. S. Swamy. "Multilayer Perceptrons: Architecture and Error Backpropagation." In Neural Networks and Statistical Learning. Springer London, 2013. http://dx.doi.org/10.1007/978-1-4471-5571-3_4.
10. Du, Ke-Lin, and M. N. S. Swamy. "Multilayer Perceptrons: Architecture and Error Backpropagation." In Neural Networks and Statistical Learning. Springer London, 2019. http://dx.doi.org/10.1007/978-1-4471-7452-3_5.

Conference papers on the topic "Backpropagation Neural Networks"

1. Zheng, Ziyang, Zhengyang Duan, Hang Chen, et al. "Training Photonic Neural Networks with Dual Backpropagation." In CLEO: Science and Innovations. Optica Publishing Group, 2024. http://dx.doi.org/10.1364/cleo_si.2024.sm3m.4.

Abstract: We report dual backpropagation training for end-to-end optimizing photonic neural networks (PNNs). We demonstrate the effectiveness of the proposed method by using diffractive and interference-based PNNs on image classification tasks under significant systematic errors.
2. Birjais, Roshan, Kevin I.-Kai Wang, and Waleed Abdulla. "Training Deep Neural Networks with HSIC and Backpropagation." In 2024 Asia Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC). IEEE, 2024. https://doi.org/10.1109/apsipaasc63619.2025.10848746.
3. Brodsky, Stephen A., and Clark C. Guest. "Optical Matrix-Vector Implementation of Binary Valued Backpropagation." In Optical Computing. Optica Publishing Group, 1991. http://dx.doi.org/10.1364/optcomp.1991.me8.

Abstract: Optical implementations of neural networks can combine the advantages of neural network adaptive parallel processing and optical free-space connectivity. Binary valued Backpropagation [1], a supervised learning algorithm related to standard Backpropagation [2], significantly reduces interconnection storage and computation requirements. This implementation of binary valued Backpropagation used optical matrix-vector multiplication [3] to represent the forward information flow between network layers. Previous analog optical network memory systems have been described [4].
4. Muniz, L. F., C. N. Lintzmayer, C. Jutten, and D. G. Fantinato. "Neuroevolutive Strategies for Topology and Weights Adaptation of Artificial Neural Networks." In Symposium on Knowledge Discovery, Mining and Learning. Sociedade Brasileira de Computação - SBC, 2022. http://dx.doi.org/10.5753/kdmile.2022.227807.

Abstract: Among the methods for training Multilayer Perceptron networks, backpropagation is one of the most widely used for supervised learning problems. However, it presents some limitations, such as local convergence and the a priori choice of the network topology. Another possible approach to training is to use Genetic Algorithms to optimize the weights and topology of networks, which is known as neuroevolution. In this work, we compare the efficiency of training and defining topology with a modified neuroevolution approach using two different metaheuristics with backpropagation on 5 classification…
5. Stork, David G. "Is backpropagation biologically plausible?" In International Joint Conference on Neural Networks. IEEE, 1989. http://dx.doi.org/10.1109/ijcnn.1989.118705.
6. Salameh, Walid A., and Mohammed A. Otair. "Online Handwritten Character Recognition Using an Optical Backpropagation Neural Networks." In InSITE 2005: Informing Science + IT Education Conference. Informing Science Institute, 2005. http://dx.doi.org/10.28945/2932.

Abstract: There are many successful applications of Backpropagation (BP) for training multilayer neural networks, but they have many shortcomings: learning often takes an insupportably long time to converge, and training may become trapped in local minima. One possible remedy for escaping local minima is to use a very small learning rate, but this slows the learning process. The proposed algorithm is presented for the training of multilayer neural networks with a very small learning rate, especially when using a large training set size. It can be applied in a generic manner for any network size that uses a…
7. Fausett, D. W. "Strictly local backpropagation." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137834.
8. Cheng, L. M., H. L. Mak, and L. L. Cheng. "Structured backpropagation network." In 1991 IEEE International Joint Conference on Neural Networks. IEEE, 1991. http://dx.doi.org/10.1109/ijcnn.1991.170640.
9. Godfrey, K. R. L., and Y. Attikiouzel. "Polar backpropagation." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137967.
10. Syed, Tehreem, Vijay Kakani, Xuenan Cui, and Hakil Kim. "Spiking Neural Networks Using Backpropagation." In 2021 IEEE Region 10 Symposium (TENSYMP). IEEE, 2021. http://dx.doi.org/10.1109/tensymp52854.2021.9550994.

Reports on the topic "Backpropagation Neural Networks"

1. Gage, Harmon J. Using Upper Layer Weights to Efficiently Construct and Train Feedforward Neural Networks Executing Backpropagation. Defense Technical Information Center, 2011. http://dx.doi.org/10.21236/ada545618.
2. Wang, Felix, Nick Alonso, and Corinne Teeter. Combining Spike Time Dependent Plasticity (STDP) and Backpropagation (BP) for Robust and Data Efficient Spiking Neural Networks (SNN). Office of Scientific and Technical Information (OSTI), 2022. http://dx.doi.org/10.2172/1902866.
3. Mu, Ruihui, and Xiaoqin Zeng. Improved Webpage Classification Technology Based on Feedforward Backpropagation Neural Network. "Prof. Marin Drinov" Publishing House of Bulgarian Academy of Sciences, 2018. http://dx.doi.org/10.7546/crabs.2018.09.11.
4. Kerr, John Patrick. The Parallel Implementation of a Backpropagation Neural Network and Its Applicability to SPECT Image Reconstruction. Office of Scientific and Technical Information (OSTI), 1992. http://dx.doi.org/10.2172/10138858.
5. Kerr, J. P. The Parallel Implementation of a Backpropagation Neural Network and Its Applicability to SPECT Image Reconstruction. Office of Scientific and Technical Information (OSTI), 1992. http://dx.doi.org/10.2172/6879460.