
Journal articles on the topic 'Conjugate Gradient Algorithm'


Consult the top 50 journal articles for your research on the topic 'Conjugate Gradient Algorithm.'


1

Guo, Jie, and Zhong Wan. "A new three-term conjugate gradient algorithm with modified gradient-differences for solving unconstrained optimization problems." AIMS Mathematics 8, no. 2 (2022): 2473–88. http://dx.doi.org/10.3934/math.2023128.

Abstract:
Unconstrained optimization problems often arise from the mining of big data and from scientific computing. On the basis of a modified gradient difference, this article presents a new three-term conjugate gradient algorithm to efficiently solve unconstrained optimization problems. Compared with existing nonlinear conjugate gradient algorithms, the search directions in this algorithm are always sufficiently descent, independently of any line search, and also possess the conjugacy property. Using the standard Wolfe line search, global and local convergence of the proposed algorithm is proved under mild assumptions. When the developed algorithm is implemented to solve 750 benchmark test problems available in the literature, its numerical performance is shown to be remarkable, especially in comparison with that of other similar efficient algorithms.
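
For context, the three-term family referred to above uses a search direction of the generic form

    d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k,   y_k = g_{k+1} - g_k,   d_0 = -g_0,

with iterates x_{k+1} = x_k + \alpha_k d_k, where the step size \alpha_k comes from the Wolfe line search. This is only the standard template; the specific \beta_k and \theta_k of Guo and Wan's method depend on their modified gradient-difference and are defined in the paper.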
2

Qasim, Aseel M., Zinah F. Salih, and Basim A. Hassan. "A new conjugate gradient algorithms using conjugacy condition for solving unconstrained optimization." Indonesian Journal of Electrical Engineering and Computer Science 24, no. 3 (December 1, 2021): 1647. http://dx.doi.org/10.11591/ijeecs.v24.i3.pp1647-1653.

Abstract:
The primary objective of this paper, which lies in the field of conjugate gradient algorithms for unconstrained optimization, is to show the advantage of the newly proposed algorithm in comparison with the standard Hestenes-Stiefel method. Since the conjugate gradient update parameter is crucial, we propose a simple modification of it, which is used to derive the new formula for the update parameter described in this paper. Our modification is based on the conjugacy condition for nonlinear conjugate gradient methods, with a nonnegative parameter added to obtain the new extension of the method. Under mild Wolfe conditions, the global convergence theorem and lemmas are also stated and proved. The proposed method was implemented, and its efficiency is demonstrated by numerical examples, which were very encouraging.
3

Wang, Zhan Jun, and Liu Li. "Implementation of Modified Conjugate Gradient Algorithm in Electromagnetic Tomography Lab System." Advanced Materials Research 655-657 (January 2013): 693–96. http://dx.doi.org/10.4028/www.scientific.net/amr.655-657.693.

Abstract:
The advantages of electromagnetic tomography are introduced briefly. Based on the conjugate gradient algorithm, a modified conjugate gradient algorithm for electromagnetic tomography (EMT) is proposed, which efficiently improves the quality of the reconstructed image and the convergence speed. The modified conjugate gradient method for reconstructing images is verified on the lab electromagnetic tomography system. By evaluating image error and correlation, the regularization, Landweber, conjugate gradient, and modified conjugate gradient algorithms are compared. It can be concluded that, for different flow models, the image error and correlation obtained with the modified conjugate gradient algorithm are superior to those of the other algorithms in the lab EMT system.
4

Ocłoń, Paweł, Stanisław Łopata, and Marzena Nowak. "Comparative study of conjugate gradient algorithms performance on the example of steady-state axisymmetric heat transfer problem." Archives of Thermodynamics 34, no. 3 (September 1, 2013): 15–44. http://dx.doi.org/10.2478/aoter-2013-0013.

Abstract:
The finite element method (FEM) is one of the most frequently used numerical methods for finding the approximate discrete-point solution of partial differential equations (PDE). In this method, linear or nonlinear systems of equations, assembled during numerical discretization, are solved to obtain the numerical solution of the PDE. Conjugate gradient algorithms are efficient iterative solvers for large sparse linear systems. In this paper the performance of different conjugate gradient algorithms: the conjugate gradient algorithm (CG), the biconjugate gradient algorithm (BICG), the biconjugate gradient stabilized algorithm (BICGSTAB), the conjugate gradient squared algorithm (CGS), and the biconjugate gradient stabilized algorithm with l GMRES restarts (BICGSTAB(l)), is compared when solving the steady-state axisymmetric heat conduction problem. Different values of the parameter l are studied. The engineering problem for which this comparison is made is two-dimensional, axisymmetric heat conduction in a finned circular tube.
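
As a reference point for the Krylov solvers compared above, the following is a minimal sketch of the classical conjugate gradient iteration for a symmetric positive-definite system Ax = b, in plain NumPy. The matrix, tolerance, and stopping rule are illustrative assumptions, not the paper's setup.

    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-8, max_iter=1000):
        """Solve A x = b for symmetric positive-definite A by plain CG."""
        x = np.zeros_like(b) if x0 is None else x0.copy()
        r = b - A @ x          # residual
        d = r.copy()           # first direction is steepest descent
        rs_old = r @ r
        for _ in range(max_iter):
            Ad = A @ d
            alpha = rs_old / (d @ Ad)      # exact minimizer along d for quadratics
            x += alpha * d
            r -= alpha * Ad
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            d = r + (rs_new / rs_old) * d  # conjugate direction update
            rs_old = rs_new
        return x

    # Example: a small SPD system
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b))  # approx [0.0909, 0.6364]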
5

Sellami, Badreddine, and Mohamed Chiheb Eddine Sellami. "Global convergence of a modified Fletcher–Reeves conjugate gradient method with Wolfe line search." Asian-European Journal of Mathematics 13, no. 04 (April 4, 2019): 2050081. http://dx.doi.org/10.1142/s1793557120500813.

Abstract:
In this paper, we are concerned with conjugate gradient methods for solving unconstrained optimization problems. We propose a modified Fletcher–Reeves (abbreviated FR) [Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154] conjugate gradient algorithm satisfying a parametrized sufficient descent condition with a parameter [Formula: see text]. The parameter [Formula: see text] is computed by means of the conjugacy condition; thus an algorithm is obtained which is a positive multiplicative modification of the Hestenes and Stiefel (abbreviated HS) [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 48 (1952) 409–436] algorithm, and which produces a descent search direction at every iteration at which the line search satisfies the Wolfe conditions. Under appropriate conditions, we show that the modified FR method with the strong Wolfe line search is globally convergent for uniformly convex functions. We also present extensive preliminary numerical experiments to show the efficiency of the proposed method.
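
For reference, the two classical update parameters this abstract contrasts are, in their standard forms,

    \beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2},   \beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k},   y_k = g_{k+1} - g_k,

used in the direction update d_{k+1} = -g_{k+1} + \beta_k d_k. The Wolfe conditions on the step size \alpha_k mentioned above are

    f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^T d_k,   g(x_k + \alpha_k d_k)^T d_k \ge c_2 g_k^T d_k,   0 < c_1 < c_2 < 1.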
6

Hasibuan, Eka Hayana, Surya Hendraputra, GS Achmad Daengs, and Liharman Saragih. "Comparison Fletcher-Reeves and Polak-Ribiere ANN Algorithm for Forecasting Analysis." Journal of Physics: Conference Series 2394, no. 1 (December 1, 2022): 012008. http://dx.doi.org/10.1088/1742-6596/2394/1/012008.

Abstract:
Each ANN method and algorithm performs differently depending on the algorithm used and the parameters given. The purpose of this research is to identify the better of two algorithms on the basis of performance, that is, the smallest MSE value, so that it can serve as a reference for solving forecasting problems. The ANN algorithms compared were Conjugate Gradient Fletcher-Reeves and Conjugate Gradient Polak-Ribiere. The conjugate gradient algorithm can solve unconstrained optimization problems and is much more efficient than gradient-descent-based algorithms because of its faster turnaround time and fewer iterations. The research data used for the forecasting analysis of the two algorithms are data on the number of rural poor people in Sumatra, Indonesia, analysed with 6-10-1, 6-15-1, and 6-20-1 architectures. The results showed that the Polak-Ribiere Conjugate Gradient algorithm with the 6-10-1 architecture has the best performance and the smallest MSE value compared with the Fletcher-Reeves algorithm and the two other architectures. It can therefore be concluded that the 6-10-1 architecture with the Conjugate Gradient Polak-Ribiere algorithm can be used to solve forecasting problems, because the training time to achieve convergence is not too long and the resulting performance is quite good.
7

Ahmed, Huda I., Eman T. Hamed, and Hamsa Th Saeed Chilmeran. "A Modified Bat Algorithm with Conjugate Gradient Method for Global Optimization." International Journal of Mathematics and Mathematical Sciences 2020 (June 4, 2020): 1–14. http://dx.doi.org/10.1155/2020/4795793.

Abstract:
Metaheuristic algorithms are used to solve many optimization problems. The firefly algorithm, particle swarm optimization, harmony search, and the bat algorithm are used as search algorithms to find the optimal solution in the problem field. In this paper, we have investigated and analyzed a new scaled conjugate gradient algorithm and its implementation, based on the exact Wolfe line search conditions and the Powell restart criterion. The new spectral conjugate gradient algorithm is a modification of the Birgin and Martínez method, a way to overcome the lack of positive definiteness of the matrix defining the search direction. Preliminary computational results for a set of 30 unconstrained optimization test problems show that this new spectral conjugate gradient outperforms a standard conjugate gradient in this field. We have applied the newly proposed spectral conjugate gradient algorithm within the bat algorithm to reach the lowest possible goal of the bat algorithm. The newly proposed approach, namely the directional bat algorithm (CG-BAT), has been tested against five other algorithms using several standard and nonstandard benchmarks from the CEC'2005 benchmark suite, with nonparametric statistical tests whose results show the superiority of the new algorithm. We have also adopted the performance profiles of Dolan and Moré, which likewise show the superiority of the new algorithm (CG-BAT).
8

Ahmed, Alaa Saad, Hisham M. Khudhur, and Mohammed S. Najmuldeen. "A new parameter in three-term conjugate gradient algorithms for unconstrained optimization." Indonesian Journal of Electrical Engineering and Computer Science 23, no. 1 (July 1, 2021): 338. http://dx.doi.org/10.11591/ijeecs.v23.i1.pp338-344.

Abstract:
In this study, we develop a different parameter of three-term conjugate gradient type. This scheme depends principally on the pure conjugacy condition (PCC), an important condition in unconstrained nonlinear optimization in general and in conjugate gradient methods in particular. The proposed method is shown to converge and to satisfy the descent property under some hypotheses. The numerical results display the effectiveness of the new method for solving unconstrained nonlinear optimization test problems compared with other conjugate gradient algorithms, such as the Fletcher-Reeves (FR) algorithm and the three-term Fletcher-Reeves (TTFR) algorithm, as shown in Table 1 (number of iterations and function evaluations) and in Figures 1, 2, and 3 (comparisons of the number of iterations, the number of function evaluations, and the time taken).
9

Anwer Mustafa, Ahmed, and Salah Gazi Shareef. "Global convergence of new three terms conjugate gradient for unconstrained optimization." General Letters in Mathematics 11, no. 1 (September 2021): 1–9. http://dx.doi.org/10.31559/glm2021.11.1.1.

Abstract:
In this paper, a new formula for 𝛽𝑘 is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on three terms and a cubic step size. Our new proposed CG method satisfies the descent condition, the sufficient descent condition, the conjugacy condition, and global convergence properties. Numerical comparisons with two standard conjugate gradient algorithms show that this algorithm is very effective with respect to the number of iterations and the number of function evaluations.
10

Bridson, Robert, and Chen Greif. "A Multipreconditioned Conjugate Gradient Algorithm." SIAM Journal on Matrix Analysis and Applications 27, no. 4 (January 2006): 1056–68. http://dx.doi.org/10.1137/040620047.

11

Sanmatías, S., and E. Vercher. "A Generalized Conjugate Gradient Algorithm." Journal of Optimization Theory and Applications 98, no. 2 (August 1998): 489–502. http://dx.doi.org/10.1023/a:1022653904717.

12

Al-Assady, N. H. "New Imperfect Conjugate Gradient Algorithm." Journal of Optimization Theory and Applications 94, no. 3 (September 1997): 747–55. http://dx.doi.org/10.1023/a:1022665403886.

13

Apolinario, J. A., M. L. R. De Campos, and C. P. Bernal O. "The constrained conjugate gradient algorithm." IEEE Signal Processing Letters 7, no. 12 (December 2000): 351–54. http://dx.doi.org/10.1109/97.883366.

14

Sahari, Mohamed Lamine, and Ilhem Djellit. "Conjugate gradient algorithm and fractals." Discrete Dynamics in Nature and Society 2006 (2006): 1–15. http://dx.doi.org/10.1155/ddns/2006/52682.

Abstract:
This work is an extension of the survey on Cayley's problem to the case where the conjugate gradient method is used. We show that for certain values of the parameters, this method produces beautiful fractal structures.
15

Djordjevic, Snezana. "New hybrid conjugate gradient method as a convex combination of FR and PRP methods." Filomat 30, no. 11 (2016): 3083–100. http://dx.doi.org/10.2298/fil1611083d.

Abstract:
We consider a new hybrid conjugate gradient algorithm, which is obtained from the algorithm of Fletcher-Reeves and the algorithm of Polak-Ribière-Polyak. Numerical comparisons show that the present hybrid conjugate gradient algorithm often behaves better than some known algorithms.
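
A common form for such a hybrid, shown here only as the generic template (the paper's own combination parameter is defined in the article), is the convex combination

    \beta_k = (1 - \theta_k) \beta_k^{FR} + \theta_k \beta_k^{PRP},   \theta_k \in [0, 1],

where \beta_k^{FR} = \|g_{k+1}\|^2 / \|g_k\|^2 and \beta_k^{PRP} = g_{k+1}^T (g_{k+1} - g_k) / \|g_k\|^2.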
16

Shang, Shang, Jing Bai, Xiaolei Song, Hongkai Wang, and Jaclyn Lau. "A Penalized Linear and Nonlinear Combined Conjugate Gradient Method for the Reconstruction of Fluorescence Molecular Tomography." International Journal of Biomedical Imaging 2007 (2007): 1–19. http://dx.doi.org/10.1155/2007/84724.

Abstract:
The conjugate gradient method is known to be efficient for nonlinear optimization problems with large-dimensional data. In this paper, a penalized linear and nonlinear combined conjugate gradient method for the reconstruction of fluorescence molecular tomography (FMT) is presented. The algorithm combines the linear conjugate gradient method and the nonlinear conjugate gradient method based on a restart strategy, in order to take advantage of both kinds of conjugate gradient methods and compensate for their disadvantages. A quadratic penalty method is adopted to impose a nonnegativity constraint and reduce the ill-posedness of the problem. Simulation studies show that the presented algorithm is accurate, stable, and fast. It has better performance than conventional conjugate-gradient-based reconstruction algorithms. It offers an effective approach to reconstructing fluorochrome information for FMT.
17

Singarimbun, Roy Nuary, Ondra Eka Putra, N. L. W. S. R. Ginantra, and Mariana Puspa Dewi. "Backpropagation Artificial Neural Network Enhancement using Beale-Powell Approach Technique." Journal of Physics: Conference Series 2394, no. 1 (December 1, 2022): 012007. http://dx.doi.org/10.1088/1742-6596/2394/1/012007.

Abstract:
Machine learning algorithms can study existing data to perform specific tasks. One of the well-known machine learning algorithms is the backpropagation algorithm, but this algorithm often exhibits poor convergence speed in the training process and a long training time. The purpose of this study is to optimize the standard backpropagation algorithm using the Beale-Powell conjugate gradient algorithm so that the training time needed to achieve convergence is not too long, which can later serve as a reference and information for solving prediction problems. The Beale-Powell conjugate gradient algorithm can solve unconstrained optimization problems and is much more efficient than gradient-descent-based algorithms such as standard backpropagation. The research data used for the analysis were formal-education participation data in Indonesia, trained and tested using the 7-10-1 architecture. The results showed that the Beale-Powell conjugate gradient algorithm could perform the training and convergence process more quickly. However, the MSE value of testing and performance was still superior for the backpropagation algorithm. It can therefore be concluded that, for the prediction of formal education participation in Indonesia, the Beale-Powell conjugate gradient algorithm is good enough to optimize the performance of standard backpropagation as seen from the convergence speed and training performance.
18

Sun, Zhongbo, Yantao Tian, and Hongyang Li. "Two Modified Three-Term Type Conjugate Gradient Methods and Their Global Convergence for Unconstrained Optimization." Mathematical Problems in Engineering 2014 (2014): 1–9. http://dx.doi.org/10.1155/2014/394096.

Abstract:
Two modified three-term type conjugate gradient algorithms which satisfy both the descent condition and the Dai-Liao type conjugacy condition are presented for unconstrained optimization. The first algorithm is a modification of the Hager and Zhang type algorithm in such a way that the search direction is descent and satisfies Dai-Liao's type conjugacy condition. The second, simple three-term type conjugate gradient method can generate sufficient descent directions at every iteration; moreover, this property is independent of the step-length line search. The algorithms can also be considered as modifications of the MBFGS method, but with different z_k. Under some mild conditions, the given methods are globally convergent for general functions, independently of the Wolfe line search. The numerical experiments show that the proposed methods are very robust and efficient.
19

Hafaidia, I., H. Guebbai, M. Al-Baali, and M. Ghiat. "A new hybrid conjugate gradient algorithm for unconstrained optimization." Vestnik Udmurtskogo Universiteta. Matematika. Mekhanika. Komp'yuternye Nauki 33, no. 2 (June 2023): 348–64. http://dx.doi.org/10.35634/vm230211.

Abstract:
It is well known that conjugate gradient methods are useful for solving large-scale unconstrained nonlinear optimization problems. In this paper, we consider combining the best features of two conjugate gradient methods. In particular, we give a new conjugate gradient method, based on the hybridization of the useful DY (Dai-Yuan), and HZ (Hager-Zhang) methods. The hybrid parameters are chosen such that the proposed method satisfies the conjugacy and sufficient descent conditions. It is shown that the new method maintains the global convergence property of the above two methods. The numerical results are described for a set of standard test problems. It is shown that the performance of the proposed method is better than that of the DY and HZ methods in most cases.
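
For context, the two update parameters being hybridized have the standard forms

    \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k},   \beta_k^{HZ} = \frac{1}{d_k^T y_k} \left( y_k - 2 d_k \frac{\|y_k\|^2}{d_k^T y_k} \right)^T g_{k+1},   y_k = g_{k+1} - g_k;

the hybrid weights combining them are the paper's own contribution.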
20

Ding, Lei, Yong Jun Luo, Yang Yang Wang, Zheng Li, and Bing Yin Yao. "Improved Method of Hybrid Genetic Algorithm." Applied Mechanics and Materials 556-562 (May 2014): 4014–17. http://dx.doi.org/10.4028/www.scientific.net/amm.556-562.4014.

Abstract:
On account of the slow late-stage convergence of the traditional genetic algorithm, a hybrid genetic algorithm based on the conjugate gradient method and the genetic algorithm is proposed. This hybrid algorithm takes advantage of the conjugate gradient method's determinism, but also uses the genetic algorithm to avoid falling into local optima, so it can quickly converge to the exact global optimal solution. Testing with two test functions shows that the performance of this hybrid genetic algorithm is better than either the conjugate gradient method or the genetic algorithm alone, and good results have been achieved.
21

Taher, Mardeen Sh, and Salah G. Shareef. "A Combined Conjugate Gradient Quasi-Newton Method with Modification BFGS Formula." International Journal of Analysis and Applications 21 (April 3, 2023): 31. http://dx.doi.org/10.28924/2291-8639-21-2023-31.

Abstract:
The conjugate gradient and quasi-Newton methods each have advantages and drawbacks: although quasi-Newton algorithms converge more rapidly than conjugate gradient algorithms, they require more storage. In 1976, Buckley designed a method that combines the CG method with QN updates, which performs better than conjugate gradient algorithms but not as well as the quasi-Newton approach. This type of method is called the preconditioned conjugate gradient (PCG) method. In this paper, we introduce two new preconditioned conjugate gradient (PCG) methods that combine conjugate gradient with a new quasi-Newton update. The new quasi-Newton method preserves positive definiteness, and the direction of the new preconditioned conjugate gradient is a descent direction. Numerical results show that the new preconditioned conjugate gradient method is more effective on several high-dimensional test problems than standard preconditioning.
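
To make the preconditioning idea concrete, the following is a minimal sketch of linear preconditioned CG with a Jacobi (diagonal) preconditioner in plain NumPy. It illustrates the general PCG template only, not the quasi-Newton preconditioner proposed in the paper.

    import numpy as np

    def pcg(A, b, M_inv, tol=1e-8, max_iter=1000):
        """Preconditioned CG for SPD A; M_inv applies the preconditioner inverse."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv(r)           # preconditioned residual
        d = z.copy()
        rz_old = r @ z
        for _ in range(max_iter):
            Ad = A @ d
            alpha = rz_old / (d @ Ad)
            x += alpha * d
            r -= alpha * Ad
            if np.linalg.norm(r) < tol:
                break
            z = M_inv(r)
            rz_new = r @ z
            d = z + (rz_new / rz_old) * d
            rz_old = rz_new
        return x

    # Jacobi preconditioner: M = diag(A), so M_inv divides by the diagonal.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(pcg(A, b, M_inv=lambda r: r / np.diag(A)))  # approx [0.0909, 0.6364]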
22

Cao, Junyue, Jinzhao Wu, and Wenjie Liu. "A Descent Conjugate Gradient Algorithm for Optimization Problems and Its Applications in Image Restoration and Compression Sensing." Mathematical Problems in Engineering 2020 (September 29, 2020): 1–9. http://dx.doi.org/10.1155/2020/6157294.

Abstract:
It is well known that the nonlinear conjugate gradient algorithm is one of the effective algorithms for optimization problems, since it has low storage requirements and a simple structure. This motivates us to design a modified conjugate gradient formula for the optimization model. The proposed conjugate gradient algorithm possesses several properties: (1) the search direction uses not only the gradient value but also the function value; (2) the presented direction has both the sufficient descent property and the trust-region feature; (3) the proposed algorithm is globally convergent for nonconvex functions; (4) experiments on image restoration problems and compressed sensing demonstrate the performance of the new algorithm.
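
The two direction properties cited in item (2) are commonly formalized as

    g_k^T d_k \le -c \|g_k\|^2  (sufficient descent),   c_1 \|g_k\| \le \|d_k\| \le c_2 \|g_k\|  (trust-region feature),

for positive constants c, c_1, c_2; the exact constants used in the paper may differ.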
23

Yang, Xiangfei, Zhijun Luo, and Xiaoyu Dai. "A Global Convergence of LS-CD Hybrid Conjugate Gradient Method." Advances in Numerical Analysis 2013 (October 22, 2013): 1–5. http://dx.doi.org/10.1155/2013/517452.

Abstract:
The conjugate gradient method is one of the most effective algorithms for solving unconstrained optimization problems. In this paper, a modified conjugate gradient method is presented and analyzed which is a hybridization of the known LS and CD conjugate gradient algorithms. Under some mild conditions, a Wolfe-type line search can guarantee the global convergence of the LS-CD method. The numerical results show that the algorithm is efficient.
24

Chen, Yu, Jun Cao, and De Yun Chen. "A Novel Preconditioned Conjugate Gradient Image Reconstruction Algorithm for Electrical Capacitance Tomography System." Advanced Materials Research 181-182 (January 2011): 629–35. http://dx.doi.org/10.4028/www.scientific.net/amr.181-182.629.

Abstract:
To address the 'soft-field' nature and the ill-posed problem in electrical capacitance tomography (ECT) technology, a novel preconditioned conjugate gradient image reconstruction algorithm for ECT is presented. Following an analysis of the basic principles of electrical capacitance tomography, the paper gives the preconditioned conjugate gradient method and the calculation formula for the iteration steps in ECT, and explores the feasibility of applying the algorithm, the conditions under which it converges, and the reconstruction image error. Experimental results and simulation data indicate that, compared with the LBP and conjugate gradient algorithms on simple flow patterns, the algorithm provides high-quality images and favorable stability, and this new algorithm presents a feasible and effective way to research image reconstruction algorithms for electrical capacitance tomography systems.
25

Wang, Zhan, Pengyuan Li, Xiangrong Li, and Hongtruong Pham. "A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems." Mathematical Problems in Engineering 2020 (September 4, 2020): 1–14. http://dx.doi.org/10.1155/2020/4381515.

Abstract:
Conjugate gradient methods are well-known methods which are widely applied in many practical fields. The CD conjugate gradient method is one of the classical types. In this paper, a modified three-term type CD conjugate gradient algorithm is proposed, with the following good features: (i) a modified three-term type CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust-region property; (iii) the algorithm is globally convergent for general functions with the modified weak Wolfe–Powell (MWWP) line search technique and a projection technique. The new algorithm performs well in numerical experiments, showing that the modified three-term type CD conjugate gradient method is more competitive than the classical CD conjugate gradient method.
26

Mohammed, Hind H. "Generalized Dai-Yuan conjugate gradient algorithm for training multi-layer feed-forward neural networks." Tikrit Journal of Pure Science 24, no. 1 (March 18, 2019): 115–20. http://dx.doi.org/10.25130/tjps.v24i1.341.

Abstract:
In this paper, we present a different type of CG algorithm depending on the Perry conjugacy condition. The new conjugate gradient training (GDY) algorithm is used to train multi-layer feed-forward neural networks (MFNNs); we prove its descent property and global convergence, and then test the behavior of this algorithm in the training of artificial neural networks, comparing it with known algorithms in this field on two types of problems.
27

Hussein, Shaher Qahtan, Ghassan Ezzulddin Arif, and Yoksal Abdll Sattar. "A New Conjugate Gradient Algorithm Based on the (Dai-Liao) Conjugate Gradient Method." Tikrit Journal of Pure Science 25, no. 1 (February 2, 2020): 128. http://dx.doi.org/10.25130/j.v25i1.945.

Abstract:
In this paper we derive a new search direction for the conjugate gradient method associated with the Dai-Liao method. The new algorithm converges under certain hypotheses. We also prove the descent property for the new method, and numerical results show that the proposed method is effective compared with the FR, HS, and DY methods.
28

Hussein, Shaher Qahtan, Ghassan Ezzulddin Arif, and Yoksal Abdll Sattar. "A New Conjugate Gradient Algorithm Based on the (Dai-Liao) Conjugate Gradient Method." Tikrit Journal of Pure Science 25, no. 1 (February 2, 2020): 128–33. http://dx.doi.org/10.25130/tjps.v25i1.222.

Abstract:
In this paper we derive a new search direction for the conjugate gradient method associated with the Dai-Liao method. The new algorithm converges under certain hypotheses. We also prove the descent property for the new method, and numerical results show that the proposed method is effective compared with the FR, HS, and DY methods.
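
For context, the Dai-Liao approach that both of the above entries build on replaces exact conjugacy with the condition

    d_{k+1}^T y_k = -t g_{k+1}^T s_k,   s_k = x_{k+1} - x_k,   t \ge 0,

which leads to the standard update parameter

    \beta_k^{DL} = \frac{g_{k+1}^T y_k}{d_k^T y_k} - t \frac{g_{k+1}^T s_k}{d_k^T y_k}.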
29

Arthur, C. K., V. A. Temeng, and Y. Y. Ziggah. "Performance Evaluation of Training Algorithms in Backpropagation Neural Network Approach to Blast-Induced Ground Vibration Prediction." Ghana Mining Journal 20, no. 1 (July 7, 2020): 20–33. http://dx.doi.org/10.4314/gm.v20i1.3.

Abstract:
Backpropagation Neural Network (BPNN) is an artificial intelligence technique that has seen several applications in many fields of science and engineering. It is well known that the critical task in developing an effective and accurate BPNN model depends on an appropriate training algorithm, transfer function, number of hidden layers, and number of hidden neurons. Despite the numerous contributing factors in the development of a BPNN model, the training algorithm is key to achieving optimum BPNN model performance. This study is focused on evaluating and comparing the performance of 13 training algorithms in BPNN for the prediction of blast-induced ground vibration. The training algorithms considered include: Levenberg-Marquardt, Bayesian Regularisation, Broyden–Fletcher–Goldfarb–Shanno (BFGS) Quasi-Newton, Resilient Backpropagation, Scaled Conjugate Gradient, Conjugate Gradient with Powell/Beale Restarts, Fletcher-Powell Conjugate Gradient, Polak-Ribière Conjugate Gradient, One Step Secant, Gradient Descent with Adaptive Learning Rate, Gradient Descent with Momentum, Gradient Descent, and Gradient Descent with Momentum and Adaptive Learning Rate. Using ranking values for the performance indicators of Mean Squared Error (MSE), correlation coefficient (R), number of training epochs (iterations), and duration of convergence, the performance of the various training algorithms used to build the BPNN models was evaluated. The overall ranking results showed that the BFGS Quasi-Newton algorithm outperformed the other training algorithms, even though the Levenberg-Marquardt algorithm was found to have the best computational speed and utilised the smallest number of epochs. Keywords: Artificial Intelligence, Blast-induced Ground Vibration, Backpropagation Training Algorithms
30

Cho, Kar Mun, Nur Haizum Abd Rahman, and Iszuanie Syafidza Che Ilias. "Performance of Levenberg-Marquardt Neural Network Algorithm in Air Quality Forecasting." Sains Malaysiana 51, no. 8 (August 31, 2021): 2645–54. http://dx.doi.org/10.17576/jsm-2022-5108-23.

Abstract:
The Levenberg-Marquardt algorithm and the conjugate gradient method are frequently used for optimization in the multi-layer perceptron (MLP). However, both algorithms have produced mixed conclusions for optimizing MLPs in time series forecasting. This study uses the autoregressive integrated moving average (ARIMA) and MLP with both the Levenberg-Marquardt algorithm and the conjugate gradient method. These methods were used to predict the Air Pollutant Index (API) in Malaysia's central region, which represents urban and residential areas. The performances were discussed and compared using the mean square error (MSE) and the mean absolute percentage error (MAPE). The results show that the MLP models outperformed the ARIMA models, with the MLP using the Levenberg-Marquardt algorithm outperforming the conjugate gradient method.
31

Lyn Dee, Goh, Norhisham Bakhary, Azlan Abdul Rahman, and Baderul Hisham Ahmad. "A Comparison of Artificial Neural Network Learning Algorithms for Vibration-Based Damage Detection." Advanced Materials Research 163-167 (December 2010): 2756–60. http://dx.doi.org/10.4028/www.scientific.net/amr.163-167.2756.

Abstract:
This paper investigates the performance of Artificial Neural Network (ANN) learning algorithms for vibration-based damage detection. The capabilities of six different learning algorithms in detecting damage are studied and their performances are compared. The algorithms are Levenberg-Marquardt (LM), Resilient Backpropagation (RP), Scaled Conjugate Gradient (SCG), Conjugate Gradient with Powell-Beale Restarts (CGB), Polak-Ribiere Conjugate Gradient (CGP) and Fletcher-Reeves Conjugate Gradient (CGF) algorithms. The performances of these algorithms are assessed based on their generalisation capability in relating the vibration parameters (frequencies and mode shapes) with damage locations and severities under various numbers of input and output variables. The results show that Levenberg-Marquardt algorithm provides the best generalisation performance.
32

Spillane, Nicole. "An Adaptive MultiPreconditioned Conjugate Gradient Algorithm." SIAM Journal on Scientific Computing 38, no. 3 (January 2016): A1896–A1918. http://dx.doi.org/10.1137/15m1028534.

33

Zhang, Jianhua, Hua Dai, and Jing Zhao. "Generalized global conjugate gradient squared algorithm." Applied Mathematics and Computation 216, no. 12 (August 2010): 3694–706. http://dx.doi.org/10.1016/j.amc.2010.05.026.

34

Cloutier, J. R., and R. F. Wilson. "Periodically preconditioned conjugate gradient-restoration algorithm." Journal of Optimization Theory and Applications 70, no. 1 (July 1991): 79–95. http://dx.doi.org/10.1007/bf00940505.

35

Alnowibet, Khalid Abdulaziz, Salem Mahdi, Ahmad M. Alshamrani, Karam M. Sallam, and Ali Wagdy Mohamed. "A Family of Hybrid Stochastic Conjugate Gradient Algorithms for Local and Global Minimization Problems." Mathematics 10, no. 19 (October 1, 2022): 3595. http://dx.doi.org/10.3390/math10193595.

Abstract:
This paper contains two main parts, Part I and Part II, which discuss local and global minimization problems, respectively. In Part I, a fresh conjugate gradient (CG) technique is suggested and then combined with a line-search technique to obtain a globally convergent algorithm. A finite-difference approximation approach is used to compute approximate values of the first derivative of the function f. The convergence analysis of the suggested method is established. Comparisons between the performance of the new CG method and that of four other CG methods demonstrate that the proposed CG method is promising and competitive for finding a local optimum point. In Part II, three formulas are designed by which a group of solutions is generated. This set of random formulas is hybridized with the globally convergent CG algorithm to obtain a hybrid stochastic conjugate gradient algorithm denoted by HSSZH. The HSSZH algorithm finds the approximate value of the global solution of a global optimization problem. Five combined stochastic conjugate gradient algorithms are constructed. Performance profiles are used to assess and compare the performance of the family of hybrid stochastic conjugate gradient algorithms. The comparison results between our proposed HSSZH algorithm and four other hybrid stochastic conjugate gradient techniques demonstrate that the suggested HSSZH method is competitive with, and in all cases superior to, the four algorithms in terms of efficiency, reliability and effectiveness in finding the approximate solution of the global optimization problem, which contains a non-convex function.
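
Since the method estimates first derivatives by finite differences, a standard choice it could rely on is the central-difference approximation (h > 0 a small step, e_i the i-th unit vector):

    \frac{\partial f}{\partial x_i}(x) \approx \frac{f(x + h e_i) - f(x - h e_i)}{2h}.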
36

Mohammed, Hind H. "Generalized Dai-Yuan conjugate gradient algorithm for training multi-layer feed-forward neural networks." Tikrit Journal of Pure Science 24, no. 1 (March 18, 2019): 115. http://dx.doi.org/10.25130/j.v24i1.789.

Abstract:
In this paper, we present a different type of CG algorithm depending on the Perry conjugacy condition. The new conjugate gradient training (GDY) algorithm is used to train multi-layer feed-forward neural networks (MFNNs); we prove its descent property and global convergence, and then test the behavior of this algorithm in the training of artificial neural networks, comparing it with known algorithms in this field on two types of problems.
37

Scales, John A. "Tomographic inversion via the conjugate gradient method." GEOPHYSICS 52, no. 2 (February 1987): 179–85. http://dx.doi.org/10.1190/1.1442293.

Abstract:
Tomographic inversion of seismic traveltime residuals is now an established and widely used technique for imaging the Earth’s interior. This inversion procedure results in large, but sparse, rectangular systems of linear algebraic equations; in practice there may be tens or even hundreds of thousands of simultaneous equations. This paper applies the classic conjugate gradient algorithm of Hestenes and Stiefel to the least‐squares solution of large, sparse systems of traveltime equations. The conjugate gradient method is fast, accurate, and easily adapted to take advantage of the sparsity of the matrix. The techniques necessary for manipulating sparse matrices are outlined in the Appendix. In addition, the results of the conjugate gradient algorithm are compared to results from two of the more widely used tomographic inversion algorithms.
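
As an illustration of the least-squares use of CG that Scales describes, the following is a minimal CGLS-style sketch: conjugate gradients applied to the normal equations A^T A x = A^T b using only products with A and A^T, as is appropriate for large sparse systems. The matrix and stopping rule are placeholders, not the paper's tomography setup.

    import numpy as np
    from scipy.sparse import random as sparse_random

    def cgls(A, b, tol=1e-10, max_iter=500):
        """Minimize ||A x - b||_2 by CG on the normal equations,
        without ever forming A^T A explicitly."""
        x = np.zeros(A.shape[1])
        r = b - A @ x              # residual in data space
        s = A.T @ r                # negative gradient of 0.5*||Ax - b||^2
        d = s.copy()
        gamma_old = s @ s
        for _ in range(max_iter):
            q = A @ d
            alpha = gamma_old / (q @ q)
            x += alpha * d
            r -= alpha * q
            s = A.T @ r
            gamma_new = s @ s
            if np.sqrt(gamma_new) < tol:
                break
            d = s + (gamma_new / gamma_old) * d
            gamma_old = gamma_new
        return x

    # Placeholder data: an overdetermined sparse system
    rng = np.random.default_rng(0)
    A = sparse_random(200, 50, density=0.05, format="csr", random_state=0)
    b = A @ rng.standard_normal(50)
    print(np.linalg.norm(A @ cgls(A, b) - b))  # near zero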
38

Iiduka, Hideaki, and Yu Kobayashi. "Training Deep Neural Networks Using Conjugate Gradient-like Methods." Electronics 9, no. 11 (November 2, 2020): 1809. http://dx.doi.org/10.3390/electronics9111809.

Abstract:
The goal of this article is to train deep neural networks that accelerate useful adaptive learning rate optimization algorithms such as AdaGrad, RMSProp, Adam, and AMSGrad. To reach this goal, we devise an iterative algorithm combining the existing adaptive learning rate optimization algorithms with conjugate gradient-like methods, which are useful for constrained optimization. Convergence analyses show that the proposed algorithm with a small constant learning rate approximates a stationary point of a nonconvex optimization problem in deep learning. Furthermore, it is shown that the proposed algorithm with diminishing learning rates converges to a stationary point of the nonconvex optimization problem. The convergence and performance of the algorithm are demonstrated through numerical comparisons with the existing adaptive learning rate optimization algorithms for image and text classification. The numerical results show that the proposed algorithm with a constant learning rate is superior for training neural networks.
39

Dam, Hai Huyen, and Sven Nordholm. "Accelerated Conjugate Gradient for Second-Order Blind Signal Separation." Acoustics 4, no. 4 (November 11, 2022): 948–57. http://dx.doi.org/10.3390/acoustics4040058.

Abstract:
This paper proposes a new adaptive algorithm for the second-order blind signal separation (BSS) problem with convolutive mixtures by utilising a combination of an accelerated gradient and a conjugate gradient method. For each iteration of the adaptive algorithm, the search point and the search direction are obtained based on the current and the previous iterations. The algorithm efficiently calculates the step size for the accelerated conjugate gradient algorithm in each iteration. Simulation results show that the proposed accelerated conjugate gradient algorithm with optimal step size converges faster than the accelerated descent algorithm and the steepest descent algorithm with optimal step size while having lower computational complexity. In particular, the number of iterations required for convergence of the accelerated conjugate gradient algorithm is significantly lower than the accelerated descent algorithm and the steepest descent algorithm. In addition, the proposed system achieves improvement in terms of the signal to interference ratio and signal to noise ratio for the dominant speech outputs.
40

Alhawarat, Ahmad, Ghaliah Alhamzi, Ibitsam Masmali, and Zabidin Salleh. "A Descent Four-Term Conjugate Gradient Method with Global Convergence Properties for Large-Scale Unconstrained Optimisation Problems." Mathematical Problems in Engineering 2021 (August 14, 2021): 1–14. http://dx.doi.org/10.1155/2021/6219062.

Abstract:
The conjugate gradient method is a useful method for solving large-scale unconstrained optimisation problems and is used in applications in several fields, such as engineering, medical science, image restoration, and neural networks. The main benefit of the conjugate gradient method is that it does not use the second derivative or an approximation of it, unlike Newton's method and its approximations. Moreover, the algorithm of the conjugate gradient method is simple and easy to apply. This study proposes a new modified conjugate gradient method that contains four terms, building on popular two- and three-term conjugate gradient methods. The new algorithm satisfies the descent condition and possesses the convergence property. In the numerical results, we compare the new algorithm with famous methods such as CG-Descent. We conclude from the numerical results that the new algorithm is more efficient than other popular CG methods such as CG-Descent 6.8 in terms of the number of function evaluations, number of gradient evaluations, number of iterations, and CPU time.
41

Abbo, Khalil K., and Nazar K. Hussein. "A New Parameterized Conjugate Gradient Method Based on Generalized Perry Conjugate Gradient Method." Tikrit Journal of Pure Science 21, no. 1 (February 4, 2023): 102–6. http://dx.doi.org/10.25130/tjps.v21i1.958.

Abstract:
A new parameterized conjugate gradient method based on the generalized Perry conjugate gradient method is proposed, building on Perry's idea. The descent condition and global convergence are proven under the Wolfe condition. The new algorithm is very effective for solving large-scale unconstrained optimization problems.
42

Khaleel, Layth Riyadh, and Ban Ahmed Mitras. "A Novel Hybrid Dragonfly Algorithm with Modified Conjugate Gradient Method." International Journal of Computer Networks and Communications Security 8, no. 2 (February 29, 2020): 17–25. http://dx.doi.org/10.47277/ijcncs/8(2)2.

Abstract:
The Dragonfly Algorithm (DA) is a meta-heuristic algorithm proposed by Mirjalili in 2015; it simulates the behavior of dragonflies in their search for food and in migration. In this paper, a modified conjugate gradient algorithm is proposed by deriving a new conjugate coefficient. The sufficient descent and global convergence properties of the proposed algorithm are proved. A novel hybrid of the dragonfly algorithm (DA) with the modified conjugate gradient algorithm is then proposed, which improves the randomly generated initial population of the dragonfly optimization algorithm using the characteristics of the modified conjugate gradient algorithm. The efficiency of the hybrid algorithm was measured by applying it to 10 high-dimensional optimization test functions of different dimensions, and the results of the hybrid algorithm were very good in comparison with the original algorithm.
43

Mustafa, Ahmed Anwer. "Conjugated Gradient with Four Terms for Nonlinear Unconstrained Optimization." General Letters in Mathematics 12, no. 1 (March 2022): 40–48. http://dx.doi.org/10.31559/glm2022.12.1.5.

Abstract:
The nonlinear conjugate gradient (GJG) technique is an effective tool for large-scale minimization and can be used in a variety of applications. In this article, we present a novel conjugate gradient approach based on two hypotheses: we equate the two hypotheses and retrieve a good parameter, then multiply the new parameter by a control parameter and substitute it into the second equation to obtain a new conjugate gradient direction. A fresh equation for 𝛽𝑘 is proposed, with global convergence qualities. When compared with the two most common conjugate gradient techniques, our algorithm outperforms them in terms of both the number of iterations (NOIS) and the number of function evaluations (NOFS). According to the numerical results, the new technique is efficient in real computing and superior to previous comparable approaches in many instances.
44

Huang, Shuai, Zhong Wan, and Songhai Deng. "A Modified Projected Conjugate Gradient Algorithm for Unconstrained Optimization Problems." ANZIAM Journal 54, no. 3 (January 2013): 143–52. http://dx.doi.org/10.1017/s1446181113000084.

Abstract:
We propose a modified projected Polak–Ribière–Polyak (PRP) conjugate gradient method, where a modified conjugacy condition and a method which generates sufficient descent directions are incorporated into the construction of a suitable conjugacy parameter. It is shown that the proposed method is a modification of the PRP method and generates sufficient descent directions at each iteration. With an Armijo-type line search, the theory of global convergence is established under two weak assumptions. Numerical experiments are employed to test the efficiency of the algorithm in solving some benchmark test problems available in the literature. The numerical results obtained indicate that the algorithm outperforms an existing similar algorithm in requiring fewer function evaluations and fewer iterations to find optimal solutions with the same tolerance.
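
To illustrate the PRP-plus-Armijo combination described in this entry, the following is a generic textbook sketch of nonlinear CG with the PRP+ parameter and Armijo backtracking, not the authors' modified projection method; the test function is a placeholder.

    import numpy as np

    def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
        """Nonlinear CG with the PRP+ parameter and Armijo backtracking."""
        x = x0.copy()
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha, c1, rho = 1.0, 1e-4, 0.5   # Armijo backtracking
            while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
                alpha *= rho
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+, clipped at 0
            d = -g_new + beta * d
            if g_new @ d >= 0:   # safeguard: restart with steepest descent
                d = -g_new
            x, g = x_new, g_new
        return x

    # Placeholder test problem: the Rosenbrock function
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(prp_cg(f, grad, np.array([-1.2, 1.0])))  # approaches [1, 1]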
45

Jiang, Zhonghua, and Ning Xu. "HotSpot Thermal Floorplan Solver Using Conjugate Gradient to Speed Up." Mobile Information Systems 2018 (2018): 1–8. http://dx.doi.org/10.1155/2018/2921451.

Abstract:
We propose to use the conjugate gradient method to efficiently solve the thermal resistance model in the HotSpot thermal floorplan tool. The iterative conjugate gradient solver is suitable for traditional sparse-matrix linear systems. We also define the relative sparse matrix in the iterative, Simulated Annealing-based thermal floorplan framework algorithm, and the iterative method for the relative sparse matrix can be applied to other iterative framework algorithms. The experimental results show that the running time of our incremental iterative conjugate gradient solver is sped up approximately 11x compared with the LU decomposition method for case ami49, and the experimental ratio curve shows that our iterative conjugate gradient solver accelerates more as the number of modules increases.
46

Hassan, Basim A., and Haneen A. Alashoor. "On image restoration problems using new conjugate gradient methods." Indonesian Journal of Electrical Engineering and Computer Science 29, no. 3 (March 1, 2023): 1438. http://dx.doi.org/10.11591/ijeecs.v29.i3.pp1438-1445.

Abstract:
The nonlinear conjugate gradient algorithm is one of the effective algorithms for optimization, since it has low storage requirements and a simple structure. The conjugate coefficient is the basis of conjugate gradient algorithms with the desirable conjugacy property. In this manuscript, we derive new second-order information for the Hessian from the objective function, which can give a new search direction. Based on the new search direction, we propose an interesting update formula and nonlinear conjugate gradient method. Under the Wolfe line search and mild assumptions on the objective function, the method possesses the sufficient descent property and is always globally convergent. Numerical results show that the method is effective and competitive in recovering an original image from an image corrupted by impulse noise.
47

Łopata, Stanisław, and Paweł Ocłoń. "The analysis of gradient algorithm effectiveness - two dimensional heat transfer problem." Archives of Thermodynamics 31, no. 4 (October 1, 2010): 37–50. http://dx.doi.org/10.2478/v10173-010-0026-5.

Abstract:
The analysis of the effectiveness of gradient algorithms for the two-dimensional steady-state heat transfer problem is performed. Three gradient algorithms, the BCG (biconjugate gradient algorithm), the BICGSTAB (biconjugate gradient stabilized algorithm), and the CGS (conjugate gradient squared algorithm), are implemented in a computer code. Because boundary conditions of the first type are imposed, it is possible to compare the results with the analytical solution. Computations are carried out for different numerical grid densities, making it possible to investigate how grid density influences the efficiency of the gradient algorithms. The total computational time, residual drop, and iteration time for the gradient algorithms are additionally compared with the performance of the SOR (successive over-relaxation) method.
48

Gong, Pinghua, and Changshui Zhang. "Efficient Multi-Stage Conjugate Gradient for Trust Region Step." Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 921–28. http://dx.doi.org/10.1609/aaai.v26i1.8272.

Abstract:
The trust region step problem, solving a sphere-constrained quadratic program, plays a critical role in the trust region Newton method. In this paper, we propose an efficient Multi-Stage Conjugate Gradient (MSCG) algorithm to compute the trust region step in a multi-stage manner. Specifically, when the iterative solution is in the interior of the sphere, we perform the conjugate gradient procedure. Otherwise, we perform a gradient descent procedure which points toward the interior of the sphere and can make the next iterative solution an interior point. Subsequently, we proceed with the conjugate gradient procedure again. We repeat the above procedures until convergence. We also present a theoretical analysis which shows that the MSCG algorithm converges. Moreover, the proposed MSCG algorithm can generate a solution of any prescribed precision, controlled by a tolerance parameter which is the only parameter we need. Experimental results on large-scale text data sets demonstrate that our proposed MSCG algorithm has a faster convergence speed compared with state-of-the-art algorithms.
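
The sphere-constrained quadratic program in question is the standard trust-region subproblem (g and H are the gradient and Hessian at the current iterate, \Delta the trust-region radius):

    \min_{p} \; g^T p + \frac{1}{2} p^T H p   subject to   \|p\|_2 \le \Delta.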
49

Baxter, J., P. L. Bartlett, and L. Weaver. "Experiments with Infinite-Horizon, Policy-Gradient Estimation." Journal of Artificial Intelligence Research 15 (November 1, 2001): 351–81. http://dx.doi.org/10.1613/jair.807.

Abstract:
In this paper, we present algorithms that perform gradient ascent of the average reward in a partially observable Markov decision process (POMDP). These algorithms are based on GPOMDP, an algorithm introduced in a companion paper (Baxter & Bartlett, this volume), which computes biased estimates of the performance gradient in POMDPs. The algorithm's chief advantages are that it uses only one free parameter beta, which has a natural interpretation in terms of bias-variance trade-off, it requires no knowledge of the underlying state, and it can be applied to infinite state, control and observation spaces. We show how the gradient estimates produced by GPOMDP can be used to perform gradient ascent, both with a traditional stochastic-gradient algorithm, and with an algorithm based on conjugate-gradients that utilizes gradient information to bracket maxima in line searches. Experimental results are presented illustrating both the theoretical results of (Baxter & Bartlett, this volume) on a toy problem, and practical aspects of the algorithms on a number of more realistic problems.
50

Taqi, Abbas H., and Amal N. Shaker. "Development Modified Conjugate Gradiente Algorithm." Tikrit Journal of Pure Science 20, no. 5 (February 10, 2023): 185–92. http://dx.doi.org/10.25130/tjps.v20i5.1254.

Abstract:
The aim of this paper is to develop the V1-CG modified conjugate gradient method to increase the speed of convergence while retaining its convergence characteristics. The derivation of this method was based on a strictly convex quadratic function and was then extended to general functions whose higher derivatives are not equal to zero. The descent property and global convergence of the proposed algorithm are established, and our numerical experiments on some test functions show a clear improvement over the original V1-CG method.