
Journal articles on the topic 'Hybrid conjugate gradient method'


Consult the top 50 journal articles for your research on the topic 'Hybrid conjugate gradient method.'


1

Nur, Syarafina Mohamed, Mamat Mustafa, Rivaie Mohd, and Milleana Shaharudin Shazlyn. "A new hyhbrid coefficient of conjugate gradient method." Indonesian Journal of Electrical Engineering and Computer Science (IJEECS) 18, no. 3 (2020): 1454–63. https://doi.org/10.11591/ijeecs.v18.i3.pp1454-1463.

Abstract:
Hybridization is one of the popular approaches to modifying the conjugate gradient method. In this paper, a new hybrid conjugate gradient method is suggested and analyzed in which the parameter βk is evaluated as a convex combination of the RMIL βk under an exact line search. The proposed method is shown to possess both the sufficient descent and global convergence properties. Numerical results show that the proposed method is promising and outperforms other hybrid conjugate gradient methods in number of iterations and CPU time.
2

Hafaidia, I., H. Guebbai, M. Al-Baali, and M. Ghiat. "A new hybrid conjugate gradient algorithm for unconstrained optimization." Vestnik Udmurtskogo Universiteta. Matematika. Mekhanika. Komp'yuternye Nauki 33, no. 2 (2023): 348–64. http://dx.doi.org/10.35634/vm230211.

Abstract:
It is well known that conjugate gradient methods are useful for solving large-scale unconstrained nonlinear optimization problems. In this paper, we consider combining the best features of two conjugate gradient methods. In particular, we give a new conjugate gradient method based on the hybridization of the useful DY (Dai-Yuan) and HZ (Hager-Zhang) methods. The hybrid parameters are chosen such that the proposed method satisfies the conjugacy and sufficient descent conditions. It is shown that the new method maintains the global convergence property of the above two methods. The numerical r
3

Liu, Jinkui. "A hybrid nonlinear conjugate gradient method." Lobachevskii Journal of Mathematics 33, no. 3 (2012): 195–99. http://dx.doi.org/10.1134/s1995080212030092.

4

Hadji, Ghania, Yamina Laskri, Tahar Bechouat, and Rachid Benzine. "New Hybrid Conjugate Gradient Method as a Convex Combination of PRP and RMIL+ Methods." Studia Universitatis Babes-Bolyai Matematica 69, no. 2 (2024): 457–68. http://dx.doi.org/10.24193/subbmath.2024.2.14.

Abstract:
The Conjugate Gradient (CG) method is a powerful iterative approach for solving large-scale minimization problems, characterized by its simplicity, low computational cost and good convergence. In this paper, a new hybrid conjugate gradient method, HLB (HLB: Hadji-Laskri-Bechouat), is proposed and analysed for unconstrained optimization. By comparing CGHLB numerically with PRP and RMIL+ using the Dolan and Moré CPU performance profiles, we deduce that CGHLB is more efficient. Keywords: unconstrained optimization, hybrid conjugate gradient method, line search, descent property, global convergence.
5

Ding, Lei, Yong Jun Luo, Yang Yang Wang, Zheng Li, and Bing Yin Yao. "Improved Method of Hybrid Genetic Algorithm." Applied Mechanics and Materials 556-562 (May 2014): 4014–17. http://dx.doi.org/10.4028/www.scientific.net/amm.556-562.4014.

Abstract:
On account of the slow late-stage convergence of the traditional genetic algorithm, a hybrid genetic algorithm based on the conjugate gradient method and the genetic algorithm is proposed. This hybrid algorithm takes advantage of the conjugate gradient method's determinism while using the genetic algorithm to avoid falling into local optima, so it can quickly converge to the exact global optimal solution. Tests on two benchmark functions show that this hybrid genetic algorithm performs better than either the conjugate gradient method or the genetic algorithm alone and achieves good results.
6

Fang, Minglei, Min Wang, Min Sun, and Rong Chen. "A Modified Hybrid Conjugate Gradient Method for Unconstrained Optimization." Journal of Mathematics 2021 (February 22, 2021): 1–9. http://dx.doi.org/10.1155/2021/5597863.

Abstract:
Nonlinear conjugate gradient algorithms are a very effective way of solving large-scale unconstrained optimization problems. Based on some well-known previous conjugate gradient methods, a modified hybrid conjugate gradient method is proposed. The proposed method generates descent directions at every iteration independently of any line search. Under the Wolfe line search, the proposed method possesses global convergence. Numerical results show that the modified method is efficient and robust.
7

Djordjevic, Snezana. "New hybrid conjugate gradient method as a convex combination of FR and PRP methods." Filomat 30, no. 11 (2016): 3083–100. http://dx.doi.org/10.2298/fil1611083d.

Abstract:
We consider a new hybrid conjugate gradient algorithm, which is obtained from the algorithm of Fletcher-Reeves and the algorithm of Polak-Ribière-Polyak. Numerical comparisons show that the present hybrid conjugate gradient algorithm often behaves better than some known algorithms.
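Several entries above describe the same basic scheme: the conjugate gradient parameter is taken as a convex combination of two classical choices (here Fletcher-Reeves and Polak-Ribière-Polyak). The following is a minimal illustrative sketch of that scheme, not any specific paper's method; the fixed mixing parameter `theta` and the simple Armijo backtracking search are assumptions (the papers above use Wolfe-type searches and adaptive mixing).

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Minimize f with a convex-combination hybrid CG direction (illustrative).

    beta is a fixed convex combination of the Fletcher-Reeves and safeguarded
    Polak-Ribiere-Polyak parameters; published hybrids usually pick theta
    adaptively, so this only sketches the general scheme.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search; the papers above use Wolfe searches.
        alpha = 1.0
        for _ in range(60):
            if f(x + alpha * d) <= f(x) + 1e-4 * alpha * (g @ d):
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)                    # Fletcher-Reeves
        beta_prp = max((g_new @ (g_new - g)) / (g @ g), 0.0)   # PRP+ safeguard
        beta = (1.0 - theta) * beta_fr + theta * beta_prp      # convex combination
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a simple strictly convex test function this converges to the minimizer; the descent and global convergence guarantees discussed in these abstracts additionally require a (strong) Wolfe line search and a suitably chosen mixing parameter.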
8

Yang, Xiangfei, Zhijun Luo, and Xiaoyu Dai. "A Global Convergence of LS-CD Hybrid Conjugate Gradient Method." Advances in Numerical Analysis 2013 (October 22, 2013): 1–5. http://dx.doi.org/10.1155/2013/517452.

Abstract:
The conjugate gradient method is one of the most effective algorithms for solving unconstrained optimization problems. In this paper, a modified conjugate gradient method is presented and analyzed which is a hybridization of the known LS and CD conjugate gradient algorithms. Under some mild conditions, the Wolfe-type line search can guarantee the global convergence of the LS-CD method. Numerical results show that the algorithm is efficient.
9

Yao, Shengwei, and Bin Qin. "A Hybrid of DL and WYL Nonlinear Conjugate Gradient Methods." Abstract and Applied Analysis 2014 (2014): 1–9. http://dx.doi.org/10.1155/2014/279891.

Abstract:
The conjugate gradient method is an efficient method for solving large-scale nonlinear optimization problems. In this paper, we propose a nonlinear conjugate gradient method which can be considered a hybrid of the DL and WYL conjugate gradient methods. The given method possesses the sufficient descent condition under the Wolfe-Powell line search and is globally convergent for general functions. Our numerical results show that the proposed method is very robust and efficient on the test problems.
10

Wang, Linjun, Liu Xu, Youxiang Xie, Yixian Du, and Xiao Han. "A new hybrid conjugate gradient method for dynamic force reconstruction." Advances in Mechanical Engineering 11, no. 1 (2019): 168781401882236. http://dx.doi.org/10.1177/1687814018822360.

Abstract:
A new hybrid conjugate gradient method based on the gradient operator is proposed in this article and applied to the structural dynamic load identification problem. It is proved that the present method with the strong Wolfe line search possesses the sufficient descent property. In addition, the present method is globally convergent when the parameter in the strong Wolfe line search conditions is restricted to some suitable intervals. Three example problems from engineering are solved by the newly developed conjugate gradient method to demonstrate the robustness and effectiveness of conjugate grad
11

Mohammed Zaki, Sara Sahib, Hawraz Nadhim Jabbar, and Sozan Saber Haider. "A Novel Conjugate Gradient Algorithm as a Convex Combination of Classical Conjugate Gradient Methods." Kurdistan Journal of Applied Research 10, no. 1 (2025): 83–98. https://doi.org/10.24017/science.2025.1.6.

Abstract:
Conjugate gradient (CG) algorithms are effective for handling large-scale nonlinear optimization problems. One optimization technique intended to address unconstrained optimization problems effectively is the hybrid conjugate gradient (HCG) algorithm. The HCG algorithm aims to improve convergence properties while keeping computations simple by merging features from other conjugate gradient techniques. In this paper, a new hybrid conjugate gradient algorithm is proposed and analyzed, which is obtained as a convex combination of the Dai-Yuan (DY), Hestenes-Stiefel (HS) and Hager-Zhang (HZ) conj
12

Hassan, Basim A. "A New Hybrid Conjugate Gradient Method with Guaranteed Descent for Unconstraint Optimization." Al-Mustansiriyah Journal of Science 28, no. 3 (2018): 193. http://dx.doi.org/10.23851/mjs.v28i3.114.

Abstract:
The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we propose a new hybrid nonlinear conjugate gradient method, which has the descent property at every iteration and global convergence properties under certain conditions. Numerical results show that the new hybrid method is efficient on the given test problems.
13

Khaleel, Layth Riyadh, and Ban Ahmed Mitras. "A Novel Hybrid Dragonfly Algorithm with Modified Conjugate Gradient Method." International Journal of Computer Networks and Communications Security 8, no. 2 (2020): 17–25. http://dx.doi.org/10.47277/ijcncs/8(2)2.

Abstract:
The Dragonfly Algorithm (DA) is a meta-heuristic algorithm proposed by Mirjalili in 2015; it simulates the behavior of dragonflies in their search for food and in migration. In this paper, a modified conjugate gradient algorithm is proposed by deriving a new conjugate coefficient. The sufficient descent and global convergence properties of the proposed algorithm are proved. A novel hybrid of the dragonfly algorithm (DA) with the modified conjugate gradient algorithm is proposed, which develops the elementary society that is randomly generated as the primary society for the dra
14

Hamdi, Amira, Badreddine Sellami, and Mohammed Belloufi. "New hyrid conjugate gradient method as a convex combination of HZ and CD methods." Asian-European Journal of Mathematics 14, no. 10 (2021): 2150187. http://dx.doi.org/10.1142/s1793557121501874.

Abstract:
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems; the conjugate gradient parameter [Formula: see text] is computed as a convex combination of [Formula: see text] and [Formula: see text]. Under the Wolfe line search, we prove the sufficient descent and the global convergence. Numerical results are reported to show the effectiveness of our procedure.
15

Babaie-Kafaki, Saman, and Nezam Mahdavi-Amiri. "TWO MODIFIED HYBRID CONJUGATE GRADIENT METHODS BASED ON A HYBRID SECANT EQUATION." Mathematical Modelling and Analysis 18, no. 1 (2013): 32–52. http://dx.doi.org/10.3846/13926292.2013.756832.

Abstract:
Taking advantage of the attractive features of Hestenes–Stiefel and Dai–Yuan conjugate gradient methods, we suggest two globally convergent hybridizations of these methods following Andrei's approach of hybridizing the conjugate gradient parameters convexly and Powell's approach of nonnegative restriction of the conjugate gradient parameters. In our methods, the hybridization parameter is obtained based on a recently proposed hybrid secant equation. Numerical results demonstrating the efficiency of the proposed methods are reported.
16

Yusuf, Aliyu, Abdullahi Adamu Kiri, Lukman Lawal, and Aliyu Ibrahim Kiri. "A Hybrid Conjugate Gradient Algorithm for Nonlinear System of Equations through Conjugacy Condition." Journal of Global Humanities and Social Sciences 5, no. 10 (2024): 364–71. http://dx.doi.org/10.61360/bonighss242016851001.

Abstract:
For the purpose of solving a large-scale system of nonlinear equations, a hybrid conjugate gradient algorithm is introduced in this paper, based on the convex combination of the β_k^FR and β_k^PRP parameters. It is made possible by incorporating the conjugacy condition together with the proposed conjugate gradient search direction. Furthermore, a significant property of the method is that, through a non-monotone type line search, it gives a descent search direction. Under appropriate conditions, the algorithm establishes its global convergence. Finally, results from numerical tests on a set of benchma
17

Dauda, M. K., Abubakar S. Magaji, Habib Abdullah, Jamilu Sabi'u, and Abubakar S. Halilu. "A New Search Direction via Hybrid Conjugate Gradient Coefficient for Solving Nonlinear System of Equations." Malaysian Journal of Computing and Applied Mathematics 2, no. 1 (2019): 8–15. http://dx.doi.org/10.37231/myjcam.2019.2.1.24.

Abstract:
Conjugate gradient methods are widely used for unconstrained optimization problems. Most conjugate gradient methods do not always generate a descent search direction. In this article, a new search direction via a hybrid conjugate gradient method, formed by a convex combination of earlier works, is used and extended to the solution of systems of nonlinear equations. The proposed search direction generates a descent direction at every iteration without a convexity assumption on the objective function. The new method possesses, preserves and inherits the properties of the Conjugate Gradient (CG)
18

F. Aziz, Rahma, and Maha S. Younis. "A New Hybrid Conjugate Gradient Method with Global Convergence Properties." Wasit Journal for Pure sciences 3, no. 3 (2024): 58–68. http://dx.doi.org/10.31185/wjps.453.

Abstract:
This work introduces a novel hybrid conjugate gradient (CG) technique for tackling unconstrained optimisation problems with improved efficiency and effectiveness. The parameter is computed as a convex combination of the standard conjugate gradient techniques using and . Our proposed method has shown that, when using the strong Wolfe line search (SWC) under specific conditions, it achieves global theoretical convergence. In addition, the new hybrid CG approach has the ability to generate a search direction that moves downward with each iteration. The quantitative findings obtained by applying th
19

Salihu, Nasiru, Mathew Remilekun Odekunle, Mohammed Yusuf Waziri, Abubakar Sani Halilu, and Suraj Salihu. "A dai-liao hybrid conjugate gradient method for unconstrained optimization." International Journal of Industrial Optimization 2, no. 2 (2021): 69. http://dx.doi.org/10.12928/ijio.v2i2.4100.

Abstract:
One of today's best-performing CG methods is the Dai-Liao (DL) method, which depends on a non-negative parameter and conjugacy conditions for its computation. Although numerous optimal selections for the parameter have been suggested, the best choice of remains a subject of consideration. The pure conjugacy condition adopts an exact line search for numerical experiments and convergence analysis. However, a practical mathematical experiment implies using an inexact line search to find the step size. To avoid such drawbacks, Dai and Liao substituted the earlier conjugacy condition with an extended conjugacy
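For reference, the extended conjugacy condition of Dai and Liao discussed in this abstract, and the parameter it induces, take the well-known form (with g_{k+1} the current gradient, s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and t ≥ 0):

```latex
% Extended Dai-Liao conjugacy condition
d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k, \qquad t \ge 0,
% which, substituting the CG direction d_{k+1} = -g_{k+1} + \beta_k d_k, yields
\beta_k^{DL} = \frac{g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}.
```

Setting t = 0 recovers the classical Hestenes-Stiefel parameter, which is why the choice of t is the "subject of consideration" the abstract refers to.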
20

Ibrahim, Mohd Asrul Hery, Mustafa Mamat, and Wah June Leong. "The Hybrid BFGS-CG Method in Solving Unconstrained Optimization Problems." Abstract and Applied Analysis 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/507102.

Abstract:
In solving large-scale problems, the quasi-Newton method is known as the most efficient method for unconstrained optimization problems. Hence, a new hybrid method, known as the BFGS-CG method, has been created based on these properties, combining the search directions of conjugate gradient methods and quasi-Newton methods. In comparison to standard BFGS methods and conjugate gradient methods, the BFGS-CG method shows significant improvement in the total number of iterations and CPU time required to solve large-scale unconstrained optimization problems. We also prove that the hybrid
21

Kaelo, Pro, Sindhu Narayanan, and M. V. Thuto. "A modified quadratic hybridization of Polak-Ribiere-Polyak and Fletcher-Reeves conjugate gradient method for unconstrained optimization problems." An International Journal of Optimization and Control: Theories & Applications (IJOCTA) 7, no. 2 (2017): 177–85. http://dx.doi.org/10.11121/ijocta.01.2017.00339.

Abstract:
This article presents a modified quadratic hybridization of the Polak–Ribière–Polyak and Fletcher–Reeves conjugate gradient methods for solving unconstrained optimization problems. Global convergence, under the strong Wolfe line search conditions, of the proposed quadratic hybrid conjugate gradient method is established. We also report some numerical results to show the competitiveness of the new hybrid method.
22

Zhang, Jinchao, Wei Zhu, Wei Wang, Zhaochong Wu, and Xiaojun Zhang. "An accelerated hybrid Riemannian conjugate gradient method for unconstrained optimization." Journal of Physics: Conference Series 2755, no. 1 (2024): 012012. http://dx.doi.org/10.1088/1742-6596/2755/1/012012.

Abstract:
With the goal of dealing with a series of optimization problems on general matrix manifolds with differentiable objective functions, we propose an accelerated hybrid Riemannian conjugate gradient technique. Specifically, the acceleration scheme of the proposed method uses a modified stepsize which is multiplicatively determined by the Wolfe line search. The search direction of the proposed algorithm is determined by a computationally promising hybrid conjugate parameter. We show that the suggested approach converges globally to a stationary point. Our approach performs better t
23

Li, Xiangrong, and Xupei Zhao. "A hybrid conjugate gradient method for optimization problems." Natural Science 03, no. 01 (2011): 85–90. http://dx.doi.org/10.4236/ns.2011.31012.

24

Liu, J. K., and S. J. Li. "New hybrid conjugate gradient method for unconstrained optimization." Applied Mathematics and Computation 245 (October 2014): 36–43. http://dx.doi.org/10.1016/j.amc.2014.07.096.

25

Kaelo, P., P. Mtagulwa, and M. V. Thuto. "A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization." Mathematical Sciences 14, no. 1 (2019): 1–9. http://dx.doi.org/10.1007/s40096-019-00310-y.

Abstract:
In this paper, we develop a new hybrid conjugate gradient method that inherits the features of the Liu and Storey (LS), Hestenes and Stiefel (HS), Dai and Yuan (DY) and Conjugate Descent (CD) conjugate gradient methods. The new method generates a descent direction independently of any line search and possesses good convergence properties under the strong Wolfe line search conditions. Numerical results show that the proposed method is robust and efficient.
26

Mohamed, Nur Syarafina, Mustafa Mamat, Mohd Rivaie, and Shazlyn Milleana Shaharudin. "A new hyhbrid coefficient of conjugate gradient method." Indonesian Journal of Electrical Engineering and Computer Science 18, no. 3 (2020): 1454. http://dx.doi.org/10.11591/ijeecs.v18.i3.pp1454-1463.

Abstract:
Hybridization is one of the popular approaches in modifying the conjugate gradient method. In this paper, a new hybrid conjugate gradient is suggested and analyzed in which the parameter is evaluated as a convex combination of
27

Zeyad Mohammed Abdullah and Iman Khalid Jamalaldeen. "A New Hybrid of DY and CGSD Conjugate Gradient Methods for Solving Unconstrained Optimization Problems." Tikrit Journal of Pure Science 26, no. 5 (2021): 86–91. http://dx.doi.org/10.25130/tjps.v26i5.183.

Abstract:
In this article, we present a new hybrid conjugate gradient method for solving large-scale unconstrained optimization problems. This method is a convex combination of the Dai-Yuan conjugate gradient method and Andrei's sufficient descent condition; it satisfies the famous D-L conjugacy condition and at the same time agrees with the Newton direction under a suitable condition. The suggested method always yields a descent search direction at each iteration. Under the strong Wolfe-Powell (SWP) line search condition, the global convergence of the proposed method is established. Fin
28

Babaie-Kafaki, Saman. "A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter." Optimization 62, no. 7 (2011): 929–41. http://dx.doi.org/10.1080/02331934.2011.611512.

29

Hassan, Basim A., Ahmed O. Owaid, and Zena T. Yasen. "A variant of hybrid conjugate gradient methods based on the convex combination for optimization." Indonesian Journal of Electrical Engineering and Computer Science 20, no. 2 (2020): 1007–15. https://doi.org/10.11591/ijeecs.v20.i2.pp1007-1015.

Abstract:
In some studies, a conjugate parameter plays an important role in conjugate gradient methods. In this paper, a hybrid variant of the search direction is provided based on a convex combination. This search direction ensures that the descent condition holds. The global convergence of the hybrid variant is also obtained. Our strong evidence is a numerical analysis showing that the proposed hybrid variant method is more efficient than the Hestenes and Stiefel method.
30

Adeleke, Olawale Joshua, Absalom El-Shamir Ezugwu, and Idowu Ademola Osinuga. "A New Family of Hybrid Conjugate Gradient Methods for Unconstrained Optimization." Statistics, Optimization & Information Computing 9, no. 2 (2020): 399–417. http://dx.doi.org/10.19139/soic-2310-5070-480.

Abstract:
The conjugate gradient method is a very efficient iterative technique for solving large-scale unconstrained optimization problems. Motivated by recent modifications of some variants of the method and the construction of hybrid methods, this study proposes four hybrid methods that are globally convergent as well as computationally efficient. The approach adopted for constructing the hybrid methods entails projecting ten recently modified conjugate gradient methods. Each of the hybrid methods is shown to satisfy the descent property independently of any line search technique and to be globally convergent und
31

Stanimirović, Predrag S., Branislav Ivanov, Snežana Djordjević, and Ivona Brajević. "New Hybrid Conjugate Gradient and Broyden–Fletcher–Goldfarb–Shanno Conjugate Gradient Methods." Journal of Optimization Theory and Applications 178, no. 3 (2018): 860–84. http://dx.doi.org/10.1007/s10957-018-1324-3.

32

Mohanty, Nirmalya Kumar, and Rupaj Kumar Nayak. "A new efficient hybrid conjugate gradient method based on LS-DY-HS conjugate gradient parameter." International Journal of Mathematical Modelling and Numerical Optimisation 10, no. 4 (2020): 342. http://dx.doi.org/10.1504/ijmmno.2020.10031725.

33

Mohanty, Nirmalya Kumar, and Rupaj Kumar Nayak. "A new efficient hybrid conjugate gradient method based on LS-DY-HS conjugate gradient parameter." International Journal of Mathematical Modelling and Numerical Optimisation 10, no. 4 (2020): 342. http://dx.doi.org/10.1504/ijmmno.2020.110702.

34

Ng, Kin Wei, and Ahmad Rohanin. "Solving Optimal Control Problem of Monodomain Model Using Hybrid Conjugate Gradient Methods." Mathematical Problems in Engineering 2012 (2012): 1–14. http://dx.doi.org/10.1155/2012/734070.

Abstract:
We present numerical solutions for the PDE-constrained optimization problem arising in cardiac electrophysiology, that is, the optimal control problem of the monodomain model. The optimal control problem of the monodomain model is a nonlinear optimization problem that is constrained by the monodomain model. The monodomain model consists of a parabolic partial differential equation coupled to a system of nonlinear ordinary differential equations, and has been widely used for simulating cardiac electrical activity. Our control objective is to dampen the excitation wavefront using optimal applied e
35

Sabi'u, Jamilu, and Abubakar M. Gadu. "A Projected Hybrid Conjugate Gradient Method for Solving Large-scale System of Nonlinear Equations." Malaysian Journal of Computing and Applied Mathematics 1, no. 2 (2018): 10–20. http://dx.doi.org/10.37231/myjcam.2018.1.2.20.

Abstract:
In this article, a fully derivative-free projected hybrid conjugate gradient method for solving large-scale systems of nonlinear equations is proposed. The proposed method is the convex combination of the FR and PRP conjugate gradient methods with the projection method. The global convergence of the given method is established under suitable conditions with a nonmonotone line search. Numerical results show that the method is efficient for large-scale problems.
36

Warsito, Budi, Alan Prahutama, Hasbi Yasin, and Sri Sumiyati. "Hybrid Particle Swarm and Conjugate Gradient Optimization in Neural Network for Prediction of Suspended Particulate Matter." E3S Web of Conferences 125 (2019): 25007. http://dx.doi.org/10.1051/e3sconf/201912525007.

Abstract:
The scope of this research is the use of artificial neural network models and the meta-heuristic optimization method Particle Swarm Optimization (PSO) for the prediction of ambient air pollution parameter data at air quality monitoring stations in the city of Semarang, Central Java. The observed parameter is an indicator of ambient air quality, Suspended Particulate Matter (SPM). Based on air quality parameter data from previous times, which form a time series, modeling is done using Neural Networks (NN). Estimation of the weights of the NN is done using a hybrid method combining meta-heuristic and gradient o
37

Babaie-Kafaki, Saman. "Addendum to: A hybrid conjugate gradient method based on a quadratic relaxation of Dai–Yuan hybrid conjugate gradient parameter." Optimization 63, no. 4 (2012): 657–59. http://dx.doi.org/10.1080/02331934.2012.718347.

38

Yuan, Gonglin. "A Conjugate Gradient Method for Unconstrained Optimization Problems." International Journal of Mathematics and Mathematical Sciences 2009 (2009): 1–14. http://dx.doi.org/10.1155/2009/329623.

Abstract:
A hybrid method combining the FR conjugate gradient method and the WYL conjugate gradient method is proposed for unconstrained optimization problems. The presented method possesses the sufficient descent property under the strong Wolfe-Powell (SWP) line search rule relaxing the parameter σ < 1. Under suitable conditions, the global convergence with the SWP line search rule and the weak Wolfe-Powell (WWP) line search rule is established for nonconvex functions. Numerical results show that this method is better than the FR method and the WYL method.
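The strong Wolfe-Powell (SWP) conditions invoked throughout these abstracts are easy to state in code. The sketch below is generic, not taken from any listed paper; `c1` and `sigma` stand for the usual sufficient-decrease and curvature constants (0 < c1 < sigma < 1).

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, sigma=0.9):
    """Check the strong Wolfe-Powell conditions for a step length alpha.

    Sufficient decrease: f(x + a d) <= f(x) + c1 * a * g(x)'d
    Strong curvature:    |g(x + a d)'d| <= sigma * |g(x)'d|
    """
    gd = grad(x) @ d
    decrease = f(x + alpha * d) <= f(x) + c1 * alpha * gd
    curvature = abs(grad(x + alpha * d) @ d) <= sigma * abs(gd)
    return bool(decrease and curvature)
```

For example, on f(x) = ||x||²/2 with the steepest-descent direction d = -x, the exact minimizing step alpha = 1 satisfies both conditions, while alpha = 2 violates them.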
39

Zullpakkal, Norhaslinda, N. ‘Aini, N. H. A. Ghani, N. S. Mohamed, N. Idalisa, and M. Rivaie. "Covid-19 data modelling using hybrid conjugate gradient method." Journal of Information and Optimization Sciences 43, no. 4 (2022): 837–53. http://dx.doi.org/10.1080/02522667.2022.2060610.

40

Salih, Y., M. A. Hamoda, Sukono, and M. Mamat. "The convergence properties of new hybrid conjugate gradient method." IOP Conference Series: Materials Science and Engineering 567 (August 15, 2019): 012031. http://dx.doi.org/10.1088/1757-899x/567/1/012031.

41

Li, Jing. "A Hybrid Spectral Conjugate Gradient Method with Global Convergence." International Journal of Mathematical Sciences and Computing 8, no. 2 (2022): 1–10. http://dx.doi.org/10.5815/ijmsc.2022.02.01.

42

Jameel, Marwan S., Ali A. Al-Arbo, and Rana Z. Al-Kawaz. "Robust and Hybrid Conjugate Gradient Method for Modern Unconstrained Optimization Methods." Mathematical Modelling of Engineering Problems 12, no. 6 (2025): 2170–76. https://doi.org/10.18280/mmep.120632.

43

Diphofu, T., P. Kaelo, and A. R. Tufa. "A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization." Topological Algebra and its Applications 10, no. 1 (2022): 47–60. http://dx.doi.org/10.1515/taa-2022-0112.

Abstract:
Conjugate gradient methods are very popular for solving large-scale unconstrained optimization problems because of their simplicity to implement and low memory requirements. In this paper, we present a hybrid three-term conjugate gradient method with a direction that always satisfies the sufficient descent condition. We establish global convergence of the new method under the weak Wolfe line search conditions. We also report some numerical results of the proposed method compared to relevant methods in the literature.
44

Zeyad Mohammed Abdullah and Faris Saleem Dhyab. "New Hybrid Conjugate Gradient Method as a Convex Combination of Dai–Liao and Wei–Yao–Liu." Tikrit Journal of Pure Science 26, no. 6 (2021): 100–105. http://dx.doi.org/10.25130/tjps.v26i6.199.

Abstract:
In this research, a new hybrid conjugate gradient method is developed. The method is based mainly on the hybridization of the Dai-Liao and Wei-Yao-Liu algorithms, using a convex combination and the conjugacy condition under an inexact line search. The resulting algorithm fulfills the sufficient descent condition and has global convergence under certain assumptions. The numerical results indicate the efficiency of this method in solving nonlinear test functions in unconstrained optimization.
45

Ishak, Muhammad Aqiil Iqmal, Nurin Athirah Azmi, and Siti Mahani Marjugi. "Three-term PRP-DL Method Modification with Application in Image Denoising Problem." Applied Mathematics and Computational Intelligence (AMCI) 14, no. 2 (2025): 95–118. https://doi.org/10.58915/amci.v14i2.1308.

Full text
Abstract:
Image denoising poses a critical challenge due to the impact of noise on image quality and the need to preserve essential details. This study introduces a hybrid Polak–Ribière–Polyak (PRP)–Dai–Liao (DL) conjugate gradient method with a modified scalar to improve the performance of denoising algorithms on large-scale images. The proposed method modifies the scalar in the PRP–DL conjugate gradient method, thereby enhancing algorithmic efficiency, especially in handling large-scale problems. Convergence analysis under the standard Wolfe–Powell line search is established, and numerical r
APA, Harvard, Vancouver, ISO, and other styles
46

Nasreddine, Chenna. "A modified hybrid conjugate gradient method for solving unconstrained non-linear optimization problems has been developed." International Conference on Pioneer and Innovative Studies 1 (June 20, 2023): 524–26. http://dx.doi.org/10.59287/icpis.884.

Full text
Abstract:
This paper presents a new hybrid conjugate gradient method, combining the Conjugate Descent (CD) and Al-Bayati & Al-Assady (BA) methods, for solving unconstrained optimization problems. We provide a convergence analysis of the proposed method and demonstrate its effectiveness through numerical examples.
APA, Harvard, Vancouver, ISO, and other styles
47

Alnowibet, Khalid Abdulaziz, Salem Mahdi, Ahmad M. Alshamrani, Karam M. Sallam, and Ali Wagdy Mohamed. "A Family of Hybrid Stochastic Conjugate Gradient Algorithms for Local and Global Minimization Problems." Mathematics 10, no. 19 (2022): 3595. http://dx.doi.org/10.3390/math10193595.

Full text
Abstract:
This paper contains two main parts, Part I and Part II, which discuss the local and global minimization problems, respectively. In Part I, a fresh conjugate gradient (CG) technique is suggested and then combined with a line-search technique to obtain a globally convergent algorithm. The finite difference approximations approach is used to compute the approximate values of the first derivative of the function f. The convergence analysis of the suggested method is established. The comparisons between the performance of the new CG method and the performance of four other CG methods demonstrate th
APA, Harvard, Vancouver, ISO, and other styles
48

Fathy, Basma T., and Maha S. Younis. "Global Convergence Analysis of a new Hybrid Conjugate Gradient Method for Unconstraint Optimization Problems." Journal of Physics: Conference Series 2322, no. 1 (2022): 012063. http://dx.doi.org/10.1088/1742-6596/2322/1/012063.

Full text
Abstract:
In this study, we introduce a novel hybrid conjugate gradient (CG) method to solve unconstrained optimization problems efficiently and effectively. The parameter θ_k defines a convex combination of the conjugate gradient parameters β_k^{BA2} and β_k^{FR}. Under the strong Wolfe line search conditions (SWC), we show that this method is globally convergent and that the proposed hybrid CG method generates a descent search direction at each iteration. The numerical results presented in this paper demonstrate that the proposed strategy is both effective and promising.
APA, Harvard, Vancouver, ISO, and other styles
49

Sabi’u, Jamilu, Kanikar Muangchoo, Abdullah Shah, Auwal Bala Abubakar, and Kazeem Olalekan Aremu. "An Inexact Optimal Hybrid Conjugate Gradient Method for Solving Symmetric Nonlinear Equations." Symmetry 13, no. 10 (2021): 1829. http://dx.doi.org/10.3390/sym13101829.

Full text
Abstract:
This article presents an inexact optimal hybrid conjugate gradient (CG) method for solving symmetric nonlinear systems. The method is a convex combination of the optimal Dai–Liao (DL) and the extended three-term Polak–Ribière–Polyak (PRP) CG methods. Two different formulas for selecting the convex parameter are derived by using the conjugacy condition and also by combining the proposed direction with the default Newton direction. The proposed method is derivative-free; therefore, Jacobian information is not required throughout the iteration process. Furthermore, the global co
APA, Harvard, Vancouver, ISO, and other styles
50

Rahpeymaii, Farzad, and Majid Rostami. "Solving Unconstrained Optimization Problems by a New Conjugate Gradient Method with Sufficient Descent Property." Sultan Qaboos University Journal for Science [SQUJS] 27, no. 2 (2022): 90–99. http://dx.doi.org/10.53539/squjs.vol27iss2pp90-99.

Full text
Abstract:
There have been conjugate gradient methods with strong convergence but numerical instability, and conversely. Improving these methods is an interesting idea for producing new methods with both strong convergence and numerical stability. In this paper, a new hybrid conjugate gradient method is introduced based on the Fletcher formula (CD), which has strong convergence, and the Liu and Storey formula (LS), which has good numerical results. The new directions satisfy the sufficient descent property, independent of the line search. Under some mild assumptions, the global convergence of new hybrid met
APA, Harvard, Vancouver, ISO, and other styles