To see the other types of publications on this topic, follow the link: Three-term conjugate gradient.

Journal articles on the topic 'Three-term conjugate gradient'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Three-term conjugate gradient.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Ibrahim, Yahya Ismail, and Hisham Mohammed Khudhur. "Modified three-term conjugate gradient algorithm and its applications in image restoration." Indonesian Journal of Electrical Engineering and Computer Science 28, no. 3 (2022): 1510–17. https://doi.org/10.11591/ijeecs.v28.i3.pp1510-1517.

Full text
Abstract:
In image restoration, the goal is often to recover a high-quality version of an image from a lower-quality copy of it. In this article, we investigate one kind of recovery problem, namely restoring digital photographs that have been degraded by impulse ("salt and pepper") noise at varying frequencies and intensities (30, 50, 70, 90). In this paper, we use the conjugate gradient algorithm to restore images and remove noise from them, and we develop a conjugate gradient algorithm with three terms using the conjugacy condition…
APA, Harvard, Vancouver, ISO, and other styles
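For orientation, the "three terms" in a three-term conjugate gradient method refer to the structure of the search direction. A generic template (standard textbook form, not the specific formula of the article above) is

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k,
\]

where \(g_k = \nabla f(x_k)\), \(y_k = g_{k+1} - g_k\), \(\alpha_k\) is a step length chosen by a line search, and the scalars \(\beta_k\) and \(\theta_k\) are what the individual papers in this list propose and analyse.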
2

Ahmed, Alaa Saad, Hisham M. Khudhur, and Mohammed S. Najmuldeen. "A new parameter in three-term conjugate gradient algorithms for unconstrained optimization." Indonesian Journal of Electrical Engineering and Computer Science 23, no. 1 (2021): 338–44. https://doi.org/10.11591/ijeecs.v23.i1.pp338-344.

Full text
Abstract:
In this study, we develop a new parameter for a three-term conjugate gradient method. The scheme depends principally on the pure conjugacy condition (PCC), which is an important condition in unconstrained nonlinear optimization in general and in conjugate gradient methods in particular. The proposed method is convergent and satisfies the descent property under some hypotheses. The numerical results display the effectiveness of the new method for solving unconstrained nonlinear optimization test problems compared to other conjugate gradient algorithms…
APA, Harvard, Vancouver, ISO, and other styles
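The pure conjugacy condition (PCC) mentioned in the abstract above is, in the usual notation (a general statement rather than this paper's particular parameter choice),

\[
d_{k+1}^{\top} y_k = 0, \qquad y_k = g_{k+1} - g_k,
\]

which, for a quadratic objective with Hessian \(A\) and exact line searches, is equivalent to the classical conjugacy requirement \(d_{k+1}^{\top} A d_k = 0\).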
3

Ahmed, Alaa Saad, Hisham M. Khudhur, and Mohammed S. Najmuldeen. "A new parameter in three-term conjugate gradient algorithms for unconstrained optimization." Indonesian Journal of Electrical Engineering and Computer Science 23, no. 1 (2021): 338. http://dx.doi.org/10.11591/ijeecs.v23.i1.pp338-344.

Full text
Abstract:
In this study, we develop a new parameter for a three-term conjugate gradient method. The scheme depends principally on the pure conjugacy condition (PCC), which is an important condition in unconstrained nonlinear optimization in general and in conjugate gradient methods in particular. The proposed method is convergent and satisfies the descent property under some hypotheses. The numerical results display the effectiveness of the new method for solving unconstrained nonlinear optimization test problems compared to other conjugate gradient…
APA, Harvard, Vancouver, ISO, and other styles
4

Meansri, K., N. Benrabia, M. Ghiat, H. Guebbai, and I. Hafaidia. "Presentation of TTMDY: a three-term modified DY conjugate gradient method for large-scale unconstrained problems." Computational Technologies 29, no. 3 (2024): 81–91. http://dx.doi.org/10.25743/ict.2024.29.3.007.

Full text
Abstract:
The main focus of our paper is a novel approach to enhance the MDY conjugate gradient direction. The key modification involves incorporating a third term, which plays a crucial role in determining the descent direction. By introducing this additional term, we transform the MDY conjugate gradient direction into a three-term conjugate gradient direction. This modification aims to improve the convergence properties of the algorithm and enhance its performance in solving optimization problems. In comparison to traditional MDY conjugate gradient methods, our approach demonstrates improved convergence…
APA, Harvard, Vancouver, ISO, and other styles
5

Wang, Zhan, Pengyuan Li, Xiangrong Li, and Hongtruong Pham. "A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems." Mathematical Problems in Engineering 2020 (September 4, 2020): 1–14. http://dx.doi.org/10.1155/2020/4381515.

Full text
Abstract:
Conjugate gradient methods are well-known methods which are widely applied in many practical fields. The CD conjugate gradient method is one of the classical types. In this paper, a modified three-term type CD conjugate gradient algorithm is proposed. Some good features are presented as follows: (i) A modified three-term type CD conjugate gradient formula is presented. (ii) The given algorithm possesses the sufficient descent property and the trust region property. (iii) The algorithm has global convergence with the modified weak Wolfe–Powell (MWWP) line search technique and a projection technique for general…
APA, Harvard, Vancouver, ISO, and other styles
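The two properties highlighted in the abstract above are usually stated as follows (generic definitions with constants \(c, C > 0\) independent of \(k\), not the paper's own wording):

\[
g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{(sufficient descent)}, \qquad
\|d_k\| \le C\,\|g_k\| \quad \text{(trust region property)}.
\]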
6

Dixon, L. C. W., P. G. Ducksbury, and P. Singh. "A new three-term conjugate gradient method." Journal of Optimization Theory and Applications 47, no. 3 (1985): 285–300. http://dx.doi.org/10.1007/bf00941495.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ahmed, Huda I., Rana Z. Al-Kawaz, and Abbas Y. Al-Bayati. "Spectral Three-Term Constrained Conjugate Gradient Algorithm for Function Minimizations." Journal of Applied Mathematics 2019 (December 25, 2019): 1–6. http://dx.doi.org/10.1155/2019/6378368.

Full text
Abstract:
In this work, we deal with constrained optimization methods of the three-term Conjugate Gradient (CG) type, which are primarily based on the Dai–Liao (DL) formula. The newly proposed technique satisfies the conjugacy property and the descent conditions of Karush–Kuhn–Tucker (KKT). Our proposed constrained technique uses the robust Wolfe line search condition with some assumptions, and we prove the global convergence property of the new technique. Numerical comparisons for thirty (30) constrained optimization problems confirm the effectiveness of the newly proposed…
APA, Harvard, Vancouver, ISO, and other styles
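The Dai–Liao (DL) formula underlying the method above is usually expressed through the conjugacy condition (standard form; the paper's constrained variant may differ in detail)

\[
d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k, \qquad t \ge 0, \quad
s_k = x_{k+1} - x_k, \; y_k = g_{k+1} - g_k,
\]

which recovers the pure conjugacy condition when \(t = 0\).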
8

Guo, Jie, and Zhong Wan. "A new three-term conjugate gradient algorithm with modified gradient-differences for solving unconstrained optimization problems." AIMS Mathematics 8, no. 2 (2022): 2473–88. http://dx.doi.org/10.3934/math.2023128.

Full text
Abstract:
Unconstrained optimization problems often arise from mining of big data and scientific computing. On the basis of a modified gradient-difference, this article aims to present a new three-term conjugate gradient algorithm to efficiently solve unconstrained optimization problems. Compared with the existing nonlinear conjugate gradient algorithms, the search directions in this algorithm are always sufficiently descent independent of any line search, as well as having the conjugacy property. Using the standard Wolfe line search, global and local convergence of the proposed algorithm…
APA, Harvard, Vancouver, ISO, and other styles
9

Al-Bayati, Abbas Y., and Muna M. M. Ali. "New multi-step three-term conjugate gradient algorithms with inexact line searches." Indonesian Journal of Electrical Engineering and Computer Science 19, no. 3 (2020): 1564–73. https://doi.org/10.11591/ijeecs.v19.i3.pp1564-1573.

Full text
Abstract:
This work suggests several multi-step three-term Conjugate Gradient (CG) algorithms that satisfy the sufficient descent property and conjugacy conditions. First, we considered a well-known three-term CG method, and we have, therefore, suggested two new classes of this type of algorithm based on the Hestenes and Stiefel (HS) and Polak-Ribière (PR) formulas, with four different versions. Both the descent and conjugacy conditions for all the proposed algorithms are satisfied at each iteration by using the strong Wolfe line search condition and its accelerated version. These newly suggested…
APA, Harvard, Vancouver, ISO, and other styles
10

Baluch, Bakhtawar, Zabidin Salleh, and Ahmad Alhawarat. "A New Modified Three-Term Hestenes–Stiefel Conjugate Gradient Method with Sufficient Descent Property and Its Global Convergence." Journal of Optimization 2018 (September 27, 2018): 1–13. http://dx.doi.org/10.1155/2018/5057096.

Full text
Abstract:
This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence using an exact line search, this is not guaranteed in the case of an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence using the inexact Wolfe–Powell line search. The numerical efficiency of the modified
APA, Harvard, Vancouver, ISO, and other styles
11

Ishak, Muhammad Aqiil Iqmal, Nurin Athirah Azmi, and Siti Mahani Marjugi. "Three-term PRP-DL Method Modification with Application in Image Denoising Problem." Applied Mathematics and Computational Intelligence (AMCI) 14, no. 2 (2025): 95–118. https://doi.org/10.58915/amci.v14i2.1308.

Full text
Abstract:
Image denoising poses a critical challenge due to the impact of noise on image quality and the need to preserve essential details. This study introduces a hybrid Polak-Ribière-Polyak (PRP)-Dai-Liao (DL) conjugate gradient method with a modified scalar to improve the performance of denoising algorithms on large-scale images. The proposed method involves modifying the scalar in the PRP-DL conjugate gradient method, thereby enhancing algorithmic efficiency, especially in handling large-scale problems. Convergence analysis under the standard Wolfe-Powell line search is established, and numerical results…
APA, Harvard, Vancouver, ISO, and other styles
12

Sun, Zhongbo, Yantao Tian, and Hongyang Li. "Two Modified Three-Term Type Conjugate Gradient Methods and Their Global Convergence for Unconstrained Optimization." Mathematical Problems in Engineering 2014 (2014): 1–9. http://dx.doi.org/10.1155/2014/394096.

Full text
Abstract:
Two modified three-term type conjugate gradient algorithms which satisfy both the descent condition and the Dai-Liao type conjugacy condition are presented for unconstrained optimization. The first algorithm is a modification of the Hager and Zhang type algorithm in such a way that the search direction is descent and satisfies Dai-Liao's type conjugacy condition. The second simple three-term type conjugate gradient method can generate sufficient descent directions at every iteration; moreover, this property is independent of the step-length line search. Also, the algorithms could be considered as…
APA, Harvard, Vancouver, ISO, and other styles
13

Alhawarat, Ahmad, Hanan Alolaiyan, Ibtisam A. Masmali, Zabidin Salleh, and Shahrina Ismail. "A Descent Four-Term of Liu and Storey Conjugate Gradient Method for Large Scale Unconstrained Optimization Problems." European Journal of Pure and Applied Mathematics 14, no. 4 (2021): 1429–56. http://dx.doi.org/10.29020/nybg.ejpam.v14i4.4128.

Full text
Abstract:
The conjugate gradient (CG) method is a useful tool for obtaining the optimum point for unconstrained optimization problems since it does not require a second derivative or its approximations. Moreover, the conjugate gradient method can be applied in many fields such as machine learning, deep learning, neural network, and many others. This paper constructs a four-term conjugate gradient method that satisfies the descent property and convergence properties to obtain the stationary point. The new modification was constructed based on Liu and Storey's conjugate gradient method, two-term conjugate
APA, Harvard, Vancouver, ISO, and other styles
14

M.M., Mahdi. "Three-Term of New Conjugate Gradient Projection Approach under Wolfe Condition to Solve Unconstrained Optimization Problems." Journal of Advanced Research in Dynamical and Control Systems 12, no. 7 (2020): 788–95. http://dx.doi.org/10.5373/jardcs/v12i7/20202063.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Al-Bayati, Abbas Younis, and Muna M. M. Ali. "New multi-step three-term conjugate gradient algorithms with inexact line searches." Indonesian Journal of Electrical Engineering and Computer Science 19, no. 3 (2020): 1564. http://dx.doi.org/10.11591/ijeecs.v19.i3.pp1564-1573.

Full text
Abstract:
This work suggests several multi-step three-term Conjugate Gradient (CG) algorithms that satisfy the sufficient descent property and conjugacy conditions. First, we have considered a number of well-known three-term CG methods, and we have, therefore, suggested two new classes of this type of algorithm, which are based on the Hestenes and Stiefel (HS) and Polak-Ribière (PR) formulas, with four different versions. Both the descent and conjugacy conditions for all the proposed algorithms are satisfied at each iteration by using the strong Wolfe line search condition and its accelerated version…
APA, Harvard, Vancouver, ISO, and other styles
16

Liu, Jiankun, and Shouqiang Du. "Modified Three-Term Conjugate Gradient Method and Its Applications." Mathematical Problems in Engineering 2019 (April 17, 2019): 1–9. http://dx.doi.org/10.1155/2019/5976595.

Full text
Abstract:
We propose a modified three-term conjugate gradient method with the Armijo line search for solving unconstrained optimization problems. The proposed method possesses the sufficient descent property. Under mild assumptions, the global convergence property of the proposed method with the Armijo line search is proved. Due to simplicity, low storage, and nice convergence properties, the proposed method is used to solve M-tensor systems and a kind of nonsmooth optimization problems with l1-norm. Finally, the given numerical experiments show the efficiency of the proposed method.
APA, Harvard, Vancouver, ISO, and other styles
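To make the ingredients named in the abstract above concrete, here is a minimal sketch of a three-term conjugate gradient loop with Armijo backtracking. The direction used is the well-known Zhang–Zhou–Li three-term PRP update, chosen only as a stand-in because it guarantees the descent property; it is not the specific modification proposed in the paper above.

import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000, sigma=1e-4, rho=0.5):
    """Generic three-term CG with Armijo backtracking (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = 1.0                          # Armijo backtracking line search
        while f(x + alpha * d) > f(x) + sigma * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g.dot(g)
        beta = g_new.dot(y) / gg             # PRP parameter
        theta = g_new.dot(d) / gg            # third-term parameter
        d = -g_new + beta * d - theta * y    # three-term direction
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(three_term_cg(f, grad, [-1.2, 1.0]))

With this choice of theta, the direction satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 regardless of the line search, which is the sufficient descent property many of the listed papers require.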
17

Andrei, Neculai. "On three-term conjugate gradient algorithms for unconstrained optimization." Applied Mathematics and Computation 219, no. 11 (2013): 6316–27. http://dx.doi.org/10.1016/j.amc.2012.11.097.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Lotfi, Mina, and Mohammad Hosseini. "A sufficient descent three-term conjugate gradient method and its global convergence." Filomat 38, no. 12 (2024): 4101–15. https://doi.org/10.2298/fil2412101l.

Full text
Abstract:
In this paper, we presented a new three-term conjugate gradient method based on combining the conjugate gradient method proposed by Cheng et al. [15] with the idea of the modified FR method [22]. In our method, the search direction satisfies the sufficient descent condition independent of the line search. Under some standard assumptions, we establish the global convergence property and the r-linear convergence rate of the proposed method. Numerical results on standard test problems from a well-known library illustrate the computational efficiency of the new method.
APA, Harvard, Vancouver, ISO, and other styles
19

Abbas Y. Al-Bayati, Abbas H. Taqi, and Yoksal A. Sadiq. "A New effected Three-Term Hestenes-Stiefel Conjugate-Gradient Method for Solving Unconstrained Optimization Problems." Tikrit Journal of Pure Science 21, no. 3 (2023): 187–93. http://dx.doi.org/10.25130/tjps.v21i3.1015.

Full text
Abstract:
In this paper, a new three-term Conjugate Gradient (CG) method is suggested. The derivation of the method is based on the descent property and the conjugacy condition, and the global convergence property is analyzed; numerical results indicate that the new proposed CG method compares well against other similar CG methods in this field.
APA, Harvard, Vancouver, ISO, and other styles
20

Farhan Khalaf Muord. "Three Modified Three-Term Conjugate Gradient Method in Non-linear Optimization." Advances in Nonlinear Variational Inequalities 27, no. 2 (2024): 430–38. http://dx.doi.org/10.52783/anvi.v27.1042.

Full text
Abstract:
In this work, several three-term Conjugate Gradient (CG) algorithms are modified so that they satisfy the sufficient descent condition and global convergence. Primarily, we derive three new methods of this type of algorithm and compare them with the (HS), (PR), and (LS) algorithms using 15 well-known test functions, applying the Wolfe and strong Wolfe conditions at every iteration. These new algorithms use different memoryless (BFGS) algorithms. Our new numerical results prove to be more robust and efficient than the same classes of algorithms.
APA, Harvard, Vancouver, ISO, and other styles
21

Dong, XiaoLiang, HongWei Liu, YuBo He, Saman Babaie-Kafaki, and Reza Ghanbari. "A NEW THREE–TERM CONJUGATE GRADIENT METHOD WITH DESCENT DIRECTION FOR UNCONSTRAINED OPTIMIZATION." Mathematical Modelling and Analysis 21, no. 3 (2016): 399–411. http://dx.doi.org/10.3846/13926292.2016.1176965.

Full text
Abstract:
In this paper, we propose a three–term PRP–type conjugate gradient method which always satisfies the sufficient descent condition independently of line searches employed. An important property of our method is that its direction is closest to the direction of the Newton method or satisfies conjugacy condition as the iterations evolve. In addition, under mild condition, we prove global convergence properties of the proposed method. Numerical comparison illustrates that our proposed method is efficient for solving the optimization problems.
APA, Harvard, Vancouver, ISO, and other styles
22

Arman, Ladan, Yuanming Xu, and Long Liping. "Some generalized three-term conjugate gradient methods based on CD approach for unconstrained optimization problems." Technium: Romanian Journal of Applied Sciences and Technology 3, no. 2 (2021): 67–82. http://dx.doi.org/10.47577/technium.v3i2.1983.

Full text
Abstract:
In this paper, based on the efficient Conjugate Descent (CD) method, two generalized CD algorithms are proposed to solve unconstrained optimization problems. These methods are three-term conjugate gradient methods in which the generated directions, obtained using the conjugate gradient parameters and independent of the line search, satisfy the sufficient descent condition. Furthermore, under the strong Wolfe line search, the global convergence of the proposed methods is proved. Also, preliminary numerical results on the CUTEst collection are presented to show the effectiveness of our methods…
APA, Harvard, Vancouver, ISO, and other styles
23

Deepho, Jitsupa, Maulana Malik, and Auwal Bala Abubakar. "Extended modified three-term conjugate gradient method for large-scale nonlinear equations." Indonesian Journal of Electrical Engineering and Computer Science 32, no. 1 (2023): 167. http://dx.doi.org/10.11591/ijeecs.v32.i1.pp167-176.

Full text
Abstract:
In this research paper, we introduce a novel gradient-free modified three-term conjugate gradient method designed to solve nonlinear equations subject to convex constraints. Our approach incorporates the projection scheme, which enhances the effectiveness of the proposed method. Building upon the modified three-term conjugate gradient method for solving M-tensor systems and ℓ1-norm-based nonsmooth optimization problems, our method can be regarded as an extension of their technique. By making mild assumptions, we establish the theoretical convergence properties of our iterative method…
APA, Harvard, Vancouver, ISO, and other styles
24

Deepho, Jitsupa, Maulana Malik, and Auwal Bala Abubakar. "Extended modified three-term conjugate gradient method for large-scale nonlinear equations." Indonesian Journal of Electrical Engineering and Computer Science 32, no. 1 (2023): 167–76. https://doi.org/10.11591/ijeecs.v32.i1.pp167-176.

Full text
Abstract:
In this research paper, we introduce a novel gradient-free modified three-term conjugate gradient method designed to solve nonlinear equations subject to convex constraints. Our approach incorporates the projection scheme, which enhances the effectiveness of the proposed method. Building upon the modified three-term conjugate gradient method for solving M-tensor systems and ℓ1-norm-based nonsmooth optimization problems, our method can be regarded as an extension of their technique. By making mild assumptions, we establish the theoretical convergence properties of our iterative method. Through…
APA, Harvard, Vancouver, ISO, and other styles
25

Fadhilah, Nurul Hafawati, Mohd Rivaie, Fuziyah Ishak, and Nur Idalisa. "New Three-Term Conjugate Gradient Method with Exact Line Search." MATEMATIKA 36, no. 3 (2020): 197–207. http://dx.doi.org/10.11113/matematika.v36.n3.1214.

Full text
Abstract:
Conjugate Gradient (CG) methods have an important role in solving large-scale unconstrained optimization problems. Nowadays, the Three-Term CG method has become a research trend of the CG methods. However, the existing Three-Term CG methods could only be used with the inexact line search. When the exact line search is applied, this Three-Term CG method will be reduced to the standard CG method. Hence in this paper, a new Three-Term CG method that could be used with the exact line search is proposed. This new Three-Term CG method satisfies the descent condition using the exact line search. Performance…
APA, Harvard, Vancouver, ISO, and other styles
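The reduction noted in the abstract above follows from the orthogonality produced by an exact line search. For a typical third term proportional to \(g_{k+1}^{\top} d_k\) (the Zhang–Zhou–Li choice is shown here purely as an illustration),

\[
\alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k)
\;\Longrightarrow\; g_{k+1}^{\top} d_k = 0
\;\Longrightarrow\; \theta_k = \frac{g_{k+1}^{\top} d_k}{\|g_k\|^2} = 0,
\]

so the third term vanishes and the direction collapses to the standard two-term CG direction.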
26

Idalisa, Nur. "On Solving Unconstrained Optimization Using Three Term Conjugate Gradient Method." International Journal of Advanced Trends in Computer Science and Engineering 8, no. 1.5 (2019): 108–10. http://dx.doi.org/10.30534/ijatcse/2019/2181.52019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Liu, J. K., and S. J. Li. "New three-term conjugate gradient method with guaranteed global convergence." International Journal of Computer Mathematics 91, no. 8 (2014): 1744–54. http://dx.doi.org/10.1080/00207160.2013.862236.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Andrei, Neculai. "A simple three-term conjugate gradient algorithm for unconstrained optimization." Journal of Computational and Applied Mathematics 241 (March 2013): 19–29. http://dx.doi.org/10.1016/j.cam.2012.10.002.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Andrei, Neculai. "A new three-term conjugate gradient algorithm for unconstrained optimization." Numerical Algorithms 68, no. 2 (2014): 305–21. http://dx.doi.org/10.1007/s11075-014-9845-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

黄, 玲花. "A Three-Term Conjugate Gradient Algorithm for Nonlinear Equations Problems." Operations Research and Fuzziology 07, no. 01 (2017): 31–36. http://dx.doi.org/10.12677/orf.2017.71004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Taqi, Abbas H. "Improved Three-term Conjugate Gradient Algorithm For Training Neural Network." Journal of Kufa for Mathematics and Computer 2, no. 3 (2015): 93–100. http://dx.doi.org/10.31642/jokmc/2018/020309.

Full text
Abstract:
A new three-term conjugate gradient algorithm for training feed-forward neural networks is developed. It is a vector-based training algorithm derived from the DFP quasi-Newton method and has only O(n) memory. The global convergence of the proposed algorithm has been established for convex functions under the Wolfe condition. The results of numerical experiments are included and compared with other well-known training algorithms in this field.
APA, Harvard, Vancouver, ISO, and other styles
32

Diphofu, T., P. Kaelo, and A. R. Tufa. "A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization." Topological Algebra and its Applications 10, no. 1 (2022): 47–60. http://dx.doi.org/10.1515/taa-2022-0112.

Full text
Abstract:
Conjugate gradient methods are very popular for solving large scale unconstrained optimization problems because of their simplicity to implement and low memory requirements. In this paper, we present a hybrid three-term conjugate gradient method with a direction that always satisfies the sufficient descent condition. We establish global convergence of the new method under the weak Wolfe line search conditions. We also report some numerical results of the proposed method compared to relevant methods in the literature.
APA, Harvard, Vancouver, ISO, and other styles
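The weak Wolfe line search conditions referred to above are the standard pair, with \(0 < \delta < \sigma < 1\):

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k.
\]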
33

Ismail Ibrahim, Yahya, and Hisham Mohammed Khudhur. "Modified three-term conjugate gradient algorithm and its applications in image restoration." Indonesian Journal of Electrical Engineering and Computer Science 28, no. 3 (2022): 1510. http://dx.doi.org/10.11591/ijeecs.v28.i3.pp1510-1517.

Full text
Abstract:
In image restoration, the goal is often to recover a high-quality version of an image from a lower-quality copy of it. In this article, we investigate one kind of recovery problem, namely restoring digital photographs that have been degraded by impulse ("salt and pepper") noise at varying frequencies and intensities (30, 50, 70, 90). In this paper, we use the conjugate gradient algorithm to restore images and remove noise from them, and we develop a conjugate gradient algorithm with three terms using the conjugacy condition of Dai and…
APA, Harvard, Vancouver, ISO, and other styles
34

Hao, Yue, Shouqiang Du, and Yuanyuan Chen. "A new three-term conjugate gradient method for solving the finite minimax problems." Filomat 35, no. 3 (2021): 737–58. http://dx.doi.org/10.2298/fil2103737h.

Full text
Abstract:
In this paper, we consider the method for solving the finite minimax problems. By using the exponential penalty function to smooth the finite minimax problems, a new three-term nonlinear conjugate gradient method is proposed for solving the finite minimax problems, which generates sufficient descent direction at each iteration. Under standard assumptions, the global convergence of the proposed new three-term nonlinear conjugate gradient method with Armijo-type line search is established. Numerical results are given to illustrate that the proposed method can efficiently solve several kinds of o
APA, Harvard, Vancouver, ISO, and other styles
35

Dong, XiaoLiang, Deren Han, Zhifeng Dai, Lixiang Li, and Jianguang Zhu. "An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition." Journal of Optimization Theory and Applications 179, no. 3 (2018): 944–61. http://dx.doi.org/10.1007/s10957-018-1377-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Marwan S. Jameel. "Three Term Conjugate Gradient Technique and its Global Convergence based on the Zhang, Zhou and Li methods." Tikrit Journal of Pure Science 27, no. 2 (2022): 84–90. http://dx.doi.org/10.25130/tjps.v27i2.72.

Full text
Abstract:
The optimal conjugation coefficient distinguishes conjugate gradient methods such as two-term, three-term, and conditional from other descent methods. A novel conjugation parameter formula is constructed from Zhang, Zhou, and Li's well-known formula to formulate a three-term conjugate gradient method in the unconstrained optimization domain. The conjugation parameter and the third-term parameter were constructed by incorporating the Perry conjugacy condition into Shanno's memoryless conjugate gradient strategy. The approach demonstrated a steeply sloped search direction at each iteration…
APA, Harvard, Vancouver, ISO, and other styles
37

Tian, Qi, Xiaoliang Wang, Liping Pang, Mingkun Zhang, and Fanyun Meng. "A New Hybrid Three-Term Conjugate Gradient Algorithm for Large-Scale Unconstrained Problems." Mathematics 9, no. 12 (2021): 1353. http://dx.doi.org/10.3390/math9121353.

Full text
Abstract:
Three-term conjugate gradient methods have attracted much attention for large-scale unconstrained problems in recent years, since they have attractive practical factors such as simple computation, low memory requirement, better descent property and strong global convergence property. In this paper, a hybrid three-term conjugate gradient algorithm is proposed and it owns a sufficient descent property, independent of any line search technique. Under some mild conditions, the proposed method is globally convergent for uniformly convex objective functions. Meanwhile, by using the modified secant equation…
APA, Harvard, Vancouver, ISO, and other styles
38

Zaydan B. Mohammed, Nazar K. Hussein, and Zeyad M. Abdullah. "A modified three-term conjugate gradient method for large-scale optimization." Tikrit Journal of Pure Science 25, no. 3 (2020): 116–20. http://dx.doi.org/10.25130/tjps.v25i3.258.

Full text
Abstract:
We propose a three-term conjugate gradient method in this paper. The basic idea is to exploit the good properties of the BFGS update; quasi-Newton methods offer good, efficient numerical computation, so we suggested basing the method on the BFGS update. The descent condition and global convergence are proven under the Wolfe condition. The new algorithm is very effective for solving large-scale unconstrained optimization problems.
APA, Harvard, Vancouver, ISO, and other styles
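For reference, the BFGS update whose properties the abstract above exploits is, in its standard form,

\[
B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad s_k = x_{k+1} - x_k, \; y_k = g_{k+1} - g_k,
\]

and memoryless variants of this update are a common source of three-term CG directions.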
39

Abdullah, Zeyad M., and Songul A. Asker. "A New Scaled Three-Term Conjugate Gradient Algorithms For Unconstrained Optimization." Tikrit Journal of Pure Science 28, no. 4 (2023): 103–10. http://dx.doi.org/10.25130/tjps.v28i4.1534.

Full text
Abstract:
Since optimization problems are getting more complicated, new ways to solve them must be devised, or existing methods must be improved. In this research, we expand the different parameters of the three-term conjugate gradient method to solve unconstrained optimization problems. Our new CG approach meets the conditions of sufficient descent and global convergence. In addition, we present some numerical results with comparisons to relevant methods in the existing research literature.
APA, Harvard, Vancouver, ISO, and other styles
40

Hou, Yuqing, Zijian Tang, Huangjian Yi, Hongbo Guo, Jingjing Yu, and Xiaowei He. "Three-term conjugate gradient method for X-ray luminescence computed tomography." Journal of the Optical Society of America A 38, no. 7 (2021): 985. http://dx.doi.org/10.1364/josaa.423149.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Idalisa, Nur. "Computation and Performance Analysis of Existing Three Term Conjugate Gradient Method." International Journal of Advanced Trends in Computer Science and Engineering 8, no. 1.5 (2019): 91–95. http://dx.doi.org/10.30534/ijatcse/2019/1981.52019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Khalil K. Abbo, Linda, et al. "New Three-Term Conjugate Gradient Method for Solving Unconstrained Optimization Problems." International Journal of Mathematics and Computer Applications Research 7, no. 4 (2017): 23–32. http://dx.doi.org/10.24247/ijmcaraug20173.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

刘, 蕾. "A Stochastic Three-Term Conjugate Gradient Method for Unconstrained Optimization Problems." Advances in Applied Mathematics 11, no. 07 (2022): 4248–67. http://dx.doi.org/10.12677/aam.2022.117452.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Liu, Jinkui, and Xuesha Wu. "New three-term conjugate gradient method for solving unconstrained optimization problems." ScienceAsia 40, no. 4 (2014): 295. http://dx.doi.org/10.2306/scienceasia1513-1874.2014.40.295.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

廖, 若沙. "A New Three-Term Conjugate Gradient Method for Solving Nonlinear Equations." Advances in Applied Mathematics 08, no. 05 (2019): 869–75. http://dx.doi.org/10.12677/aam.2019.85097.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Liu, J. K., Y. X. Zhao, and X. L. Wu. "Some three-term conjugate gradient methods with the new direction structure." Applied Numerical Mathematics 150 (April 2020): 433–43. http://dx.doi.org/10.1016/j.apnum.2019.10.011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Babaie-Kafaki, S., and R. Ghanbari. "An extended three-term conjugate gradient method with sufficient descent property." Miskolc Mathematical Notes 16, no. 1 (2015): 45. http://dx.doi.org/10.18514/mmn.2015.1266.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Zhang, Li, Weijun Zhou, and Donghui Li. "Some descent three-term conjugate gradient methods and their global convergence." Optimization Methods and Software 22, no. 4 (2007): 697–711. http://dx.doi.org/10.1080/10556780701223293.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Babaie-Kafaki, Saman, and Reza Ghanbari. "Two modified three-term conjugate gradient methods with sufficient descent property." Optimization Letters 8, no. 8 (2014): 2285–97. http://dx.doi.org/10.1007/s11590-014-0736-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Babaie-Kafaki, Saman. "A modified three–term conjugate gradient method with sufficient descent property." Applied Mathematics-A Journal of Chinese Universities 30, no. 3 (2015): 263–72. http://dx.doi.org/10.1007/s11766-015-3276-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles