Academic literature on the topic 'Fletcher Reeves conjugate gradient (CGFR)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Fletcher Reeves conjugate gradient (CGFR).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Fletcher Reeves conjugate gradient (CGFR)"

1

Lyn Dee, Goh, Norhisham Bakhary, Azlan Abdul Rahman, and Baderul Hisham Ahmad. "A Comparison of Artificial Neural Network Learning Algorithms for Vibration-Based Damage Detection." Advanced Materials Research 163-167 (December 2010): 2756–60. http://dx.doi.org/10.4028/www.scientific.net/amr.163-167.2756.

Abstract:
This paper investigates the performance of Artificial Neural Network (ANN) learning algorithms for vibration-based damage detection. The capabilities of six different learning algorithms in detecting damage are studied and their performances are compared. The algorithms are Levenberg-Marquardt (LM), Resilient Backpropagation (RP), Scaled Conjugate Gradient (SCG), Conjugate Gradient with Powell-Beale Restarts (CGB), Polak-Ribiere Conjugate Gradient (CGP) and Fletcher-Reeves Conjugate Gradient (CGF) algorithms. The performances of these algorithms are assessed based on their generalisation capability in relating the vibration parameters (frequencies and mode shapes) with damage locations and severities under various numbers of input and output variables. The results show that Levenberg-Marquardt algorithm provides the best generalisation performance.
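For readers comparing these training algorithms, the Fletcher–Reeves update that gives CGF its name can be sketched in a few lines. The snippet below is a minimal illustration only, using a backtracking (Armijo) line search for simplicity rather than the Wolfe line search assumed by most of the convergence results cited on this page; the function names and the restart safeguard are illustrative choices, not taken from any of the papers listed.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear conjugate gradient with the Fletcher-Reeves update
    beta_k = ||g_{k+1}||^2 / ||g_k||^2, paired with a simple
    backtracking (Armijo) line search. Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                     # safeguard: restart if d is
            d = -g                            # not a descent direction
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5                      # backtrack until Armijo holds
        x_next = x + alpha * d
        g_next = grad(x_next)
        beta = g_next.dot(g_next) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_next + beta * d                # new search direction
        x, g = x_next, g_next
    return x

# Convex quadratic f(x) = 0.5 x'Ax - b'x with minimizer A^{-1}b = (0, 1)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = fletcher_reeves_cg(f, grad, [0.0, 0.0])
```

On a well-conditioned quadratic like this, the iterates approach the minimizer (0, 1) quickly; the papers above study exactly when such convergence is guaranteed for general functions.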
2

Mazloom, Mohammad Sadegh, Farzaneh Rezaei, Abdolhossein Hemmati-Sarapardeh, Maen M. Husein, Sohrab Zendehboudi, and Amin Bemani. "Artificial Intelligence Based Methods for Asphaltenes Adsorption by Nanocomposites: Application of Group Method of Data Handling, Least Squares Support Vector Machine, and Artificial Neural Networks." Nanomaterials 10, no. 5 (2020): 890. http://dx.doi.org/10.3390/nano10050890.

Abstract:
Asphaltene deposition is considered a serious production problem. The literature does not include enough comprehensive studies on the adsorption phenomenon involved in asphaltene deposition utilizing inhibitors. In addition, effective protocols on handling asphaltene deposition are still lacking. In this study, three efficient artificial intelligence models including group method of data handling (GMDH), least squares support vector machine (LSSVM), and artificial neural network (ANN) are proposed for estimating asphaltene adsorption onto NiO/SAPO-5, NiO/ZSM-5, and NiO/AlPO-5 nanocomposites based on a databank of 252 points. Variables influencing asphaltene adsorption include pH, temperature, amount of nanocomposites over asphaltene initial concentration (D/C0), and nanocomposite characteristics such as BET surface area and volume of micropores. The models are also optimized using nine optimization techniques, namely coupled simulated annealing (CSA), genetic algorithm (GA), Bayesian regularization (BR), scaled conjugate gradient (SCG), ant colony optimization (ACO), Levenberg–Marquardt (LM), imperialistic competitive algorithm (ICA), conjugate gradient with Fletcher-Reeves updates (CGF), and particle swarm optimization (PSO). According to the statistical analysis, the proposed RBF-ACO and LSSVM-CSA are the most accurate approaches that can predict asphaltene adsorption with average absolute percent relative errors of 0.892% and 0.94%, respectively. The sensitivity analysis shows that temperature has the most impact on asphaltene adsorption from model oil solutions.
3

Zhu, Hongfei, Jorge Leandro, and Qing Lin. "Optimization of Artificial Neural Network (ANN) for Maximum Flood Inundation Forecasts." Water 13, no. 16 (2021): 2252. http://dx.doi.org/10.3390/w13162252.

Abstract:
Flooding is the world’s most catastrophic natural event in terms of losses. The ability to forecast flood events is crucial for controlling the risk of flooding to society and the environment. Artificial neural networks (ANN) have been adopted in recent studies to provide fast flood inundation forecasts. In this paper, an existing ANN trained on synthetic events was optimized in two directions: extending the training dataset with a hybrid dataset, and selecting the best training function among six candidates, namely conjugate gradient backpropagation with Fletcher–Reeves updates (CGF), with Polak–Ribière updates (CGP), and with Powell–Beale restarts (CGB), one-step secant backpropagation (OSS), resilient backpropagation (RP), and scaled conjugate gradient backpropagation (SCG). Four real flood events were used to validate the performance of the improved ANN over the existing one. The new training dataset reduced the model’s root mean square error (RMSE) by 10% for the testing dataset and 16% for the real events. The selection of the resilient backpropagation algorithm contributed to a 15% lower RMSE for the testing dataset and up to 35% for the real events when compared with the other five training functions.
4

Djordjevic, Snezana. "New hybrid conjugate gradient method as a convex combination of FR and PRP methods." Filomat 30, no. 11 (2016): 3083–100. http://dx.doi.org/10.2298/fil1611083d.

Abstract:
We consider a new hybrid conjugate gradient algorithm, which is obtained from the algorithm of Fletcher-Reeves and the algorithm of Polak-Ribière-Polyak. Numerical comparisons show that the present hybrid conjugate gradient algorithm often behaves better than some known algorithms.
5

Kaelo, Pro, Sindhu Narayanan, and M. V. Thuto. "A modified quadratic hybridization of Polak-Ribiere-Polyak and Fletcher-Reeves conjugate gradient method for unconstrained optimization problems." An International Journal of Optimization and Control: Theories & Applications (IJOCTA) 7, no. 2 (2017): 177–85. http://dx.doi.org/10.11121/ijocta.01.2017.00339.

Abstract:
This article presents a modified quadratic hybridization of the Polak–Ribiere–Polyak and Fletcher–Reeves conjugate gradient method for solving unconstrained optimization problems. Global convergence, with the strong Wolfe line search conditions, of the proposed quadratic hybrid conjugate gradient method is established. We also report some numerical results to show the competitiveness of the new hybrid method.
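For context on what PRP/FR hybrids look like, the snippet below contrasts the two conjugate gradient coefficients and shows one classical clamp-style hybridization (in the spirit of Gilbert and Nocedal). This is an illustrative sketch only, not the quadratic hybridization proposed in the paper above; the function names are assumptions for this example.

```python
import numpy as np

def beta_fr(g_new, g_old):
    # Fletcher-Reeves: ratio of squared gradient norms
    return g_new.dot(g_new) / g_old.dot(g_old)

def beta_prp(g_new, g_old):
    # Polak-Ribiere-Polyak: uses the gradient change y = g_new - g_old,
    # so it automatically "restarts" (beta ~ 0) when progress stalls
    return g_new.dot(g_new - g_old) / g_old.dot(g_old)

def beta_hybrid(g_new, g_old):
    # Classical clamp hybrid: keep PRP's restart behaviour while
    # bounding |beta| by the FR value, which is what the FR global
    # convergence theory controls. Illustrative only; the paper
    # above proposes a quadratic hybridization instead.
    fr = beta_fr(g_new, g_old)
    prp = beta_prp(g_new, g_old)
    return max(-fr, min(prp, fr))
```

When consecutive gradients are nearly identical, the PRP (and hence hybrid) coefficient collapses toward zero, effectively restarting the method along the steepest-descent direction.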
6

Wang, C. Y., and M. X. Li. "Convergence property of the Fletcher-Reeves conjugate gradient method with errors." Journal of Industrial & Management Optimization 1, no. 2 (2005): 193–200. http://dx.doi.org/10.3934/jimo.2005.1.193.

7

ZENG, MEILAN, and GUANGHUI ZHOU. "A MODIFIED FR CONJUGATE GRADIENT METHOD FOR COMPUTING Z-EIGENPAIRS OF SYMMETRIC TENSORS." Bulletin of the Australian Mathematical Society 94, no. 3 (2016): 411–20. http://dx.doi.org/10.1017/s0004972716000381.

Abstract:
This paper proposes improvements to the modified Fletcher–Reeves conjugate gradient method (FR-CGM) for computing $Z$-eigenpairs of symmetric tensors. The FR-CGM does not need to compute the exact gradient and Jacobian. The global convergence of this method is established. We also test other conjugate gradient methods such as the modified Polak–Ribière–Polyak conjugate gradient method (PRP-CGM) and shifted power method (SS-HOPM). Numerical experiments of FR-CGM, PRP-CGM and SS-HOPM show the efficiency of the proposed method for finding $Z$-eigenpairs of symmetric tensors.
8

Pang, Deyan, Shouqiang Du, and Jingjie Ju. "The smoothing Fletcher-Reeves conjugate gradient method for solving finite minimax problems." ScienceAsia 42, no. 1 (2016): 40. http://dx.doi.org/10.2306/scienceasia1513-1874.2016.42.040.

9

Alshorman, Omar, Mustafa Mamat, Ahmad Alhawarat, and Mohd Revaie. "A modifications of conjugate gradient method for unconstrained optimization problems." International Journal of Engineering & Technology 7, no. 2.14 (2018): 21. http://dx.doi.org/10.14419/ijet.v7i2.14.11146.

Abstract:
The Conjugate Gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several studies have been recently devoted to improving and modifying these methods in relation to efficiency and robustness. In this paper, a new parameter of the CG method is proposed. The new parameter possesses global convergence properties under the Strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust compared with the Polak-Ribière-Polyak (PRP), Fletcher-Reeves (FR) and Wei, Yao, and Liu (WYL) parameters.
10

Sellami, Badreddine, and Mohamed Chiheb Eddine Sellami. "Global convergence of a modified Fletcher–Reeves conjugate gradient method with Wolfe line search." Asian-European Journal of Mathematics 13, no. 04 (2019): 2050081. http://dx.doi.org/10.1142/s1793557120500813.

Abstract:
In this paper, we are concerned with conjugate gradient methods for solving unconstrained optimization problems. We propose a modified Fletcher–Reeves (abbreviated FR) [Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154] conjugate gradient algorithm satisfying a parametrized sufficient descent condition with a parameter [Formula: see text]. The parameter [Formula: see text] is computed by means of the conjugacy condition, yielding an algorithm that is a positive multiplicative modification of the Hestenes–Stiefel (abbreviated HS) [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 48 (1952) 409–436] algorithm and that produces a descent search direction at every iteration at which the line search satisfies the Wolfe conditions. Under appropriate conditions, we show that the modified FR method with the strong Wolfe line search is globally convergent for uniformly convex functions. We also present extensive preliminary numerical experiments to show the efficiency of the proposed method.