
Journal articles on the topic 'Unconstraint optimization'

Consult the top 50 journal articles for your research on the topic 'Unconstraint optimization.'

1

Hatamlou, Abdolreza. "Numerical Optimization Using the Heart Algorithm." International Journal of Applied Evolutionary Computation 9, no. 2 (2018): 33–37. http://dx.doi.org/10.4018/ijaec.2018040103.

Full text
Abstract:
In this article the authors investigate the application of the heart algorithm to solving unconstrained numerical optimization problems. The heart algorithm is a novel optimization algorithm that mimics the function of the heart and the circulatory system in humans. It starts with a number of candidate solutions for the given problem and uses contraction and expansion actions to move the candidates through the search space in order to find an optimal solution. The applicability and performance of the heart algorithm for solving unconstrained optimization problems has been tested using sever
APA, Harvard, Vancouver, ISO, and other styles
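The abstract above describes the method only at a high level: a population of candidate solutions moved by contraction and expansion actions toward better points. The following is a minimal, hypothetical sketch of that kind of loop, not the paper's actual method; the objective, parameters, and move rules (sphere, heart_like_search, the 50/50 contraction/expansion choice) are illustrative assumptions.

```python
import random

def sphere(x):
    """Placeholder objective; the paper uses standard benchmark functions."""
    return sum(xi * xi for xi in x)

def heart_like_search(f, dim=5, pop_size=20, iters=200,
                      lower=-5.0, upper=5.0, seed=0):
    """Generic contraction/expansion loop in the spirit of the abstract:
    'contraction' pulls candidates toward the current best solution,
    'expansion' pushes them outward to keep exploring."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lower, upper) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)
    for t in range(iters):
        rate = 1.0 - t / iters          # shrink the expansion step over time
        new_pop = []
        for x in pop:
            moved = []
            for xi, bi in zip(x, best):
                if rng.random() < 0.5:   # contraction toward the best candidate
                    step = rng.random() * (bi - xi)
                else:                    # expansion away from it
                    step = -rate * rng.random() * (bi - xi)
                moved.append(min(max(xi + step, lower), upper))
            new_pop.append(moved if f(moved) < f(x) else x)  # greedy selection
        pop = new_pop
        best = min(pop + [best], key=f)
    return best, f(best)

if __name__ == "__main__":
    sol, val = heart_like_search(sphere)
    print(round(val, 6))
```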
2

YU, GUOLIN. "OPTIMALITY OF GLOBAL PROPER EFFICIENCY FOR CONE-ARCWISE CONNECTED SET-VALUED OPTIMIZATION USING CONTINGENT EPIDERIVATIVE." Asia-Pacific Journal of Operational Research 30, no. 03 (2013): 1340004. http://dx.doi.org/10.1142/s0217595913400046.

Full text
Abstract:
This note deals with the optimality conditions of a set-valued unconstrained optimization problem in real normed linear spaces. Based on the concept of the contingent epiderivative, unified necessary and sufficient optimality conditions for global proper efficiency in a vector optimization problem involving a cone-arcwise connected set-valued mapping are presented.
APA, Harvard, Vancouver, ISO, and other styles
3

Prajapati, Raju, and Om Prakash Dubey. "ANALYSING THE IMPACT OF PENALTY CONSTANT ON PENALTY FUNCTION THROUGH PARTICE SWARM OPTIMIZATION." International Journal of Students' Research in Technology & Management 6, no. 2 (2018): 01–06. http://dx.doi.org/10.18510/ijsrtm.2018.621.

Full text
Abstract:
Nonlinear programming problems (NLPP) are tedious to solve compared to linear programming problems (LPP). The present paper analyzes the impact of the penalty constant on the penalty function, which is used to solve NLPPs with inequality constraints. An improved version of the well-known metaheuristic Particle Swarm Optimization (PSO) is used for this purpose, and the Scilab programming language is used for computation. The impact of the penalty constant is studied on five test problems. Different values of the penalty constant are taken to prepare the unconstrained NL
APA, Harvard, Vancouver, ISO, and other styles
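The transformation studied in the entry above, turning an inequality-constrained problem into an unconstrained one by adding a penalty term weighted by a penalty constant, can be sketched as follows. This is a generic exterior quadratic penalty, not necessarily the paper's exact penalty function, and the toy problem, grid search, and names (penalized, r) are illustrative stand-ins for the paper's PSO solver.

```python
def penalized(f, constraints, r):
    """Exterior penalty transform: each g(x) <= 0 is an inequality constraint,
    and r is the penalty constant whose impact the paper studies."""
    def f_pen(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + r * violation
    return f_pen

# Toy problem: minimize (x - 2)^2 subject to x <= 1, for several penalty constants.
f = lambda x: (x[0] - 2.0) ** 2
g = lambda x: x[0] - 1.0          # g(x) <= 0  <=>  x <= 1

for r in (1.0, 10.0, 100.0):
    f_pen = penalized(f, [g], r)
    # crude grid search stands in for the PSO solver used in the paper
    x_best = min(([x * 0.001] for x in range(-3000, 3001)), key=f_pen)
    print(r, round(x_best[0], 3))
```

As the penalty constant grows, the unconstrained minimizer moves toward the constrained optimum at x = 1, which is exactly the sensitivity the paper examines.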
4

Vo, Duc Thinh, and Ngoc Cam Huynh. "Subdifferentials with degrees of freedom and applications to optimization problems." Dong Thap University Journal of Science 14, no. 5 (2024): 12–19. https://doi.org/10.52714/dthu.14.5.2025.1401.

Full text
Abstract:
In this work, we first present a new class of generalized differentials, namely subdifferentials with degrees of freedom, as well as their applications to nonsmooth optimization problems. We then establish some computation rules for subdifferentials with degrees of freedom of functions under basic qualification conditions. Using these computation rules, we provide necessary and sufficient conditions for unconstrained optimization problems and for optimization problems with geometric constraints.
APA, Harvard, Vancouver, ISO, and other styles
5

S. Younis, Maha, and Basma Tareq. "A New Hybrid Conjugates Gradient Algorithm for Unconstraint Optimization Problems." International Journal of Engineering Technology and Natural Sciences 4, no. 1 (2022): 81–94. http://dx.doi.org/10.46923/ijets.v4i1.148.

Full text
Abstract:
In this paper, we present a new hybrid conjugate gradient strategy that is both efficient and effective for solving unconstrained optimization problems. The parameter is derived from a convex combination of two conjugate gradient methods. We show that this strategy is globally convergent under strong Wolfe line search conditions and that the proposed hybrid CG method generates a descent search direction at each iteration. Numerical results are presented, demonstrating that the proposed technique is both efficient and promising.
APA, Harvard, Vancouver, ISO, and other styles
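As a concrete illustration of the hybrid construction described above (a convex combination of two conjugate gradient parameters driving the search direction), here is a minimal sketch. The two parameters shown, Fletcher-Reeves and Polak-Ribiere, are common stand-ins and not necessarily the pair combined in this paper; the function name and signature are illustrative.

```python
def hybrid_cg_direction(g_new, g_old, d_old, theta):
    """One hybrid CG direction update: beta is a convex combination (weight
    theta in [0, 1]) of two classical CG parameters, here FR and PRP."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    beta_fr = dot(g_new, g_new) / dot(g_old, g_old)
    y = [gn - go for gn, go in zip(g_new, g_old)]          # gradient change
    beta_pr = dot(g_new, y) / dot(g_old, g_old)
    beta = (1.0 - theta) * beta_pr + theta * beta_fr
    return [-gn + beta * dn for gn, dn in zip(g_new, d_old)]

# Example: gradients and previous direction in R^2, equal weighting.
print(hybrid_cg_direction([0.2, -0.1], [1.0, 0.5], [-1.0, -0.5], theta=0.5))
```

In practice the step length along this direction would be chosen by a strong Wolfe line search, as the abstract states.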
6

Jiang, Zhi Xia, Pin Chao Meng, Yan Zhong Li, and Wei Shi Yin. "Collaborative Optimization Algorithm Based on the Penalty Function." Applied Mechanics and Materials 538 (April 2014): 447–50. http://dx.doi.org/10.4028/www.scientific.net/amm.538.447.

Full text
Abstract:
The paper discusses collaborative optimization problems with bounds. Based on the penalty function, the system-level optimization is converted into an unconstrained program. For the discipline-level optimization, normalized weighting coefficients are used in combination with relaxation factors. The relaxation factor expands the feasible region and, where possible, keeps the iterations of the calculation inside the feasible region. The data show that the algorithm widens the choice of initial points while retaining high calculation accuracy and better algorithm stability.
APA, Harvard, Vancouver, ISO, and other styles
7

Solaimani, S., and P. Arul. "Unconstraint Optimal Power Flow using Improved Cuckoo Search Algorithm." International Journal of Advance Research and Innovation 5, no. 2 (2017): 102–7. http://dx.doi.org/10.51976/ijari.521718.

Full text
Abstract:
This paper presents an efficient and reliable swarm-intelligence-based, bio-inspired approach to solving the unconstrained optimal power flow (OPF) problem. The approach employs a nature-inspired metaheuristic optimization algorithm, the improved cuckoo search algorithm, to determine the optimal settings of the control variables. The performance of the improved cuckoo search algorithm (ICS) is examined and tested on the IEEE 30-bus test system with fuel-cost minimization as the objective function. The solution is implemented in MATLAB.
APA, Harvard, Vancouver, ISO, and other styles
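The abstract does not spell out what makes the cuckoo search "improved", so the following only sketches the Lévy-flight step at the core of standard cuckoo search, which is the move that distinguishes it from a plain random walk. The function name and the Mantegna-style sampling scheme are assumptions, not taken from the paper.

```python
import math, random

def levy_step(beta=1.5, rng=random.Random(0)):
    """One Mantegna-style Levy-flight step with stability index beta;
    cuckoo search uses such heavy-tailed steps to propose new nests."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.gauss(0.0, sigma)
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

# A new candidate is typically proposed as x_new = x + step_size * levy_step() * (x - best).
print([round(levy_step(), 4) for _ in range(3)])
```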
8

Hassan, Basim A. "A New Hybrid Conjugate Gradient Method with Guaranteed Descent for Unconstraint Optimization." Al-Mustansiriyah Journal of Science 28, no. 3 (2018): 193. http://dx.doi.org/10.23851/mjs.v28i3.114.

Full text
Abstract:
The conjugate gradient method is an efficient technique for solving the unconstrained optimization problem. In this paper, we propose a new hybrid nonlinear conjugate gradient method, which has the descent property at every iteration and global convergence under certain conditions. The numerical results show that the new hybrid method is efficient for the given test problems.
APA, Harvard, Vancouver, ISO, and other styles
9

Hady, Mohammed M. Abdel, and Maha S. Younis. "New Parameter of CG-Method with Exact Line Search for Unconstraint Optimization." OALib 07, no. 04 (2020): 1–8. http://dx.doi.org/10.4236/oalib.1106236.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bayati. "New Scaled Sufficient Descent Conjugate Gradient Algorithm for Solving Unconstraint Optimization Problems." Journal of Computer Science 6, no. 5 (2010): 511–18. http://dx.doi.org/10.3844/jcssp.2010.511.518.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Ali, A. Al-Arbo, and Z. Al-Kawaz Rana. "Implementation of combined new optimal cuckoo algorithm with a gray wolf algorithm to solve unconstrained optimization nonlinear problems." Indonesian Journal of Electrical Engineering and Computer Science 19, no. 3 (2020): 1582–89. https://doi.org/10.11591/ijeecs.v19.i3.pp1582-1589.

Full text
Abstract:
In this article, a combined optimization algorithm is proposed which merges the optimal adaptive cuckoo search algorithm (OACS), a nature-inspired algorithm, with the grey wolf optimizer algorithm (GWO). The cuckoo algorithm on its own may fail to find the local minimum point and may fail to reach the solution because of its slow convergence. The proposed adaptive combined algorithm therefore gives a strong improvement when used to reach the minimum point on (12) nonlinear test problems. This is suitable to solve a large
APA, Harvard, Vancouver, ISO, and other styles
12

Taishan, Yan, and Cui Duwu. "Research on the Algorithm for Solving Unconstraint Optimization Problems Utilizing Knowledge Evolution Principle." Information Technology Journal 9, no. 2 (2010): 343–48. http://dx.doi.org/10.3923/itj.2010.343.348.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Al-Arabo, Ali Abbas, and Rana Zaidan Alkawaz. "Implementation of combined new optimal cuckoo algorithm with a gray wolf algorithm to solve unconstrained optimization nonlinear problems." Indonesian Journal of Electrical Engineering and Computer Science 19, no. 3 (2020): 1582. http://dx.doi.org/10.11591/ijeecs.v19.i3.pp1582-1589.

Full text
Abstract:
In this article, a combined optimization algorithm is proposed which merges the optimal adaptive cuckoo search algorithm (OACS), a nature-inspired algorithm, with the grey wolf optimizer algorithm (GWO). The cuckoo algorithm on its own may fail to find the local minimum point and may fail to reach the solution because of its slow convergence. The proposed adaptive combined algorithm therefore gives a strong improvement when used to reach the minimum point on (23) nonlinear test problems. This is suitable to solve a
APA, Harvard, Vancouver, ISO, and other styles
14

Wang, Xing Bin. "The Simulator Research of the Traffic Network Particle Swarm Optimization Based on Wireless Sensor." Advanced Materials Research 926-930 (May 2014): 2638–41. http://dx.doi.org/10.4028/www.scientific.net/amr.926-930.2638.

Full text
Abstract:
To solve the problem of optimizing node deployment of a wireless sensor network for urban traffic information acquisition, a constrained optimization model for wireless sensor network node deployment is proposed. Both a comprehensive evaluation function for connectivity and coverage and restrictions reflecting the practical demands on connectivity and coverage are used. The constrained optimization model is converted to an unconstrained one using a penalty function, and the particle swarm optimization algorithm is used to solve the problem. A dynamically changing weight method is used as an improved algor
APA, Harvard, Vancouver, ISO, and other styles
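The entry above combines a penalty reformulation (as sketched earlier in this list) with particle swarm optimization using a "dynamically changing weight". A common reading of that phrase is a linearly decreasing inertia weight; the sketch below assumes exactly that, so the schedule, parameter values, and function name are illustrative rather than the paper's.

```python
import random

def pso_min(f, dim=2, particles=30, iters=300, w_max=0.9, w_min=0.4,
            c1=2.0, c2=2.0, lo=-10.0, hi=10.0, seed=1):
    """Plain PSO with a linearly decreasing inertia weight (assumed form of
    the 'dynamically changing weight' mentioned in the abstract)."""
    rng = random.Random(seed)
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    v = [[0.0] * dim for _ in range(particles)]
    pbest = [xi[:] for xi in x]
    gbest = min(pbest, key=f)
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters   # inertia shrinks over time
        for i in range(particles):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] = min(max(x[i][d] + v[i][d], lo), hi)
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i][:]
        gbest = min(pbest + [gbest], key=f)
    return gbest, f(gbest)

if __name__ == "__main__":
    sphere = lambda x: sum(xi * xi for xi in x)
    print(round(pso_min(sphere)[1], 8))
```

In the paper's setting, f would be the penalized deployment objective rather than a benchmark function.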
15

Fathy, Basma T., and Maha S. Younis. "Global Convergence Analysis of a new Hybrid Conjugate Gradient Method for Unconstraint Optimization Problems." Journal of Physics: Conference Series 2322, no. 1 (2022): 012063. http://dx.doi.org/10.1088/1742-6596/2322/1/012063.

Full text
Abstract:
In this study, we introduce a novel hybrid conjugate gradient (CG) method to solve unconstrained optimization problems efficiently and effectively. The parameter θ_k defines a convex combination of the conjugate gradient parameters β_k^{BA2} and β_k^{FR}. Under strong Wolfe line search conditions (SWC), we show that this method is globally convergent and that the proposed hybrid CG method generates a descent search direction at each iteration. The numerical results presented in this paper demonstrate that the proposed strategy is both effective and promising.
APA, Harvard, Vancouver, ISO, and other styles
16

Haltmeier, Markus, Housen Li, and Axel Munk. "A Variational View on Statistical Multiscale Estimation." Annual Review of Statistics and Its Application 9, no. 1 (2022): 343–72. http://dx.doi.org/10.1146/annurev-statistics-040120-030531.

Full text
Abstract:
We present a unifying view on various statistical estimation techniques including penalization, variational, and thresholding methods. These estimators are analyzed in the context of statistical linear inverse problems including nonparametric and change point regression, and high-dimensional linear models as examples. Our approach reveals many seemingly unrelated estimation schemes as special instances of a general class of variational multiscale estimators, called MIND (multiscale Nemirovskii–Dantzig). These estimators result from minimizing certain regularization functionals under convex con
APA, Harvard, Vancouver, ISO, and other styles
17

Khan, Taimur, Muhammad Khattak, and Adnan Tariq. "Radiation Pattern Synthesis and Mutual Coupling Compensation in Spherical Conformal Array Antennas." Applied Computational Electromagnetics Society 36, no. 6 (2021): 707–17. http://dx.doi.org/10.47037/2020.aces.j.360612.

Full text
Abstract:
This paper presents a novel technique based on the Hybrid Spatial Distance Reduction Algorithm (HSDRA) to compensate for the effects of deformity and mutual coupling that occur due to surface change in conformal arrays. Antenna surface deformation shifts the position of the null points and degrades the main beam, resulting in reduced antenna gain along with substantial undesirable effects on antenna performance. The proposed algorithm, which cumulatively incorporates the Linearly Constraint Least Square Optimization (LCLSO) and Quadratically Constraint Least Square Optimization (QCLSO) techniques, i
APA, Harvard, Vancouver, ISO, and other styles
18

Baba, Ishaq, Habshah Midi, Sohel Rana, and Gafurjan Ibragimov. "An Alternative Approach of Dual Response Surface Optimization Based on Penalty Function Method." Mathematical Problems in Engineering 2015 (2015): 1–6. http://dx.doi.org/10.1155/2015/450131.

Full text
Abstract:
The dual response surface approach for simultaneously optimizing the mean and variance models as separate functions suffers some deficiencies in handling the tradeoff between the bias and variance components of mean squared error (MSE). In this paper, the accuracy of the predicted response is given serious attention in determining the optimum setting conditions. We consider four different objective functions for the dual response surface optimization approach. The essence of the proposed method is to reduce the influence of the variance of the predicted response by minimizing the variability relati
APA, Harvard, Vancouver, ISO, and other styles
19

Shenglin Liu and Gongchao Wan. "Adaptive Customization of Electronic Commerce Packaging for Sustainable Business Development." Journal of Modern Social Sciences 2, no. 1 (2025): 56–64. https://doi.org/10.71113/jmss.v2i1.158.

Full text
Abstract:
To address the growing demand for sustainable practices in e-commerce logistics, this research explores the innovative application of the NSGA-II algorithm for customized packaging optimization in distribution. A novel unconstrained mixed-integer linear programming model was developed and integrated with the NSGA-II algorithm to optimize packaging design dimensions and material properties. The approach emphasizes flexibility, compressibility, and adaptability to achieve an optimal balance between resource efficiency and product protection. Through rigorous simulation experiments, t
APA, Harvard, Vancouver, ISO, and other styles
20

WANG, TAO, XIAOLIANG XING, and XINHUA ZHUANG. "CHARACTERIZING ONE-LAYER ASSOCIATIVE NEURAL NETWORKS WITH OPTIMAL NOISE-REDUCTION ABILITY." International Journal of Pattern Recognition and Artificial Intelligence 06, no. 05 (1992): 1009–25. http://dx.doi.org/10.1142/s0218001492000497.

Full text
Abstract:
In this paper, we describe an optimal learning algorithm for designing one-layer neural networks by means of global minimization. Taking the properties of a well-defined neural network into account, we derive a cost function that measures the goodness of the network quantitatively. The connection weights are determined by the gradient descent rule so as to minimize the cost function. The optimal learning algorithm is formulated as either an unconstrained or a constrained minimization problem. It ensures the realization of each desired associative mapping with the best noise reduction ability i
APA, Harvard, Vancouver, ISO, and other styles
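The weight-update rule named in the abstract above is plain gradient descent on a cost function of the connection weights. A minimal sketch follows; the cost here is a toy quadratic, since the paper's actual cost function is not given in the truncated abstract, and all names are illustrative.

```python
def gradient_descent(grad, w0, lr=0.1, iters=100):
    """Generic gradient-descent weight update w <- w - lr * grad(w), the rule
    the abstract uses to minimize its network cost function (cost assumed)."""
    w = list(w0)
    for _ in range(iters):
        g = grad(w)
        w = [wi - lr * gi for wi, gi in zip(w, g)]
    return w

# Toy quadratic cost ||w - t||^2 with target t = (1, -2); its gradient is 2(w - t).
target = (1.0, -2.0)
grad = lambda w: [2.0 * (wi - ti) for wi, ti in zip(w, target)]
print([round(wi, 4) for wi in gradient_descent(grad, [0.0, 0.0])])
```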
21

Fan, Shaohua, Shuyang Zhang, Xiao Wang, and Chuan Shi. "Directed Acyclic Graph Structure Learning from Dynamic Graphs." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (2023): 7512–21. http://dx.doi.org/10.1609/aaai.v37i6.25913.

Full text
Abstract:
Estimating the structure of directed acyclic graphs (DAGs) of features (variables) plays a vital role in revealing the latent data generation process and providing causal insights in various applications. Although there have been many studies on structure learning with various types of data, the structure learning on the dynamic graph has not been explored yet, and thus we study the learning problem of node feature generation mechanism on such ubiquitous dynamic graph data. In a dynamic graph, we propose to simultaneously estimate contemporaneous relationships and time-lagged interaction relat
APA, Harvard, Vancouver, ISO, and other styles
22

Narushima, Yasushi. "A NONMONOTONE MEMORY GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION." Journal of the Operations Research Society of Japan 50, no. 1 (2007): 31–45. http://dx.doi.org/10.15807/jorsj.50.31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Basheer M. Salih, Khalil K. Abbo, and Zeyad M. Abdullah. "Partial Davidon, Fletcher and Powell (DFP) of quasi newton method for unconstrained optimization." Tikrit Journal of Pure Science 21, no. 6 (2023): 180–86. http://dx.doi.org/10.25130/tjps.v21i6.1099.

Full text
Abstract:
Nonlinear quasi-Newton methods are widely used in unconstrained optimization. In this paper, we develop a new quasi-Newton method for solving unconstrained optimization problems. We consider one quasi-Newton update formula, the DFP update, and propose a partial DFP variant. Most quasi-Newton methods do not always generate descent search directions, so the descent or sufficient descent condition is usually assumed in the analysis and implementations. The descent property of the suggested method is proved. Finally, the numerical results show that the new method is also very efficient for
APA, Harvard, Vancouver, ISO, and other styles
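For reference alongside the entry above, the classical DFP update that the "partial DFP" method starts from can be sketched as follows. How the partial variant restricts or selects the update is not described in the truncated abstract, so only the standard formula is shown.

```python
import numpy as np

def dfp_inverse_update(H, s, y):
    """Classical DFP update of the inverse Hessian approximation H from the
    step s = x_{k+1} - x_k and the gradient change y = g_{k+1} - g_k. The
    'partial' variant studied in the paper applies the update selectively,
    which is not reproduced here."""
    s = s.reshape(-1, 1)
    y = y.reshape(-1, 1)
    Hy = H @ y
    return H - (Hy @ Hy.T) / (y.T @ Hy).item() + (s @ s.T) / (s.T @ y).item()

# Example: one update of the identity approximation in R^2.
print(dfp_inverse_update(np.eye(2), np.array([0.5, 0.1]), np.array([1.0, 0.4])))
```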
24

Basheer M. Salih, Khalil K. Abbo, and Zeyad M. Abdullah. "Partial Pearson-two (PP2) of quasi newton method for unconstrained optimization." Tikrit Journal of Pure Science 21, no. 3 (2023): 174–79. http://dx.doi.org/10.25130/tjps.v21i3.1012.

Full text
Abstract:
In this paper, we develop a new quasi-Newton method for solving unconstrained optimization problems. Nonlinear quasi-Newton methods are widely used in unconstrained optimization [1]. We consider one quasi-Newton update formula, the Pearson-two update [2], and propose a partial P2 variant. Most quasi-Newton methods do not always generate descent search directions, so the descent or sufficient descent condition is usually assumed in the analysis and implementations [3]. The descent property of the suggested method is proved. Finally, the numerical results show that the new method is also very e
APA, Harvard, Vancouver, ISO, and other styles
25

Yabe, Hiroshi, and Naoki Sakaiwa. "A NEW NONLINEAR CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION." Journal of the Operations Research Society of Japan 48, no. 4 (2005): 284–96. http://dx.doi.org/10.15807/jorsj.48.284.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Celik, Yuksel, and Erkan Ulker. "An Improved Marriage in Honey Bees Optimization Algorithm for Single Objective Unconstrained Optimization." Scientific World Journal 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/370172.

Full text
Abstract:
Marriage in honey bees optimization (MBO) is a metaheuristic optimization algorithm inspired by the mating and fertilization process of honey bees and is a kind of swarm intelligence optimization. In this study we propose an improved marriage in honey bees optimization (IMBO) obtained by adding a Lévy flight algorithm for the queen's mating flight and a neighborhood operator for improving the worker drones. The IMBO algorithm's performance and its success are tested on six well-known unconstrained test functions and compared with other metaheuristic optimization algorithms.
APA, Harvard, Vancouver, ISO, and other styles
27

He, Jianjia, and Fuyuan Xu. "Chaotic-Search-Based Cultural Algorithm for Solving Unconstrained Optimization Problem." Modelling and Simulation in Engineering 2011 (2011): 1–6. http://dx.doi.org/10.1155/2011/239743.

Full text
Abstract:
To address the premature convergence and instability of the cultural algorithm in function optimization, a chaos cultural algorithm (CCA) based on the cultural algorithm and chaos search optimization is proposed. The algorithm model consists of a chaos-based population space and a knowledge-storing belief space; it uses normative knowledge and situational knowledge for chaos search and chaos perturbation, respectively, effectively avoids the premature convergence of the cultural algorithm, and overcomes chaos search optimization's sensitivity to initial values and poor efficiency. Test results show that th
APA, Harvard, Vancouver, ISO, and other styles
28

A. Hassan, Basim, and Hawraz N. Jabbar. "A New Spectral on the Gradient Methods for Unconstrained Optimization Minimization Problem." Journal of Zankoy Sulaimani - Part A 22, no. 2 (2020): 217–24. http://dx.doi.org/10.17656/jzs.10822.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Xiong, Zhe, Xiao-Hui Li, Jing-Chang Liang, and Li-Juan Li. "A Multi-Objective Hybrid Algorithm for Optimization of Grid Structures." International Journal of Applied Mechanics 10, no. 01 (2018): 1850009. http://dx.doi.org/10.1142/s1758825118500096.

Full text
Abstract:
In this study, a novel multi-objective hybrid algorithm (MHGH, multi-objective HPSO-GA hybrid algorithm) is developed by crossing the heuristic particle swarm optimization (HPSO) algorithm with a genetic algorithm (GA) based on the concept of Pareto optimality. To demonstrate the effectiveness of the MHGH, the optimizations of four unconstrained mathematical functions and four constrained truss structural problems are tested and compared to the results using several other classic algorithms. The results show that the MHGH improves the convergence rate and precision of the particle swarm optimi
APA, Harvard, Vancouver, ISO, and other styles
30

Hassan, Basim A., Kanikar Muangchoo, Fadhil Alfarag, Abdulkarim Hassan Ibrahim, and Auwal Bala Abubakar. "An improved quasi-Newton equation on the quasi-Newton methods for unconstrained optimizations." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 2 (2021): 997–1005. https://doi.org/10.11591/ijeecs.v22.i2.pp997-1005.

Full text
Abstract:
Quasi-Newton methods are a class of numerical methods for solving the problem of unconstrained optimization. To improve the overall efficiency of the resulting algorithms, we use quasi-Newton methods built on a quasi-Newton equation. In this manuscript, we present a modified BFGS update formula based on a new quasi-Newton equation, which gives a new search direction for solving unconstrained optimization problems. We analyse the convergence rate of the quasi-Newton method under some mild conditions. Numerical experiments are conducted to demonstrate the efficiency of the new methods usi
APA, Harvard, Vancouver, ISO, and other styles
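For context on the entry above, the classical BFGS inverse-Hessian update that the modified formula builds on can be sketched as follows. The paper's modification replaces the gradient-change vector according to its new quasi-Newton equation; that correction is not reproduced here, so this is only the standard update.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Classical BFGS update of the inverse Hessian approximation H from the
    step s = x_{k+1} - x_k and the gradient change y = g_{k+1} - g_k."""
    s = s.reshape(-1, 1)
    y = y.reshape(-1, 1)
    rho = 1.0 / (y.T @ s).item()
    I = np.eye(H.shape[0])
    return (I - rho * s @ y.T) @ H @ (I - rho * y @ s.T) + rho * s @ s.T

# Example: one update of the identity approximation in R^2.
print(bfgs_inverse_update(np.eye(2), np.array([0.5, 0.1]), np.array([1.0, 0.4])))
```

The search direction at each iteration is then d = -H g, followed by a line search, which is the standard quasi-Newton framework the abstract refers to.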
31

Adil Hashmi. "Firefly Algorithm for Unconstrained Optimization." IOSR Journal of Computer Engineering 11, no. 1 (2013): 75–78. http://dx.doi.org/10.9790/0661-1117578.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Corradi, Gianfranco. "An algorithm for unconstrained optimization." International Journal of Computer Mathematics 45, no. 1-2 (1992): 123–31. http://dx.doi.org/10.1080/00207169208804122.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Demidenko, E. "Criteria for Unconstrained Global Optimization." Journal of Optimization Theory and Applications 136, no. 3 (2007): 375–95. http://dx.doi.org/10.1007/s10957-007-9298-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Kanzow, C. "Nonlinear complementarity as unconstrained optimization." Journal of Optimization Theory and Applications 88, no. 1 (1996): 139–55. http://dx.doi.org/10.1007/bf02192026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Riseth, Asbjørn Nilsen. "Objective acceleration for unconstrained optimization." Numerical Linear Algebra with Applications 26, no. 1 (2018): e2216. http://dx.doi.org/10.1002/nla.2216.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Yu, Xinghuo, Weixing Zheng, Baolin Wu, and Xin Yao. "A Novel Penalty Function Approach to Constrained Optimization Problems with Genetic Algorithms." Journal of Advanced Computational Intelligence and Intelligent Informatics 2, no. 6 (1998): 208–13. http://dx.doi.org/10.20965/jaciii.1998.p0208.

Full text
Abstract:
In this paper, a novel penalty function approach is proposed for constrained optimization problems with linear and nonlinear constraints. It is shown that by using a mapping function to "wrap" up the constraints, a constrained optimization problem can be converted to an unconstrained optimization problem. It is also proved mathematically that the best solution of the converted unconstrained optimization problem will approach the best solution of the constrained optimization problem if the tuning parameter for the wrapping function approaches zero. A tailored genetic algorithm incorporating an
APA, Harvard, Vancouver, ISO, and other styles
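The limiting property described in the entry above, that solutions of the converted unconstrained problem approach the constrained optimum as the tuning parameter goes to zero, takes the following form for a generic exterior penalty. The paper's specific "wrapping" map is not reproduced here, so this is only an assumed standard formulation.

```latex
% Generic penalty reformulation of  min f(x)  s.t.  g_i(x) <= 0,  i = 1,...,m,
% with tuning parameter mu > 0 (the paper's "wrapping" function is not shown):
\[
  x_\mu \in \arg\min_{x} \; f(x) + \frac{1}{\mu} \sum_{i=1}^{m} \bigl[\max\{0,\, g_i(x)\}\bigr]^{2},
  \qquad
  x_\mu \longrightarrow x^{*} \quad \text{as } \mu \to 0^{+},
\]
% where x^{*} is a solution of the original constrained problem
% (under standard regularity assumptions).
```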
37

Jabbar, Hawraz N., Ali A. Al-Arbo, Yoksal A. Laylani, Hisham M. Khudhur, and Basim Abbas Hassan. "New conjugate gradient methods based on the modified secant condition." Journal of Interdisciplinary Mathematics 28, no. 1 (2025): 19–30. https://doi.org/10.47974/jim-1771.

Full text
Abstract:
Conjugate gradient (CG) algorithms are an optimization method with fast convergence. To date, several CG methods have been developed to improve computational performance and have been used to solve unconstrained optimization problems. In this paper, a new CG algorithm based on the modified secant condition is proposed to solve unconstrained optimization problems. The proposed algorithm has the following properties: it achieves the sufficient descent property and the global convergence property. We used a number of test functions for large-scale unconstrained optimization problems, and the numbers showed
APA, Harvard, Vancouver, ISO, and other styles
38

Hassan, Basim Abbas, Kanikar Muangchoo, Fadhil Alfarag, Abdulkarim Hassan Ibrahim, and Auwal Bala Abubakar. "An improved quasi-Newton equation on the quasi-Newton methods for unconstrained optimizations." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 2 (2021): 997. http://dx.doi.org/10.11591/ijeecs.v22.i2.pp997-1005.

Full text
Abstract:
Quasi-Newton methods are a class of numerical methods for solving the problem of unconstrained optimization. To improve the overall efficiency of the resulting algorithms, we use quasi-Newton methods built on a quasi-Newton equation. In this manuscript, we present a modified BFGS update formula based on a new quasi-Newton equation, which gives a new search direction for solving unconstrained optimization problems. We analyse the convergence rate of the quasi-Newton method under some mild conditions. Numerical experiments are conducted to demonstr
APA, Harvard, Vancouver, ISO, and other styles
39

Ibrahim, Abdelmonem M., and Mohamed A. Tawhid. "A hybridization of differential evolution and monarch butterfly optimization for solving systems of nonlinear equations." Journal of Computational Design and Engineering 6, no. 3 (2018): 354–67. http://dx.doi.org/10.1016/j.jcde.2018.10.006.

Full text
Abstract:
In this study, we propose a new hybrid algorithm consisting of two metaheuristic algorithms: Differential Evolution (DE) and Monarch Butterfly Optimization (MBO). This hybrid is called DEMBO. Both metaheuristics are typically used to solve nonlinear systems and unconstrained optimization problems. DE is a common metaheuristic that searches large areas of the candidate space; unfortunately, it often requires a significant number of function evaluations to reach the optimal solution. As for MBO, it is known for its time-consuming fitness functions, but it
APA, Harvard, Vancouver, ISO, and other styles
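The DE side of the hybrid described above uses the standard mutation-and-crossover move; a minimal sketch of the common DE/rand/1/bin trial-vector construction follows. The MBO operators and the way the paper combines the two algorithms are not shown, and the parameter values are illustrative defaults.

```python
import random

def de_rand_1_bin(pop, i, F=0.5, CR=0.9, rng=random.Random(2)):
    """One DE/rand/1/bin trial vector for population member i: mutate with a
    scaled difference of two random members, then binomially cross over."""
    a, b, c = rng.sample([j for j in range(len(pop)) if j != i], 3)
    n = len(pop[i])
    mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(n)]
    j_rand = rng.randrange(n)          # at least one coordinate from the mutant
    return [mutant[d] if (rng.random() < CR or d == j_rand) else pop[i][d]
            for d in range(n)]

# Example with a four-member population in R^2.
pop = [[1.0, 2.0], [0.5, -1.0], [2.0, 0.0], [-1.5, 1.0]]
print(de_rand_1_bin(pop, 0))
```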
40

Bai, Bin, Zhi-wei Guo, Qi-liang Wu, Junyi Zhang, and Yan-chao Cui. "Application of the Improved PSO-Based Extended Domain Method in Engineering." Mathematical Problems in Engineering 2020 (September 7, 2020): 1–14. http://dx.doi.org/10.1155/2020/2846181.

Full text
Abstract:
The standard particle swarm optimization (PSO) algorithm handles only simple bound constraints on the variables and can hardly be applied directly to constrained optimization. Furthermore, the standard PSO algorithm often fails to obtain the global optimal solution when the dimensionality is high, even for unconstrained optimization. Thus, an improved PSO-based extended domain method (IPSO-EDM) is proposed to solve engineering optimization problems. The core idea of this method is that the original feasible region is expanded in the constrained optimization, which is transformed into the unconstraine
APA, Harvard, Vancouver, ISO, and other styles
41

Zhu, Yajie, and Mingchuan Zhang. "An Adaptive Variance Reduction Zeroth-Order Algorithm for Finite-Sum Optimization." Frontiers in Computing and Intelligent Systems 3, no. 3 (2023): 66–70. http://dx.doi.org/10.54097/fcis.v3i3.8568.

Full text
Abstract:
The unconstrained finite-sum optimization problem is a common type of problem in the field of optimization, and there is currently limited research on zeroth-order optimization algorithms for it. To solve unconstrained finite-sum optimization problems for non-convex functions, we propose a zeroth-order optimization algorithm with adaptive variance reduction, called ZO-AdaSPIDER for short. We then analyze the convergence performance of the algorithm. The theoretical results show that the ZO-AdaSPIDER algorithm converges to an ε-stationary point for non-convex functions, and an explicit convergence rate is established.
APA, Harvard, Vancouver, ISO, and other styles
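Zeroth-order methods like the one above estimate gradients from function values only. The sketch below shows a plain two-point random-direction gradient estimator; the SPIDER-style variance reduction and adaptive step size that define ZO-AdaSPIDER are not reproduced, and all names and parameter values are illustrative.

```python
import random

def zo_gradient(f, x, mu=1e-4, samples=10, rng=random.Random(3)):
    """Two-point random-direction gradient estimate: average over random
    Gaussian directions u of ((f(x + mu*u) - f(x)) / mu) * u."""
    n = len(x)
    g = [0.0] * n
    for _ in range(samples):
        u = [rng.gauss(0.0, 1.0) for _ in range(n)]
        x_plus = [xi + mu * ui for xi, ui in zip(x, u)]
        coeff = (f(x_plus) - f(x)) / mu
        g = [gi + coeff * ui / samples for gi, ui in zip(g, u)]
    return g

# Estimate of the gradient of ||x||^2 at (1, -1): roughly (2, -2) up to sampling noise.
print([round(g, 2) for g in zo_gradient(lambda x: sum(v * v for v in x),
                                        [1.0, -1.0], samples=2000)])
```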
42

Zeyad M. Abdullah, Hameed M. Sadeq, Hisham M. Azzam, and Mundher A. Khaleel. "Modified new conjugate gradient method for Unconstrained Optimization." Tikrit Journal of Pure Science 24, no. 5 (2019): 86–90. http://dx.doi.org/10.25130/tjps.v24i5.422.

Full text
Abstract:
The current paper presents a modified conjugate gradient method for solving unconstrained optimization problems. The convergence of the modified method is established under some hypotheses. The statistical results demonstrate that the modified method is efficient for solving unconstrained nonlinear optimization problems in comparison with the FR and HS methods.
APA, Harvard, Vancouver, ISO, and other styles
43

Chen, Yuan-Yuan, and Shou-Qiang Du. "Nonlinear Conjugate Gradient Methods with Wolfe Type Line Search." Abstract and Applied Analysis 2013 (2013): 1–5. http://dx.doi.org/10.1155/2013/742815.

Full text
Abstract:
The nonlinear conjugate gradient method is one of the useful methods for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with Wolfe-type line search for unconstrained optimization problems. Under some mild assumptions, global convergence results for the given methods are established. The numerical results show that the nonlinear conjugate gradient methods with Wolfe-type line search are efficient for some unconstrained optimization problems.
APA, Harvard, Vancouver, ISO, and other styles
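The Wolfe-type line search referred to in the entry above accepts a step length only if it satisfies a sufficient-decrease test and a curvature test. A minimal check of the weak Wolfe conditions is sketched below; the strong version used in many of the papers in this list replaces the curvature test with |g(x+αd)·d| ≤ c2 |g(x)·d|. The constants c1 and c2 are typical defaults, not values from any specific paper here.

```python
def wolfe_conditions(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Return True if step length alpha along direction d satisfies the
    (weak) Wolfe conditions: sufficient decrease and curvature."""
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * dot(grad(x), d)
    curvature = dot(grad(x_new), d) >= c2 * dot(grad(x), d)
    return sufficient_decrease and curvature

# Quadratic example: f(x) = x1^2 + x2^2, steepest-descent direction from (1, 1).
f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: [2 * x[0], 2 * x[1]]
print(wolfe_conditions(f, grad, [1.0, 1.0], [-2.0, -2.0], alpha=0.25))
```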
44

Izmailov, Alexey, Alexey Kurennoy, and Petr Stetsyuk. "Levenberg–Marquardt method for unconstrained optimization." Tambov University Reports. Series: Natural and Technical Sciences, no. 125 (2019): 60–74. http://dx.doi.org/10.20310/1810-0198-2019-24-125-60-74.

Full text
Abstract:
We propose and study the Levenberg–Marquardt method globalized by means of linesearch for unconstrained optimization problems with possibly nonisolated solutions. It is well recognized that this method is an efficient tool for solving systems of nonlinear equations, especially in the presence of singular and even nonisolated solutions. Customary globalization strategies for the Levenberg–Marquardt method rely on linesearch for the squared Euclidean residual of the equation being solved. In the case of an unconstrained optimization problem, this equation is formed by putting the gradient of the object
APA, Harvard, Vancouver, ISO, and other styles
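In the setting the abstract describes, the equation being solved is ∇f(x) = 0, so the residual is the gradient and its Jacobian is the Hessian. A minimal sketch of a single Levenberg-Marquardt step in that setting follows; the linesearch globalization and the paper's choice of regularization parameter are omitted, and the example problem is illustrative.

```python
import numpy as np

def lm_step(grad, hess, x, lam):
    """One Levenberg-Marquardt step for grad f(x) = 0: with r(x) = grad f(x)
    and J(x) = hess f(x), solve (J^T J + lam I) p = -J^T r and return x + p."""
    r = grad(x)
    J = hess(x)
    A = J.T @ J + lam * np.eye(len(x))
    p = np.linalg.solve(A, -J.T @ r)
    return x + p

# Toy quadratic f(x) = 0.5 x^T Q x with Q = diag(1, 10): one step from (1, 1).
Q = np.diag([1.0, 10.0])
grad = lambda x: Q @ x
hess = lambda x: Q
print(lm_step(grad, hess, np.array([1.0, 1.0]), lam=1e-3))
```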
45

Abdulelah Hussein, Wadhah, and Huda Amer Abdul Ameer. "Unconstrained Optimization of Univalent Harmonic Functions." Diyala Journal for Pure Science 18, no. 3 (2022): 7–27. http://dx.doi.org/10.24237/djps.1803.583b.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Shakir Mahmood, Saad, Ali Ibrahim Mansour, and Haydir Ali Hassan. "The Diagonal Update for Unconstrained Optimization." JOURNAL OF EDUCATION AND SCIENCE 25, no. 3 (2012): 68–73. http://dx.doi.org/10.33899/edusj.2012.59199.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Verma, Amit, and Mark Lewis. "Goal seeking Quadratic Unconstrained Binary Optimization." Results in Control and Optimization 7 (June 2022): 100125. http://dx.doi.org/10.1016/j.rico.2022.100125.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Dorfling, Johann, and Kamran Rokhsaz. "Constrained and Unconstrained Propeller Blade Optimization." Journal of Aircraft 52, no. 4 (2015): 1179–88. http://dx.doi.org/10.2514/1.c032859.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Krejić, Nataša, and Nataša Krklec Jerinkić. "STOCHASTIC GRADIENT METHODS FOR UNCONSTRAINED OPTIMIZATION." Pesquisa Operacional 34, no. 3 (2014): 373–93. http://dx.doi.org/10.1590/0101-7438.2014.034.03.0373.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Ben-Tal, A., A. Melman, and J. Zowe. "Curved search methods for unconstrained optimization." Optimization 21, no. 5 (1990): 669–95. http://dx.doi.org/10.1080/02331939008843594.

Full text
APA, Harvard, Vancouver, ISO, and other styles