
Journal articles on the topic 'Continuous global optimization problems'


Consult the top 50 journal articles for your research on the topic 'Continuous global optimization problems.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Vinkó, Tamás, and Kitti Gelle. "Basin Hopping Networks of continuous global optimization problems." Central European Journal of Operations Research 25, no. 4 (2017): 985–1006. http://dx.doi.org/10.1007/s10100-017-0480-0.

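For readers who want to experiment with the basin-hopping procedure that this entry builds its networks from, the following minimal sketch runs SciPy's basinhopping routine on a standard test function; the Rastrigin objective, dimension, and parameter values are illustrative assumptions, not settings taken from the article.

import numpy as np
from scipy.optimize import basinhopping

def rastrigin(x):
    # Classic multimodal benchmark; global minimum 0 at the origin.
    x = np.asarray(x)
    return 10.0 * x.size + float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x)))

x0 = np.random.uniform(-5.12, 5.12, size=5)          # random start in the usual search box
result = basinhopping(rastrigin, x0, niter=200, stepsize=0.5, seed=0)
print(result.x, result.fun)                          # best minimizer found and its objective value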
2

Dua, V., K. P. Papalexandri, and E. N. Pistikopoulos. "Global Optimization Issues in Multiparametric Continuous and Mixed-Integer Optimization Problems." Journal of Global Optimization 30, no. 1 (2004): 59–89. http://dx.doi.org/10.1023/b:jogo.0000049091.73047.7e.

3

Ji, Mingjun, and Jacek Klinowski. "Convergence of taboo search in continuous global optimization." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 462, no. 2071 (2006): 2077–84. http://dx.doi.org/10.1098/rspa.2006.1678.

Abstract:
While taboo search (TS), a method of global optimization, has successfully solved many optimization problems, little is known about its convergence properties, especially for continuous optimization tasks. We consider the global convergence of the original TS for solving continuous optimization problems, and give a condition which guarantees the convergence of the objective value sequence of the method. We also prove that the minimum objective value sequence converges to the vicinity of the global optimal value with probability 1.
4

Li, Changmin, Hai Yang, Daoli Zhu, and Qiang Meng. "A global optimization method for continuous network design problems." Transportation Research Part B: Methodological 46, no. 9 (2012): 1144–58. http://dx.doi.org/10.1016/j.trb.2012.05.003.

5

Wu, Jinran, You-Gan Wang, Kevin Burrage, Yu-Chu Tian, Brodie Lawson, and Zhe Ding. "An improved firefly algorithm for global continuous optimization problems." Expert Systems with Applications 149 (July 2020): 113340. http://dx.doi.org/10.1016/j.eswa.2020.113340.

6

Ritthipakdee, Amarita, Arit Thammano, Nol Premasathian, and Duangjai Jitkongchuen. "Firefly Mating Algorithm for Continuous Optimization Problems." Computational Intelligence and Neuroscience 2017 (2017): 1–10. http://dx.doi.org/10.1155/2017/8034573.

Abstract:
This paper proposes a swarm intelligence algorithm, called firefly mating algorithm (FMA), for solving continuous optimization problems. FMA uses genetic algorithm as the core of the algorithm. The main feature of the algorithm is a novel mating pair selection method which is inspired by the following 2 mating behaviors of fireflies in nature: (i) the mutual attraction between males and females causes them to mate and (ii) fireflies of both sexes are of the multiple-mating type, mating with multiple opposite sex partners. A female continues mating until her spermatheca becomes full, and, in the same vein, a male can provide sperms for several females until his sperm reservoir is depleted. This new feature enhances the global convergence capability of the algorithm. The performance of FMA was tested with 20 benchmark functions (sixteen 30-dimensional functions and four 2-dimensional ones) against FA, ALC-PSO, COA, MCPSO, LWGSODE, MPSODDS, DFOA, SHPSOS, LSA, MPDPGA, DE, and GABC algorithms. The experimental results showed that the success rates of our proposed algorithm with these functions were higher than those of other algorithms and the proposed algorithm also required fewer numbers of iterations to reach the global optima.
7

Valian, Ehsan, Saeed Tavakoli, and Shahram Mohanna. "An intelligent global harmony search approach to continuous optimization problems." Applied Mathematics and Computation 232 (April 2014): 670–84. http://dx.doi.org/10.1016/j.amc.2014.01.086.

8

Pomrehn, L. P., and P. Y. Papalambros. "Global and Discrete Constraint Activity." Journal of Mechanical Design 116, no. 3 (1994): 745–48. http://dx.doi.org/10.1115/1.2919445.

Abstract:
The concept of constraint activity, widely used throughout the optimization literature, is extended and clarified to deal with global optimization problems containing either continuous or discrete variables. The article presents definitions applicable to individual constraints and discusses definitions for groups of constraints. Concepts are reinforced through the use of examples. The definitions are used to investigate the ideas of optimization “cases” and monotonicity analysis as applied to global and discrete problems. Relationships to local optimization are also noted.
9

Wedyan, Ahmad, Jacqueline Whalley, and Ajit Narayanan. "Hydrological Cycle Algorithm for Continuous Optimization Problems." Journal of Optimization 2017 (2017): 1–25. http://dx.doi.org/10.1155/2017/3828420.

Abstract:
A new nature-inspired optimization algorithm called the Hydrological Cycle Algorithm (HCA) is proposed based on the continuous movement of water in nature. In the HCA, a collection of water drops passes through various hydrological water cycle stages, such as flow, evaporation, condensation, and precipitation. Each stage plays an important role in generating solutions and avoiding premature convergence. The HCA shares information by direct and indirect communication among the water drops, which improves solution quality. Similarities and differences between HCA and other water-based algorithms are identified, and the implications of these differences on overall performance are discussed. A new topological representation for problems with a continuous domain is proposed. In proof-of-concept experiments, the HCA is applied on a variety of benchmarked continuous numerical functions. The results were found to be competitive in comparison to a number of other algorithms and validate the effectiveness of HCA. Also demonstrated is the ability of HCA to escape from local optima solutions and converge to global solutions. Thus, HCA provides an alternative approach to tackling various types of multimodal continuous optimization problems as well as an overall framework for water-based particle algorithms in general.
10

Chiu, Chui-Yu, Po-Chou Shih, and Xuechao Li. "A Dynamic Adjusting Novel Global Harmony Search for Continuous Optimization Problems." Symmetry 10, no. 8 (2018): 337. http://dx.doi.org/10.3390/sym10080337.

Abstract:
A novel global harmony search (NGHS) algorithm, as proposed in 2010, is an improved algorithm that combines the harmony search (HS), particle swarm optimization (PSO), and a genetic algorithm (GA). Moreover, the fixed parameter of mutation probability was used in the NGHS algorithm. However, appropriate parameters can enhance the searching ability of a metaheuristic algorithm, and their importance has been described in many studies. Inspired by the adjustment strategy of the improved harmony search (IHS) algorithm, a dynamic adjusting novel global harmony search (DANGHS) algorithm, which combines NGHS and dynamic adjustment strategies for genetic mutation probability, is introduced in this paper. Moreover, extensive computational experiments and comparisons are carried out for 14 benchmark continuous optimization problems. The results show that the proposed DANGHS algorithm has better performance in comparison with other HS algorithms in most problems. In addition, the proposed algorithm is more efficient than previous methods. Finally, different strategies are suitable for different situations. Among these strategies, the most interesting and exciting strategy is the periodic dynamic adjustment strategy. For a specific problem, the periodic dynamic adjustment strategy could have better performance in comparison with other decreasing or increasing strategies. These results inspire us to further investigate this kind of periodic dynamic adjustment strategy in future experiments.
11

Protopopescu, V., and J. Barhen. "Solving a class of continuous global optimization problems using quantum algorithms." Physics Letters A 296, no. 1 (2002): 9–14. http://dx.doi.org/10.1016/s0375-9601(02)00187-1.

12

Lara, Cristiana L., Francisco Trespalacios, and Ignacio E. Grossmann. "Global optimization algorithm for capacitated multi-facility continuous location-allocation problems." Journal of Global Optimization 71, no. 4 (2018): 871–89. http://dx.doi.org/10.1007/s10898-018-0621-6.

13

Lara, Pedro C. S., Renato Portugal, and Carlile Lavor. "A new hybrid classical-quantum algorithm for continuous global optimization problems." Journal of Global Optimization 60, no. 2 (2013): 317–31. http://dx.doi.org/10.1007/s10898-013-0112-8.

14

Ugon, J., S. Kouhbor, M. Mammadov, A. Rubinov, and A. Kruger. "Facility location via continuous optimization with discontinuous objective functions." ANZIAM Journal 48, no. 3 (2007): 315–25. http://dx.doi.org/10.1017/s1446181100003515.

Abstract:
Facility location problems are one of the most common applications of optimization methods. Continuous formulations are usually more accurate, but often result in complex problems that cannot be solved using traditional optimization methods. This paper examines the use of a global optimization method (AGOP) for solving location problems where the objective function is discontinuous. This approach is motivated by a real-world application in wireless network design.
15

Arasomwan, Martins Akugbe, and Aderemi Oluyinka Adewumi. "Improved Particle Swarm Optimization with a Collective Local Unimodal Search for Continuous Optimization Problems." Scientific World Journal 2014 (2014): 1–23. http://dx.doi.org/10.1155/2014/798129.

Abstract:
A new local search technique is proposed and used to improve the performance of particle swarm optimization algorithms by addressing the problem of premature convergence. In the proposed local search technique, a potential particle position in the solution search space is collectively constructed by a number of randomly selected particles in the swarm. The number of times the selection is made varies with the dimension of the optimization problem, and each selected particle donates the value in the location of its randomly selected dimension from its personal best. After constructing the potential particle position, some local search is done around its neighbourhood in comparison with the current swarm global best position. It is then used to replace the global best particle position if it is found to be better; otherwise no replacement is made. Using some well-studied benchmark problems with low and high dimensions, numerical simulations were used to validate the performance of the improved algorithms. Comparisons were made with four different PSO variants; two of the variants implement different local search techniques while the other two do not. Results show that the improved algorithms could obtain better quality solutions while demonstrating better convergence velocity and precision, stability, robustness, and global-local search ability than the competing variants.
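As a rough illustration of the collective construction step described in the abstract above, the sketch below builds a candidate point from randomly selected particles' personal bests, perturbs it locally, and replaces the swarm's global best only if the result is better; the donor count, perturbation radius, and sphere objective are assumptions made for this example, not the authors' settings.

import numpy as np

def collective_local_search(pbest, gbest, gbest_f, f, n_donors=5, radius=0.1, rng=None):
    # pbest: (n_particles, dim) personal best positions; gbest/gbest_f: current global best and its value.
    rng = rng or np.random.default_rng()
    dim = gbest.size
    candidate = gbest.copy()
    for _ in range(n_donors):
        donor = rng.integers(pbest.shape[0])     # a randomly selected particle...
        d = rng.integers(dim)                    # ...donates the value of one randomly selected dimension
        candidate[d] = pbest[donor, d]
    trial = candidate + rng.uniform(-radius, radius, dim)   # small local search around the candidate
    for point in (candidate, trial):
        fp = f(point)
        if fp < gbest_f:                         # keep the constructed point only if it beats the global best
            gbest, gbest_f = point.copy(), fp
    return gbest, gbest_f

sphere = lambda z: float(np.sum(z**2))
pbest = np.random.uniform(-5.0, 5.0, (30, 10))
gbest = pbest[0].copy()
gbest, gbest_f = collective_local_search(pbest, gbest, sphere(gbest), sphere)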
16

Koyuncu, Hasan, and Rahime Ceylan. "A PSO based approach: Scout particle swarm algorithm for continuous global optimization problems." Journal of Computational Design and Engineering 6, no. 2 (2018): 129–42. http://dx.doi.org/10.1016/j.jcde.2018.08.003.

Abstract:
In the literature, most studies focus on designing new methods inspired by biological processes; however, hybridization of methods and the way of hybridization should be examined carefully to generate more suitable optimization methods. In this study, we handle Particle Swarm Optimization (PSO) and an efficient operator of Artificial Bee Colony Optimization (ABC) to design an efficient technique for continuous function optimization. In PSO, velocity and position concepts guide particles to achieve convergence. At this point, variable and stable parameters are ineffective for regenerating awkward particles that cannot improve their personal best position (Pbest). Thus, the need for external intervention is inevitable once a useful particle becomes an awkward one. In ABC, the scout bee phase acts as external intervention by sustaining the resurgence of incapable individuals. With the addition of a scout bee phase to standard PSO, Scout Particle Swarm Optimization (ScPSO) is formed, which eliminates the most important handicap of PSO. Consequently, a robust optimization algorithm is obtained. ScPSO is tested on constrained optimization problems and optimum parameter values are obtained for the general use of ScPSO. To evaluate the performance, ScPSO is compared with the Genetic Algorithm (GA), with variants of the PSO and ABC methods, and with hybrid approaches based on PSO and ABC algorithms on numerical function optimization. As seen in the results, ScPSO results in better optimal solutions than other approaches. In addition, its convergence is superior to a basic optimization method, to the variants of PSO and ABC algorithms, and to the hybrid approaches on different numerical benchmark functions. According to the results, the Total Statistical Success (TSS) value of ScPSO ranks first (5) in comparison with PSO variants; the second best TSS (2) belongs to the CLPSO and SP-PSO techniques. In a comparison with ABC variants, the best TSS value (6) is obtained by ScPSO, while the TSS of BitABC is 2. In comparison with hybrid techniques, ScPSO obtains the best Total Average Rank (TAR) of 1.375, and the TSS of ScPSO ranks first (6) again. The fitness values obtained by ScPSO are generally more satisfactory than the values obtained by other methods. Consequently, ScPSO achieves promising gains over other optimization methods; in parallel with this result, its usage can be extended to different working disciplines.
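A hedged sketch of the scout idea this entry grafts onto PSO follows: any particle that fails to improve its personal best for a fixed number of iterations is re-randomized, much as a scout bee is in ABC. The parameter values, bounds, and sphere objective are illustrative assumptions rather than the authors' configuration.

import numpy as np

def scout_pso(f, dim, bounds, n_particles=30, iters=500, w=0.7, c1=1.5, c2=1.5, limit=20, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    stall = np.zeros(n_particles, dtype=int)          # iterations without a personal-best improvement
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        stall = np.where(improved, 0, stall + 1)
        stuck = stall > limit                         # scout phase: re-randomize particles stuck too long
        x[stuck] = rng.uniform(lo, hi, (int(stuck.sum()), dim))
        v[stuck] = 0.0
        stall[stuck] = 0
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, float(pbest_f.min())

best_x, best_f = scout_pso(lambda z: float(np.sum(z**2)), dim=10, bounds=(-5.0, 5.0))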
17

Shola, Peter Bamidele, and L. B. Asaju. "An Algorithm for Continuous Optimization Problems using Hybrid Particle Updating Method." Indonesian Journal of Electrical Engineering and Computer Science 3, no. 1 (2016): 164. http://dx.doi.org/10.11591/ijeecs.v3.i1.pp164-173.

Abstract:
Optimization problems are commonly encountered in many areas of endeavor, largely due to the need to economize the use of available resources. This paper presents a population-based meta-heuristic algorithm for solving optimization problems in a continuous space. The algorithm combines a form of crossover technique with a position-updating formula based on the instantaneous global best position to update each particle's position. The algorithm was tested and compared with standard particle swarm optimization (PSO) on many benchmark functions. The results suggest better performance of the algorithm over the latter in terms of reaching the global optimum value (at least for the benchmark functions considered) and the rate of convergence in terms of the number of iterations required to reach the optimum values.
18

Brahimi, Nassim, Abdellah Salhi, and Megdouda Ourbih-Tari. "Convergence analysis of the plant propagation algorithm for continuous global optimization." RAIRO - Operations Research 52, no. 2 (2018): 429–38. http://dx.doi.org/10.1051/ro/2017037.

Abstract:
The Plant Propagation Algorithm (PPA) is a Nature-Inspired stochastic algorithm, which emulates the way plants, in particular the strawberry plant, propagate using runners. It has been experimentally tested both on unconstrained and constrained continuous global optimization problems and was found to be competitive against well established algorithms. This paper is concerned with its convergence analysis. It first puts forward a general convergence theorem for a large class of random algorithms, before the PPA convergence theorem is derived and proved. It then illustrates the results on simple problems.
19

Balasundaram, Balabhaskar, and Sergiy Butenko. "Constructing test functions for global optimization using continuous formulations of graph problems." Optimization Methods and Software 20, no. 4-5 (2005): 439–52. http://dx.doi.org/10.1080/10556780500139641.

20

Pan, Quan-Ke, P. N. Suganthan, M. Fatih Tasgetiren, and J. J. Liang. "A self-adaptive global best harmony search algorithm for continuous optimization problems." Applied Mathematics and Computation 216, no. 3 (2010): 830–48. http://dx.doi.org/10.1016/j.amc.2010.01.088.

21

Ting, T. O., H. C. Ting, and T. S. Lee. "Taguchi-Particle Swarm Optimization for Numerical Optimization." International Journal of Swarm Intelligence Research 1, no. 2 (2010): 18–33. http://dx.doi.org/10.4018/jsir.2010040102.

Abstract:
In this work, a hybrid Taguchi-Particle Swarm Optimization (TPSO) is proposed to solve global numerical optimization problems with continuous and discrete variables. This hybrid algorithm combines the well-known Particle Swarm Optimization Algorithm with the established Taguchi method, which has been an important tool for robust design. This paper presents the improvements obtained despite the simplicity of the hybridization process. The Taguchi method is run only once in every PSO iteration and therefore does not give significant impact in terms of computational cost. The method creates a more diversified population, which also contributes to the success of avoiding premature convergence. The proposed method is effectively applied to solve 13 benchmark problems. This study’s results show drastic improvements in comparison with the standard PSO algorithm involving continuous and discrete variables on high dimensional benchmark functions.
22

Tsai, Jung-Fa, Ming-Hua Lin, and Duan-Yi Wen. "Global Optimization for Mixed–Discrete Structural Design." Symmetry 12, no. 9 (2020): 1529. http://dx.doi.org/10.3390/sym12091529.

Abstract:
Several structural design problems that involve continuous and discrete variables are very challenging because of the combinatorial and non-convex characteristics of the problems. Although the deterministic optimization approach theoretically guarantees to find the global optimum, it usually leads to a significant burden in computational time. This article studies the deterministic approach for globally solving mixed–discrete structural optimization problems. An improved method that symmetrically reduces the number of constraints for linearly expressing signomial terms with pure discrete variables is applied to significantly enhance the computational efficiency of obtaining the exact global optimum of the mixed–discrete structural design problem. Numerical experiments of solving the stepped cantilever beam design problem and the pressure vessel design problem are conducted to show the efficiency and effectiveness of the presented approach. Compared with existing methods, this study introduces fewer convex terms and constraints for transforming the mixed–discrete structural problem and uses much less computational time for solving the reformulated problem to global optimality.
23

Li, Kun, and Huixin Tian. "A DE-Based Scatter Search for Global Optimization Problems." Discrete Dynamics in Nature and Society 2015 (2015): 1–9. http://dx.doi.org/10.1155/2015/303125.

Abstract:
This paper proposes a hybrid scatter search (SS) algorithm for continuous global optimization problems by incorporating the evolution mechanism of differential evolution (DE) into the reference set updated procedure of SS to act as the new solution generation method. This hybrid algorithm is called a DE-based SS (SSDE) algorithm. Since different kinds of mutation operators of DE have been proposed in the literature and they have shown different search abilities for different kinds of problems, four traditional mutation operators are adopted in the hybrid SSDE algorithm. To adaptively select the mutation operator that is most appropriate to the current problem, an adaptive mechanism for the candidate mutation operators is developed. In addition, to enhance the exploration ability of SSDE, a reinitialization method is adopted to create a new population and subsequently construct a new reference set whenever the search process of SSDE is trapped in local optimum. Computational experiments on benchmark problems show that the proposed SSDE is competitive or superior to some state-of-the-art algorithms in the literature.
24

Huang, Tian Shun, Xiao Qiang Li, Hong Yun Lian, and Zhi Qiang Zhang. "Optimization Technology of PID Parameter in Control System Based on Improved Particle Swarm Optimization Algorithms." Advanced Materials Research 908 (March 2014): 547–50. http://dx.doi.org/10.4028/www.scientific.net/amr.908.547.

Abstract:
The particle swarm algorithm has proven very effective at solving many global optimization problems. We first improved the particle swarm optimization algorithm; the improved PSO algorithm for continuous optimization problems is very effective at solving nonlinear combinatorial optimization problems and mixed-integer nonlinear optimization problems. This design adopts the improved particle swarm algorithm to optimize the PID parameters of the control system, and the effectiveness of the improved algorithm is demonstrated by experiment.
25

Cao, Zijian, and Lei Wang. "An Optimization Algorithm Inspired by the Phase Transition Phenomenon for Global Optimization Problems with Continuous Variables." Algorithms 10, no. 4 (2017): 119. http://dx.doi.org/10.3390/a10040119.

26

Papadimitriou, Costas, and Evaggelos Ntotsios. "Optimization Algorithms for System Integration." Advances in Science and Technology 56 (September 2008): 514–23. http://dx.doi.org/10.4028/www.scientific.net/ast.56.514.

Abstract:
This work outlines the optimization algorithms involved in integrating system analysis and measured data collected from a network of sensors. The integration is required for structural health monitoring problems arising in structural dynamics and related to (1) model parameter estimation used for finite element model updating, (2) model-based damage detection in structures and (3) optimal sensor location for parameter estimation and damage detection. These problems are formulated as single- and multi-objective optimization problems of continuous or discrete-valued variables. Gradient-based, evolutionary, hybrid and heuristic algorithms are presented that effectively address issues related to the estimation of multiple local/global solutions and computational complexity arising in single and multi-objective optimization involving continuous and discrete variables.
27

Romeijn, H. Edwin, and Robert L. Smith. "Simulated Annealing and Adaptive Search in Global Optimization." Probability in the Engineering and Informational Sciences 8, no. 4 (1994): 571–90. http://dx.doi.org/10.1017/s0269964800003624.

Abstract:
Simulated annealing is a class of sequential search techniques for solving continuous global optimization problems. In this paper we attempt to help explain the success of simulated annealing for this class of problems by studying an idealized version of this algorithm, which we call adaptive search. The prototypical adaptive search algorithm generates a sequence of improving points drawn conditionally from samples from a corresponding sequence of probability distributions. Under the condition that the sequence of distributions stochastically dominate in objective function value the uniform distribution, we show that the expected number of improving points required to achieve the global optimum within a prespecified error grows at most linearly in the dimension of the problem for a large class of global optimization problems. Moreover, we derive a cooling schedule for simulated annealing, which follows in a natural way from the definition of the adaptive search algorithm.
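As a toy rendition of the adaptive-search idealization described above, the loop below draws candidate points and records only those that improve the incumbent, so the retained points form an improving sequence; drawing every candidate from a plain uniform distribution is a simplification chosen for illustration, not the paper's construction.

import numpy as np

def adaptive_search(f, lo, hi, dim, n_samples=20000, seed=0):
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    improving_points = []                        # the recorded sequence of improving points
    for _ in range(n_samples):
        x = rng.uniform(lo, hi, dim)
        fx = f(x)
        if fx < best_f:                          # keep only strict improvements over the incumbent
            best_x, best_f = x, fx
            improving_points.append((x.copy(), fx))
    return best_x, best_f, improving_points

x_star, f_star, trace = adaptive_search(lambda z: float(np.sum(z**2)), -5.0, 5.0, dim=3)
print(f_star, len(trace))                        # best value found and number of improving points recorded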
28

Cao, Yongcun, Yong Lu, Xiuqin Pan, and Na Sun. "An improved global best guided artificial bee colony algorithm for continuous optimization problems." Cluster Computing 22, S2 (2018): 3011–19. http://dx.doi.org/10.1007/s10586-018-1817-8.

29

Parpas, P., and B. Rustem. "An Algorithm for the Global Optimization of a Class of Continuous Minimax Problems." Journal of Optimization Theory and Applications 141, no. 2 (2008): 461–73. http://dx.doi.org/10.1007/s10957-008-9473-4.

30

LEE, CHA KUN, and PAUL I. BARTON. "GLOBAL OPTIMIZATION OF LINEAR HYBRID SYSTEMS WITH VARYING TIME EVENTS." International Journal of Software Engineering and Knowledge Engineering 15, no. 02 (2005): 467–72. http://dx.doi.org/10.1142/s0218194005001938.

Abstract:
Dynamic optimization problems with linear hybrid (discrete/continuous) systems embedded whose transition times vary are inherently nonconvex. For a wide variety of applications, a certificate of global optimality is essential, but this cannot be obtained using conventional numerical methods. We present a deterministic framework for the solution of such problems in the continuous time domain. First, the control parametrization enhancing transform is used to transform the embedded dynamic system from a linear hybrid system with scaled discontinuities and varying transition times into a nonlinear hybrid system with stationary discontinuities and fixed transition times. Next, a recently developed convexity theory is applied to construct a convex relaxation of the original nonconvex problem. This allows the problem to be solved in a branch-and-bound framework that can guarantee the global solution within epsilon optimality in a finite number of iterations.
31

Hezam, Ibrahim M., Osama Abdel Raouf, and Mohey M. Hadhoud. "A New Compound Swarm Intelligence Algorithms for Solving Global Optimization Problems." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 10, no. 9 (2013): 2010–20. http://dx.doi.org/10.24297/ijct.v10i9.1389.

Abstract:
This paper proposes a new hybrid swarm intelligence algorithm that encompasses the features of three major swarm algorithms. It combines the fast convergence of the Cuckoo Search (CS), the dynamic root change of the Firefly Algorithm (FA), and the continuous position update of the Particle Swarm Optimization (PSO). The Compound Swarm Intelligence Algorithm (CSIA) will be used to solve a set of standard benchmark functions. The research study compares the performance of CSIA with that of CS, FA, and PSO, using the same set of benchmark functions. The comparison aims to test whether the performance of CSIA is competitive with that of the CS, FA, and PSO algorithms in terms of the solution results on the benchmark functions.
32

Wang, Ning, and Shi You Yang. "An Improved Tabu Search Algorithm for Global Optimization of Engineering Design Problems." Applied Mechanics and Materials 441 (December 2013): 762–67. http://dx.doi.org/10.4028/www.scientific.net/amm.441.762.

Abstract:
To find the global optimal solution of a multimodal function with both continuous and discrete variables, an improved tabu search algorithm is proposed. The improvements include new generating mechanisms for initial and neighborhood solutions, the exclusive use of the tabu list, the restarting methodology for different cycle of iterations as well as the shifting away from the worst solutions. The numerical results on two numerical examples are reported to demonstrate the feasibility and merit of the proposed algorithm.
33

WAH, BENJAMIN W., and TAO WANG. "TUNING STRATEGIES IN CONSTRAINED SIMULATED ANNEALING FOR NONLINEAR GLOBAL OPTIMIZATION." International Journal on Artificial Intelligence Tools 09, no. 01 (2000): 3–25. http://dx.doi.org/10.1142/s0218213000000033.

Abstract:
This paper studies various strategies in constrained simulated annealing (CSA), a global optimization algorithm that achieves asymptotic convergence to constrained global minima (CGM) with probability one for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for discrete constrained local minima (CLM) in the theory of discrete Lagrange multipliers and its extensions to continuous and mixed-integer constrained NLPs. The strategies studied include adaptive neighborhoods, distributions to control sampling, acceptance probabilities, and cooling schedules. We report much better solutions than the best-known solutions in the literature on two sets of continuous benchmarks and their discretized versions.
34

Ali, M. Montaz, Charoenchai Khompatraporn, and Zelda B. Zabinsky. "A Numerical Evaluation of Several Stochastic Algorithms on Selected Continuous Global Optimization Test Problems." Journal of Global Optimization 31, no. 4 (2005): 635–72. http://dx.doi.org/10.1007/s10898-004-9972-2.

35

de-los-Cobos-Silva, Sergio Gerardo, Miguel Ángel Gutiérrez-Andrade, Roman Anselmo Mora-Gutiérrez, Pedro Lara-Velázquez, Eric Alfredo Rincón-García, and Antonin Ponsich. "An Efficient Algorithm for Unconstrained Optimization." Mathematical Problems in Engineering 2015 (2015): 1–17. http://dx.doi.org/10.1155/2015/178545.

Abstract:
This paper presents an original and efficient PSO algorithm, which is divided into three phases: (1) stabilization, (2) breadth-first search, and (3) depth-first search. The proposed algorithm, called PSO-3P, was tested with 47 benchmark continuous unconstrained optimization problems, on a total of 82 instances. The numerical results show that the proposed algorithm is able to reach the global optimum. This work mainly focuses on unconstrained optimization problems from 2 to 1,000 variables.
36

Aungkulanon, Pasura. "A Comparative Study of Global-Best Harmony Search and Bat Algorithms on Optimization Problems." Applied Mechanics and Materials 464 (November 2013): 352–57. http://dx.doi.org/10.4028/www.scientific.net/amm.464.352.

Abstract:
Engineering optimization problems are large and complex. Effective methods for solving these problems using a finite sequence of instructions can be categorized into optimization and meta-heuristic algorithms. Meta-heuristic techniques have been proven to solve various real-world problems. In this study, a comparison of two meta-heuristic techniques, namely the Global-Best Harmony Search algorithm (GHSA) and the Bat algorithm (BATA), for solving constrained optimization problems was carried out. GHSA and BATA are optimization algorithms inspired by the structure of the harmony improvisation search process and the social behavior of bat echolocation for decision direction. These algorithms were implemented on three types of optimization response surfaces: single-peak, multi-peak, and curved-ridge. Moreover, both algorithms were also applied to constrained engineering problems. The results on non-linear continuous unconstrained functions in the context of response surface methodology and on constrained problems show that the Bat algorithm seems to be better in terms of the sample mean and variance of design-point yields and computation time.
37

Alomoush, Alaa A., Abdul Rahman A. Alsewari, Kamal Z. Zamli, Ayat Alrosan, Waleed Alomoush, and Khalid Alissa. "Enhancing three variants of harmony search algorithm for continuous optimization problems." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 3 (2021): 2343. http://dx.doi.org/10.11591/ijece.v11i3.pp2343-2349.

Abstract:
Meta-heuristic algorithms are well-known optimization methods for solving real-world optimization problems. Harmony search (HS) is a recognized meta-heuristic algorithm with an efficient exploration process. But HS has a slow convergence rate, which causes the algorithm to have a weak exploitation process in finding the global optima. Different variants of HS have been introduced in the literature to enhance the algorithm and fix its problems, but in most cases the algorithm still has a slow convergence rate. Meanwhile, opposition-based learning (OBL) is an effective technique used to improve the performance of different optimization algorithms, including HS. In this work, we adopted a new improved version of OBL to improve three variants of harmony search by increasing the convergence speed of these variants and improving overall performance. The new OBL version is named improved opposition-based learning (IOBL); it differs from the original OBL by adopting randomness to increase the solutions' diversity. To evaluate the hybrid algorithms, we ran them on benchmark functions and compared the obtained results with those of the original versions. The obtained results show that the new hybrid algorithms are more efficient than the original versions of HS. A convergence rate graph is also used to show the overall performance of the new algorithms.
38

Kleinert, Thomas, Veronika Grimm, and Martin Schmidt. "Outer approximation for global optimization of mixed-integer quadratic bilevel problems." Mathematical Programming 188, no. 2 (2021): 461–521. http://dx.doi.org/10.1007/s10107-020-01601-2.

Abstract:
Bilevel optimization problems have received a lot of attention in the last years and decades. Besides numerous theoretical developments there also evolved novel solution algorithms for mixed-integer linear bilevel problems and the most recent algorithms use branch-and-cut techniques from mixed-integer programming that are especially tailored for the bilevel context. In this paper, we consider MIQP-QP bilevel problems, i.e., models with a mixed-integer convex-quadratic upper level and a continuous convex-quadratic lower level. This setting allows for a strong-duality-based transformation of the lower level which yields, in general, an equivalent nonconvex single-level reformulation of the original bilevel problem. Under reasonable assumptions, we can derive both a multi- and a single-tree outer-approximation-based cutting-plane algorithm. We show finite termination and correctness of both methods and present extensive numerical results that illustrate the applicability of the approaches. It turns out that the proposed methods are capable of solving bilevel instances with several thousand variables and constraints and significantly outperform classical solution approaches.
39

Tsai, Jinn-Tsong, Jyh-Horng Chou, and Wen-Hsien Ho. "Improved Quantum-Inspired Evolutionary Algorithm for Engineering Design Optimization." Mathematical Problems in Engineering 2012 (2012): 1–27. http://dx.doi.org/10.1155/2012/836597.

Abstract:
An improved quantum-inspired evolutionary algorithm is proposed for solving mixed discrete-continuous nonlinear problems in engineering design. The proposed Latin square quantum-inspired evolutionary algorithm (LSQEA) combines Latin squares and quantum-inspired genetic algorithm (QGA). The novel contribution of the proposed LSQEA is the use of a QGA to explore the optimal feasible region in macrospace and the use of a systematic reasoning mechanism of the Latin square to exploit the better solution in microspace. By combining the advantages of exploration and exploitation, the LSQEA provides higher computational efficiency and robustness compared to QGA and real-coded GA when solving global numerical optimization problems with continuous variables. Additionally, the proposed LSQEA approach effectively solves mixed discrete-continuous nonlinear design optimization problems in which the design variables are integers, discrete values, and continuous values. The computational experiments show that the proposed LSQEA approach obtains better results compared to existing methods reported in the literature.
40

Pushkaryov, Kirill Vladimirovich. "Global optimization via neural network approximation of inverse coordinate mappings with evolutionary parameter control." Program Systems: Theory and Applications 10, no. 2 (2019): 3–31. http://dx.doi.org/10.25209/2079-3316-2019-10-2-3-31.

Abstract:
A hybrid method of global optimization NNAICM-PSO is presented. It uses neural network approximation of inverse mappings of objective function values to coordinates combined with particle swarm optimization to find the global minimum of a continuous objective function of multiple variables with bound constraints. The objective function is viewed as a black box. The method employs groups of moving probe points attracted by goals like in particle swarm optimization. One of the possible goals is determined via mapping of decreased objective function values to coordinates by modified Dual Generalized Regression Neural Networks constructed from probe points. The parameters of the search are controlled by an evolutionary algorithm. The algorithm forms a population of evolving rules each containing a tuple of parameter values. There are two measures of fitness: short-term (charm) and long-term (merit). Charm is used to select rules for reproduction and application. Merit determines survival of an individual. This two-fold system preserves potentially useful individuals from extinction due to short-term situation changes. Test problems of 100 variables were solved. The results indicate that evolutionary control is better than random variation of parameters for NNAICM-PSO. With some problems, when rule bases are reused, error progressively decreases in subsequent runs, which means that the method adapts to the problem.
41

Rahmani, Rasoul, and Rubiyah Yusof. "A new simple, fast and efficient algorithm for global optimization over continuous search-space problems: Radial Movement Optimization." Applied Mathematics and Computation 248 (December 2014): 287–300. http://dx.doi.org/10.1016/j.amc.2014.09.102.

42

Abbas, Qamar, Jamil Ahmad, and Hajira Jabeen. "A Novel Tournament Selection Based Differential Evolution Variant for Continuous Optimization Problems." Mathematical Problems in Engineering 2015 (2015): 1–21. http://dx.doi.org/10.1155/2015/205709.

Abstract:
Differential evolution (DE) is a powerful global optimization algorithm which has been studied intensively by many researchers in recent years. A number of variants have been established for the algorithm that make DE more applicable. However, most of the variants suffer from the problems of convergence speed and local optima. A novel tournament-based parent selection variant of the DE algorithm is proposed in this research. The proposed variant enhances searching capability and improves the convergence speed of the DE algorithm. This paper also presents a novel statistical comparison of existing DE mutation variants which categorizes these variants in terms of their overall performance. Experimental results show that the proposed DE variant has significant performance gains over other DE mutation variants.
43

ALI, MUSRRAT, MILLIE PANT, AJITH ABRAHAM, and CHANG WOOK AHN. "SWARM DIRECTIONS EMBEDDED DIFFERENTIAL EVOLUTION FOR FASTER CONVERGENCE OF GLOBAL OPTIMIZATION PROBLEMS." International Journal on Artificial Intelligence Tools 21, no. 03 (2012): 1240013. http://dx.doi.org/10.1142/s0218213012400131.

Abstract:
In the present study we propose a new hybrid version of the Differential Evolution (DE) and Particle Swarm Optimization (PSO) algorithms called Hybrid DE or HDE for solving continuous global optimization problems. In the proposed HDE algorithm, the information sharing mechanism of PSO is embedded in the contracted search space obtained by the basic DE algorithm. This is done to maintain a balance between the two antagonistic factors, exploration and exploitation, thereby obtaining faster convergence. The embedding of swarm directions into the basic DE algorithm is done with the help of a "switchover constant" called α which keeps a record of the contraction of the search space. The proposed HDE algorithm is tested on a set of 10 unconstrained benchmark problems and four constrained real-life mechanical design problems. Empirical studies show that the proposed scheme helps in improving the convergence rate of the basic DE algorithm without compromising the quality of the solution.
44

Kanzow, Christian, Andreas B. Raharja, and Alexandra Schwartz. "An Augmented Lagrangian Method for Cardinality-Constrained Optimization Problems." Journal of Optimization Theory and Applications 189, no. 3 (2021): 793–813. http://dx.doi.org/10.1007/s10957-021-01854-7.

Abstract:
A reformulation of cardinality-constrained optimization problems into continuous nonlinear optimization problems with an orthogonality-type constraint has gained some popularity during the last few years. Due to the special structure of the constraints, the reformulation violates many standard assumptions and therefore is often solved using specialized algorithms. In contrast to this, we investigate the viability of using a standard safeguarded multiplier penalty method without any problem-tailored modifications to solve the reformulated problem. We prove global convergence towards an (essentially strongly) stationary point under a suitable problem-tailored quasinormality constraint qualification. Numerical experiments illustrating the performance of the method in comparison to regularization-based approaches are provided.
45

Wu, Zong-Sheng, Wei-Ping Fu, and Ru Xue. "Nonlinear Inertia Weighted Teaching-Learning-Based Optimization for Solving Global Optimization Problem." Computational Intelligence and Neuroscience 2015 (2015): 1–15. http://dx.doi.org/10.1155/2015/292576.

Abstract:
The teaching-learning-based optimization (TLBO) algorithm, proposed in recent years, simulates the teaching-learning phenomenon of a classroom to effectively solve global optimization of multidimensional, linear, and nonlinear problems over continuous spaces. In this paper, an improved teaching-learning-based optimization algorithm is presented, which is called the nonlinear inertia weighted teaching-learning-based optimization (NIWTLBO) algorithm. This algorithm introduces a nonlinear inertia weighted factor into the basic TLBO to control the memory rate of learners and uses a dynamic inertia weighted factor to replace the original random number in the teacher phase and learner phase. The proposed algorithm is tested on a number of benchmark functions, and its performance comparisons are provided against the basic TLBO and some other well-known optimization algorithms. The experimental results show that the proposed algorithm has a faster convergence rate and better performance than the basic TLBO and some other algorithms as well.
46

Lohokare, M. R., S. S. Pattnaik, S. Devi, B. K. Panigrahi, S. Das, and J. G. Joshi. "Extrapolated Biogeography-Based Optimization (eBBO) for Global Numerical Optimization and Microstrip Patch Antenna Design." International Journal of Applied Evolutionary Computation 1, no. 3 (2010): 1–26. http://dx.doi.org/10.4018/jaec.2010070101.

Abstract:
Biogeography-Based Optimization (BBO) uses the idea of probabilistically sharing features between solutions based on the solutions’ fitness values. Therefore, its exploitation ability is good but it lacks in exploration ability. In this paper, the authors extend the original BBO and propose a hybrid version combined with ePSO (particle swarm optimization with extrapolation technique), namely eBBO, for unconstrained global numerical optimization problems in the continuous domain. eBBO combines the exploitation ability of BBO with the exploration ability of ePSO effectively, which can generate global optimum solutions. To validate the performance of eBBO, experiments have been conducted on 23 standard benchmark problems with a range of dimensions and diverse complexities and compared with original BBO and other versions of BBO in terms of the quality of the final solution and the convergence rate. Influence of population size and scalability study is also considered and results are compared with statistical paired t-test. Experimental analysis indicates that the proposed approach is effective and efficient and improves the exploration ability of BBO.
47

Kucharzak, Michał, Dawid Zydek, and Iwona Poźniak-Koszałka. "Overlay Multicast Optimization: IBM ILOG CPLEX." International Journal of Electronics and Telecommunications 58, no. 4 (2012): 381–88. http://dx.doi.org/10.2478/v10177-012-0052-0.

Abstract:
IBM ILOG CPLEX Optimization Studio delivers advanced and complex optimization libraries that solve linear programming (LP) and related problems, e.g., mixed integer. Moreover, the optimization tool provides users with its Academic Research Edition, which is available for teaching and noncommercial research at no charge. This paper describes the usage of the CPLEX C++ API for solving linear problems and, as an exhaustive example, optimization of network flows in overlay multicast is taken into account. Applying continuous and integral variables and implementing various constraints, including equations and inequalities, as well as setting some global parameters of the solver are presented and widely explained.
48

Ma, Lianbo, Kunyuan Hu, Yunlong Zhu, Ben Niu, Hanning Chen, and Maowei He. "Discrete and Continuous Optimization Based on Hierarchical Artificial Bee Colony Optimizer." Journal of Applied Mathematics 2014 (2014): 1–20. http://dx.doi.org/10.1155/2014/402616.

Abstract:
This paper presents a novel optimization algorithm, namely, hierarchical artificial bee colony optimization (HABC), to tackle complex high-dimensional problems. In the proposed multilevel model, the higher-level species can be aggregated by the subpopulations from lower level. In the bottom level, each subpopulation employing the canonical ABC method searches the part-dimensional optimum in parallel, which can be constructed into a complete solution for the upper level. At the same time, the comprehensive learning method with crossover and mutation operator is applied to enhance the global search ability between species. Experiments are conducted on a set of 20 continuous and discrete benchmark problems. The experimental results demonstrate remarkable performance of the HABC algorithm when compared with other six evolutionary algorithms.
49

Ramadan, Saleem Z. "A Hybrid Global Optimization Method Based on Genetic Algorithm and Shrinking Box." Modern Applied Science 10, no. 2 (2016): 67. http://dx.doi.org/10.5539/mas.v10n2p67.

Abstract:
<p class="zhengwen">This paper proposes a hybrid genetic algorithm method for optimizing constrained black box functions utilizing shrinking box and exterior penalty function methods (SBPGA). The constraints of the problem were incorporated in the fitness function of the genetic algorithm through the penalty function. The hybrid method used the proposed Variance-based crossover (VBC) and Arithmetic-based mutation (ABM) operators; moreover, immigration operator was also used. The box constraints constituted a hyperrectangle that kept shrinking adaptively in the light of the revealed information from the genetic algorithm about the optimal solution. The performance of the proposed algorithm was assessed using 11 problems which are used as benchmark problems in constrained optimization literatures. ANOVA along with a success rate performance index were used to analyze the model.</p>Based on the results, we believe that the proposed method is fairly robust and efficient global optimization method for Constrained Optimization Problems whether they are continuous or discrete.
50

Jiao, Yong-Chang, Chuangyin Dang, Yee Leung, and Yue Hao. "A Modification to the New Version of the Price’s Algorithm for Continuous Global Optimization Problems." Journal of Global Optimization 36, no. 4 (2006): 609–26. http://dx.doi.org/10.1007/s10898-006-9030-3.
