A selection of scholarly literature on the topic "Optimization algorithms"

Format your source citation in APA, MLA, Chicago, Harvard, and other styles

Select the source type:

Browse the lists of current articles, books, dissertations, conference papers, and other scholarly sources on the topic "Optimization algorithms".

Next to every work in the reference list there is an "Add to bibliography" button. Use it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication in .pdf format and read its abstract online, provided that the corresponding data are available in the metadata.

Journal articles on the topic "Optimization algorithms"

1

Celik, Yuksel, and Erkan Ulker. "An Improved Marriage in Honey Bees Optimization Algorithm for Single Objective Unconstrained Optimization." Scientific World Journal 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/370172.

Full text of the source
Abstract:
Marriage in honey bees optimization (MBO) is a metaheuristic optimization algorithm developed by inspiration of the mating and fertilization process of honey bees and is a kind of swarm intelligence optimizations. In this study we propose improved marriage in honey bees optimization (IMBO) by adding Levy flight algorithm for queen mating flight and neighboring for worker drone improving. The IMBO algorithm’s performance and its success are tested on the well-known six unconstrained test functions and compared with other metaheuristic optimization algorithms.
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Luan, Yuxuan, Junjiang He, Jingmin Yang, Xiaolong Lan, and Geying Yang. "Uniformity-Comprehensive Multiobjective Optimization Evolutionary Algorithm Based on Machine Learning." International Journal of Intelligent Systems 2023 (November 10, 2023): 1–21. http://dx.doi.org/10.1155/2023/1666735.

Full text of the source
Abstract:
When solving real-world optimization problems, the uniformity of Pareto fronts is an essential strategy in multiobjective optimization problems (MOPs). However, it is a common challenge for many existing multiobjective optimization algorithms due to the skewed distribution of solutions and biases towards specific objective functions. This paper proposes a uniformity-comprehensive multiobjective optimization evolutionary algorithm based on machine learning to address this limitation. Our algorithm utilizes uniform initialization and self-organizing map (SOM) to enhance population diversity and
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Wen, Xiaodong, Xiangdong Liu, Cunhui Yu, Haoning Gao, Jing Wang, Yongji Liang, Jiangli Yu, and Yan Bai. "IOOA: A multi-strategy fusion improved Osprey Optimization Algorithm for global optimization." Electronic Research Archive 32, no. 3 (2024): 2033–74. http://dx.doi.org/10.3934/era.2024093.

Full text of the source
Abstract:
With the widespread application of metaheuristic algorithms in engineering and scientific research, finding algorithms with efficient global search capabilities and precise local search performance has become a hot topic in research. The osprey optimization algorithm (OOA) was first proposed in 2023, characterized by its simple structure and strong optimization capability. However, practical tests have revealed that the OOA algorithm inevitably encounters common issues faced by metaheuristic algorithms, such as the tendency to fall into local optima and reduced populat
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Sahani, Jay Kishore, and Arvind Kumar Yadav. "The Bees Algorithms in Optimization: An Overview." MATHEMATICS EDUCATION LV, no. 3, September 2021 (September 30, 2021): 20–28. https://doi.org/10.5281/zenodo.7275730.

Full text of the source
Abstract:
Metaheuristic algorithms have become powerful tools for modeling and optimization. In this article, we provide an overview of Bee Algorithms and their applications. We will briefly introduce algorithms such as bee algorithms, virtual bee algorithm, artificial bee algorithm, bee mating algorithm, etc. We also briefly describe the main characteristics of these algorithms and outline some recent applications of these algorithms.
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Priyadarshini, Ishaani. "Dendritic Growth Optimization: A Novel Nature-Inspired Algorithm for Real-World Optimization Problems." Biomimetics 9, no. 3 (February 21, 2024): 130. http://dx.doi.org/10.3390/biomimetics9030130.

Full text of the source
Abstract:
In numerous scientific disciplines and practical applications, addressing optimization challenges is a common imperative. Nature-inspired optimization algorithms represent a highly valuable and pragmatic approach to tackling these complexities. This paper introduces Dendritic Growth Optimization (DGO), a novel algorithm inspired by natural branching patterns. DGO offers a novel solution for intricate optimization problems and demonstrates its efficiency in exploring diverse solution spaces. The algorithm has been extensively tested with a suite of machine learning algorithms, deep learning alg
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Kim, Minsu, Areum Han, Jaewon Lee, Sunghyun Cho, Il Moon, and Jonggeol Na. "Comparison of Derivative-Free Optimization: Energy Optimization of Steam Methane Reforming Process." International Journal of Energy Research 2023 (June 3, 2023): 1–20. http://dx.doi.org/10.1155/2023/8868540.

Full text of the source
Abstract:
In modern chemical engineering, various derivative-free optimization (DFO) studies have been conducted to identify operating conditions that maximize energy efficiency for efficient operation of processes. Although DFO algorithm selection is an essential task that leads to successful designs, it is a nonintuitive task because of the uncertain performance of the algorithms. In particular, when the system evaluation cost or computational load is high (e.g., density functional theory and computational fluid dynamics), selecting an algorithm that quickly converges to the near-global optimum at the
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Halim, A. Hanif, and I. Ismail. "Tree Physiology Optimization in Constrained Optimization Problem." TELKOMNIKA Telecommunication, Computing, Electronics and Control 16, no. 2 (April 1, 2018): 876–82. https://doi.org/10.12928/TELKOMNIKA.v16i2.9021.

Full text of the source
Abstract:
Metaheuristic algorithms are proven to be more effective on finding global optimum in numerous problems including the constrained optimization area. The algorithms have the capacity to prevail over many deficiencies in conventional algorithms. Besides of good quality of performance, some metaheuristic algorithms have limitations that may deteriorate by certain degree of difficulties especially in real-world application. Most of the real-world problems consist of constrained problem that is significantly important in modern engineering design and must be considered in order to perform any optim
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Gireesha, B. "A Literature Survey on Artificial Swarm Intelligence based Optimization Techniques." International Journal of Engineering & Technology 7, no. 4.5 (September 22, 2018): 455. http://dx.doi.org/10.14419/ijet.v7i4.5.20205.

Full text of the source
Abstract:
From few decades’ optimizations techniques plays a key role in engineering and technological field applications. They are known for their behaviour pattern for solving modern engineering problems. Among various optimization techniques, heuristic and meta-heuristic algorithms proved to be efficient. In this paper, an effort is made to address techniques that are commonly used in engineering applications. This paper presents a basic overview of such optimization algorithms namely Artificial Bee Colony (ABC) Algorithm, Ant Colony Optimization (ACO) Algorithm, Fire-fly Algorithm (FFA) and Particle
Styles: APA, Harvard, Vancouver, ISO, etc.
9

RAO, Xiong, Run DU, Wenming CHENG, and Yi YANG. "Modified proportional topology optimization algorithm for multiple optimization problems." Mechanics 30, no. 1 (February 23, 2024): 36–45. http://dx.doi.org/10.5755/j02.mech.34367.

Full text of the source
Abstract:
Three modified proportional topology optimization (MPTO) algorithms are presented in this paper, which are named MPTOc, MPTOs and MPTOm, respectively. MPTOc aims to address the minimum compliance problem with volume constraint, MPTOs aims to solve the minimum volume fraction problem under stress constraint, and MPTOm aims to tackle the minimum volume fraction problem under compliance and stress constraints. In order to get rid of the shortcomings of the original proportional topology optimization (PTO) algorithm and improve the comprehensive performance of the PTO algorithm, the proposed algor
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Arıcı, FerdaNur, and Ersin Kaya. "Comparison of Meta-heuristic Algorithms on Benchmark Functions." Academic Perspective Procedia 2, no. 3 (November 22, 2019): 508–17. http://dx.doi.org/10.33793/acperpro.02.03.41.

Full text of the source
Abstract:
Optimization is a process to search the most suitable solution for a problem within an acceptable time interval. The algorithms that solve the optimization problems are called as optimization algorithms. In the literature, there are many optimization algorithms with different characteristics. The optimization algorithms can exhibit different behaviors depending on the size, characteristics and complexity of the optimization problem. In this study, six well-known population based optimization algorithms (artificial algae algorithm - AAA, artificial bee colony algorithm - ABC, differential evolu
Styles: APA, Harvard, Vancouver, ISO, etc.
More sources

Dissertations on the topic "Optimization algorithms"

1

Astete Morales, Sandra. "Contributions to Convergence Analysis of Noisy Optimization Algorithms." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS327/document.

Full text of the source
Abstract:
This thesis presents contributions to the analysis of algorithms for the optimization of noisy functions. Convergence rates (simple regret and cumulative regret) are analyzed for line-search algorithms as well as for random-search algorithms. We prove that algorithms based on the Hessian matrix can attain the same result as certain optimal algorithms when their parameters are well chosen. Moreover, we analyze the convergence order of evolution strategies for noisy functions. We derive a log-l
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Reimann, Axel. "Evolutionary algorithms and optimization." Doctoral thesis, [S.l. : s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=969093497.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Parpas, Panayiotis. "Algorithms for stochastic optimization." Thesis, Imperial College London, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.434980.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Johnson, Jared. "Algorithms for Rendering Optimization." Doctoral diss., University of Central Florida, 2012. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5329.

Full text of the source
Abstract:
This dissertation explores algorithms for rendering optimization realizable within a modern, complex rendering engine. The first part contains optimized rendering algorithms for ray tracing. Ray tracing algorithms typically provide properties of simplicity and robustness that are highly desirable in computer graphics. We offer several novel contributions to the problem of interactive ray tracing of complex lighting environments. We focus on the problem of maintaining interactivity as both geometric and lighting complexity grows without effecting the simplicity or robustness of ray tracing. Fir
Styles: APA, Harvard, Vancouver, ISO, etc.
5

CESARI, TOMMASO RENATO. "ALGORITHMS, LEARNING, AND OPTIMIZATION." Doctoral thesis, Università degli Studi di Milano, 2020. http://hdl.handle.net/2434/699354.

Full text of the source
Abstract:
This thesis covers some algorithmic aspects of online machine learning and optimization. In Chapter 1 we design algorithms with state-of-the-art regret guarantees for the problem dynamic pricing. In Chapter 2 we move on to an asynchronous online learning setting in which only some of the agents in the network are active at each time step. We show that when information is shared among neighbors, knowledge about the graph structure might have a significantly different impact on learning rates depending on how agents are activated. In Chapter 3 we investigate the online problem of multivariate no
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Stults, Ian Collier. "A multi-fidelity analysis selection method using a constrained discrete optimization formulation." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31706.

Full text of the source
Abstract:
Thesis (Ph.D)--Aerospace Engineering, Georgia Institute of Technology, 2010. Committee Chair: Mavris, Dimitri; Committee Member: Beeson, Don; Committee Member: Duncan, Scott; Committee Member: German, Brian; Committee Member: Kumar, Viren. Part of the SMARTech Electronic Thesis and Dissertation Collection.
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Rafique, Abid. "Communication optimization in iterative numerical algorithms : an algorithm-architecture interaction." Thesis, Imperial College London, 2014. http://hdl.handle.net/10044/1/17837.

Full text of the source
Abstract:
Trading communication with redundant computation can increase the silicon efficiency of common hardware accelerators like FPGA and GPU in accelerating sparse iterative numerical algorithms. While iterative numerical algorithms are extensively used in solving large-scale sparse linear system of equations and eigenvalue problems, they are challenging to accelerate as they spend most of their time in communication-bound operations, like sparse matrix-vector multiply (SpMV) and vector-vector operations. Communication is used in a general sense to mean moving the matrix and the vectors within the c
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Dost, Banu. "Optimization algorithms for biological data." Diss., [La Jolla] : University of California, San Diego, 2010. http://wwwlib.umi.com/cr/ucsd/fullcit?p3397170.

Full text of the source
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2010. Title from first page of PDF file (viewed March 23, 2010). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 149-159).
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Xiong, Xiaoping. "Stochastic optimization algorithms and convergence /." College Park, Md. : University of Maryland, 2005. http://hdl.handle.net/1903/2360.

Full text of the source
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2005. Thesis research directed by: Business and Management. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Quttineh, Nils-Hassan. "Algorithms for Costly Global Optimization." Licentiate thesis, Mälardalen University, School of Education, Culture and Communication, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-5970.

Full text of the source
Abstract:
There exists many applications with so-called costly problems, which means that the objective function you want to maximize or minimize cannot be described using standard functions and expressions. Instead one considers these objective functions as "black box" where the parameter values are sent in and a function value is returned. This implies in particular that no derivative information is available. The reason for describing these problems as expensive is that it may take a long time to calculate a single function value. The black box could, for example, solve a large system of differen
Styles: APA, Harvard, Vancouver, ISO, etc.
More sources

Books on the topic "Optimization algorithms"

1

Spedicato, Emilio, ed. Algorithms for Continuous Optimization. Dordrecht: Springer Netherlands, 1994. http://dx.doi.org/10.1007/978-94-009-0369-2.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Cheng, Shi, and Yuhui Shi, eds. Brain Storm Optimization Algorithms. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-15070-9.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Kreher, Donald L., ed. Graphs, algorithms, and optimization. Boca Raton: Chapman & Hall/CRC, 2005.

Find the full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Dennis, J. E., and Institute for Computer Applications in Science and Engineering, eds. Algorithms for bilevel optimization. Hampton, VA: Institute for Computer Applications in Science and Engineering, NASA Langley Research Center, 1994.

Find the full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Rieger, Heiko, ed. Optimization algorithms in physics. Berlin: Wiley-VCH, 2002.

Find the full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Pereira, Ana I., Florbela P. Fernandes, João P. Coelho, João P. Teixeira, Maria F. Pacheco, Paulo Alves, and Rui P. Lopes, eds. Optimization, Learning Algorithms and Applications. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-91885-9.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Grötschel, Martin, László Lovász, and Alexander Schrijver. Geometric Algorithms and Combinatorial Optimization. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-642-78240-4.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Uryasev, Stanislav, and Panos M. Pardalos, eds. Stochastic Optimization: Algorithms and Applications. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-6594-6.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Jansen, Klaus, and José Rolim, eds. Approximation Algorithms for Combinatorial Optimization. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0053958.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Migdalas, Athanasios, Panos M. Pardalos, and Peter Värbrand, eds. Multilevel Optimization: Algorithms and Applications. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4613-0307-7.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
More sources

Book chapters on the topic "Optimization algorithms"

1

Löhne, Andreas. "Algorithms." In Vector Optimization, 161–95. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-18351-5_6.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Khamehchi, Ehsan, and Mohammad Reza Mahdiani. "Optimization Algorithms." In SpringerBriefs in Petroleum Geoscience & Engineering, 35–46. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-51451-2_4.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Chen, Po, and En-Jui Lee. "Optimization Algorithms." In Full-3D Seismic Waveform Inversion, 311–43. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-16604-9_5.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Yang, Xin-She. "Optimization Algorithms." In Computational Optimization, Methods and Algorithms, 13–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20859-1_2.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Buljak, Vladimir. "Optimization Algorithms." In Computational Fluid and Solid Mechanics, 19–83. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22703-5_2.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Dolzhenko, Viktoria. "Optimization Algorithms." In Algorithmic Trading Systems and Strategies: A New Approach, 215–64. Berkeley, CA: Apress, 2024. http://dx.doi.org/10.1007/979-8-8688-0357-4_6.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Rong, Hai-Jun, and Zhao-Xu Yang. "Optimization Algorithms." In Sequential Intelligent Dynamic System Modeling and Control, 29–44. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1541-1_3.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Liquet, Benoit, Sarat Moka, and Yoni Nazarathy. "Optimization Algorithms." In Mathematical Engineering of Deep Learning, 111–63. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003298687-4.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Stefanov, Stefan M. "The Algorithms." In Applied Optimization, 159–74. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-3417-1_7.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Virant, Jernej. "Fuzzy Algorithms." In Applied Optimization, 65–78. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/978-1-4615-4673-3_4.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.

Conference papers on the topic "Optimization algorithms"

1

Anjum, Ishraq Md, Davorin Peceli, Francesco Capuano, and Bedrich Rus. "High-Power Laser Pulse Shape Optimization with Hybrid Stochastic Optimization Algorithms." In Frontiers in Optics, JD4A.55. Washington, D.C.: Optica Publishing Group, 2024. https://doi.org/10.1364/fio.2024.jd4a.55.

Full text of the source
Abstract:
We evaluate five optimization algorithms for laser pulse temporal shape optimization, using a semi-physical model of a high-power laser. Hybrid algorithms combine Differential Evolution and Bayesian optimization algorithm exploration with Nelder-Mead exploitation, exhibiting superior performance.
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Yang, Fan, and Shixuan Wang. "Optimization of Denoising Algorithms." In 2024 3rd International Conference on Artificial Intelligence and Computer Information Technology (AICIT), 1–4. IEEE, 2024. http://dx.doi.org/10.1109/aicit62434.2024.10730223.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

V, Manojkumar, and R. Mahalakshmi. "Test Case Optimization Through Ant Colony Optimization Enabled Boosted Regression Model." In 2024 International Conference on Intelligent Algorithms for Computational Intelligence Systems (IACIS), 1–6. IEEE, 2024. http://dx.doi.org/10.1109/iacis61494.2024.10721935.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Wang, Hui, Pei Xiong, and Xinwu Wu. "Optimization analysis of mine car scheduling based on fruit fly optimization algorithm and improved immune particle swarm optimization algorithm." In Fourth International Conference on Advanced Algorithms and Neural Networks (AANN 2024), edited by Qinghua Lu and Weishan Zhang, 87. SPIE, 2024. http://dx.doi.org/10.1117/12.3049683.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Li, Jiahe, Erick Gomez, Mahmoud Almasri, Ed Kinzel, and Derek T. Anderson. "Differentiability-oriented multispectral sensor optimization." In Algorithms, Technologies, and Applications for Multispectral and Hyperspectral Imaging XXXI, edited by David W. Messinger and Miguel Velez-Reyes, 6. SPIE, 2025. https://doi.org/10.1117/12.3053468.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Jing, Tianxu, Hailei Meng, Feng Pang, Xiaojun Dou, and Yue Liu. "Research on UAV nest assignment optimization based on particle swarm optimization algorithm." In Fourth International Conference on Advanced Algorithms and Neural Networks (AANN 2024), edited by Qinghua Lu and Weishan Zhang, 118. SPIE, 2024. http://dx.doi.org/10.1117/12.3049818.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Fadhil, Heba. "Metaheuristic Algorithms in Optimization and its Application: A Review." In The 3rd International Conference On Engineering And Innovative Technology. Salahaddin University-Erbil, 2025. https://doi.org/10.31972/iceit2024.013.

Full text of the source
Abstract:
Metaheuristic algorithms are an intelligent way of thinking and working developed for resolving diverse issues about optimization. The number of potential solutions for such problems often is too large to be properly analyzed using standard procedures; thus, these algorithms are highly flexible and can be useful in many cases where needed to predict different types of optimizations accurately. Metaheuristics take inspiration from several natural processes like evolution or animal behavior, which allow them to show strength without being specific only towards one area. Some Metaheuristics algor
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Liu, Jihong, and Sen Zeng. "A Survey of Assembly Planning Based on Intelligent Optimization Algorithms." In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49445.

Full text of the source
Abstract:
Assembly planning is one of the NP complete problems, which is even more difficult to solve for complex products. Intelligent optimization algorithms have obvious advantages to deal with such combinatorial problems. Various intelligent optimization algorithms have been applied to assembly sequence planning and optimization in the last decade. This paper surveys the state-of-the-art of the assembly planning methods based on the intelligent optimization algorithms. Five intelligent optimization algorithms, i.e. genetic algorithm (GA), artificial neural networks (ANN), simulated annealing (SA), a
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Hamann, Hendrik F. "Optimization Algorithms for Energy-Efficient Data Centers." In ASME 2013 International Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Microsystems. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/ipack2013-73066.

Full text of the source
Abstract:
Real-time optimization algorithms for managing energy efficiency in data centers have been developed and implemented. For example, for a given cooling configuration (which is being measured and modeled in real-time using IBM’s Measurement and Management Technologies) an optimization algorithm allows identifying the optimum placement of new servers (or workloads) or alternatively where to remove servers (or workload) for different constraints. Another optimization algorithm optimizes performance of the data center without creating hotspots. The optimization algorithms use a physical model in ju
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Mekhilef, Mounib, and Mohamed B. Trabia. "Successive Twinkling Simplex Search Optimization Algorithms." In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/dac-21132.

Full text of the source
Abstract:
Simplex algorithms have been proven to be a reliable nonlinear programming pattern search algorithm. The effectiveness of simplex however reduces when the solved problem has large number of variables, several local minima, or when initial guess is not readily available. Recent results obtained by introducing a technique of random selection of the variables in optimization processes, encouraged studying the effect of this idea on the Nelder & Mead version of the simplex algorithm to improve its semi-global behavior. This paper proposes several enhancements to the simplex. The algor
Styles: APA, Harvard, Vancouver, ISO, etc.

Reports of organizations on the topic "Optimization algorithms"

1

Parekh, Ojas D., Ciaran Ryan-Anderson, and Sevag Gharibian. Quantum Optimization and Approximation Algorithms. Office of Scientific and Technical Information (OSTI), January 2019. http://dx.doi.org/10.2172/1492737.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, July 1988. http://dx.doi.org/10.21236/ada204389.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Prieto, Francisco J. Sequential Quadratic Programming Algorithms for Optimization. Fort Belvoir, VA: Defense Technical Information Center, August 1989. http://dx.doi.org/10.21236/ada212800.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, July 1986. http://dx.doi.org/10.21236/ada182531.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Mifflin, R. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, July 1985. http://dx.doi.org/10.21236/ada159168.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, December 1990. http://dx.doi.org/10.21236/ada231110.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Prieto, F. Sequential quadratic programming algorithms for optimization. Office of Scientific and Technical Information (OSTI), August 1989. http://dx.doi.org/10.2172/5325989.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Apostolatos, A., B. Keith, C. Soriano, and R. Rossi. D6.1 Deterministic optimization software. Scipedia, 2021. http://dx.doi.org/10.23967/exaqute.2021.2.018.

Full text of the source
Abstract:
This deliverable focuses on the implementation of deterministic optimization algorithms and problem solvers within KRATOS open-source software. One of the main challenges of optimization algorithms in Finite-Element based optimization is how to get the gradient of response functions which are used as objective and constraints when this is not available in an explicit form. The idea is to use local sensitivity analysis to get the gradient of the response function(s)
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Plotkin, Serge. Research in Graph Algorithms and Combinatorial Optimization. Fort Belvoir, VA: Defense Technical Information Center, March 1995. http://dx.doi.org/10.21236/ada292630.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Nocedal, J. Algorithms and software for large scale optimization. Office of Scientific and Technical Information (OSTI), May 1990. http://dx.doi.org/10.2172/5688791.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.