Journal articles on the topic 'Inexact line searches'

Consult the top 50 journal articles for your research on the topic 'Inexact line searches.'

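All of the entries below concern inexact line searches, which accept any step length that passes a cheap sufficient-decrease test instead of minimizing exactly along the search direction. As a purely illustrative sketch for orientation (not drawn from any listed article; the function name and constants are our own choices), the canonical backtracking Armijo rule looks like this:

```python
import numpy as np

def backtracking_armijo(f, grad_f, x, d, alpha0=1.0, c1=1e-4, rho=0.5, max_iter=50):
    """Backtracking line search enforcing the Armijo (sufficient decrease)
    condition f(x + a*d) <= f(x) + c1 * a * grad_f(x).d.
    Returns an acceptable step length a (or the last trial if max_iter is hit)."""
    fx = f(x)
    slope = np.dot(grad_f(x), d)  # directional derivative; d must be a descent direction
    a = alpha0
    for _ in range(max_iter):
        if f(x + a * d) <= fx + c1 * a * slope:
            return a
        a *= rho  # shrink the step and try again
    return a

# Usage: minimize f(x) = x1^2 + 4*x2^2 from x = (2, 1) along steepest descent.
f = lambda x: x[0]**2 + 4 * x[1]**2
g = lambda x: np.array([2 * x[0], 8 * x[1]])
x = np.array([2.0, 1.0])
d = -g(x)
a = backtracking_armijo(f, g, x, d)
assert f(x + a * d) < f(x)  # the accepted step strictly decreases f
```

Wolfe-type line searches, which appear throughout the CG and quasi-Newton papers listed here, add a curvature test on top of this sufficient-decrease condition.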
1

Shi, Zhen-Jun, and Jie Shen. "A gradient-related algorithm with inexact line searches." Journal of Computational and Applied Mathematics 170, no. 2 (2004): 349–70. http://dx.doi.org/10.1016/j.cam.2003.10.025.

2

Al-Bayati, Abbas Y., and Basim A. Hassan. "A Spectral Conjugate Gradient Method with Inexact line searches." IRAQI JOURNAL OF STATISTICAL SCIENCES 11, no. 20 (2011): 155–63. http://dx.doi.org/10.33899/iqjoss.2011.27895.

4

Al-Assady, N. H., and A. Y. Al-Bayati. "Minimization of extended quadratic functions with inexact line searches." Journal of Optimization Theory and Applications 82, no. 1 (1994): 139–47. http://dx.doi.org/10.1007/bf02191784.

5

Burachik, R., L. M. Graña Drummond, A. N. Iusem, and B. F. Svaiter. "Full convergence of the steepest descent method with inexact line searches." Optimization 32, no. 2 (1995): 137–46. http://dx.doi.org/10.1080/02331939508844042.

6

Sherali, Hanif D., and Osman Ulular. "Conjugate gradient methods using quasi-Newton updates with inexact line searches." Journal of Mathematical Analysis and Applications 150, no. 2 (1990): 359–77. http://dx.doi.org/10.1016/0022-247x(90)90109-s.

7

Al-Bayati, Abbas Younis, and Muna M. M. Ali. "New multi-step three-term conjugate gradient algorithms with inexact line searches." Indonesian Journal of Electrical Engineering and Computer Science 19, no. 3 (2020): 1564–73. http://dx.doi.org/10.11591/ijeecs.v19.i3.pp1564-1573.

Abstract:
This work suggests several multi-step three-term Conjugate Gradient (CG) algorithms that satisfy the sufficient descent property and conjugacy conditions. First, a number of well-known three-term CG methods are considered, and two new classes of algorithms of this type are then suggested, based on the Hestenes-Stiefel (HS) and Polak-Ribière (PR) formulas, with four different versions. Both the descent and conjugacy conditions are satisfied for all the proposed algorithms at each iteration by using the strong Wolfe line search condition and its accelerated version. …
9

Al-Saidi, Amal. "Improved Fletcher-Reeves Methods Based on New Scaling Techniques." Sultan Qaboos University Journal for Science [SQUJS] 26, no. 2 (2021): 141–51. http://dx.doi.org/10.53539/squjs.vol26iss2pp141-151.

Abstract:
This paper introduces a scaling parameter to the Fletcher-Reeves (FR) nonlinear conjugate gradient method. The main aim is to improve its theoretical and numerical properties when applied with inexact line searches to unconstrained optimization problems. We show that the sufficient descent and global convergence properties of Al-Baali for the FR method with a fairly accurate line search are maintained. We also consider the possibility of extending this result to less accurate line search for appropriate values of the scaling parameter. The reported numerical results show that several values …
10

Nazareth, J. L. "Analogues of Dixon’s and Powell’s Theorems for Unconstrained Minimization with Inexact Line Searches." SIAM Journal on Numerical Analysis 23, no. 1 (1986): 170–77. http://dx.doi.org/10.1137/0723012.

11

Younis, Maha. "A New Restarting Criterion for FR-CG Method with Exact and Inexact Line Searches." AL-Rafidain Journal of Computer Sciences and Mathematics 5, no. 2 (2008): 95–110. http://dx.doi.org/10.33899/csmj.2008.163975.

12

Rivaie, Mohd, Mustafa Mamat, and Abdelrhaman Abashar. "A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches." Applied Mathematics and Computation 268 (October 2015): 1152–63. http://dx.doi.org/10.1016/j.amc.2015.07.019.

13

Ghani, N. H. A., N. S. Mohamed, N. Zull, S. Shoid, M. Rivaie, and M. Mamat. "Performance comparison of a new hybrid conjugate gradient method under exact and inexact line searches." Journal of Physics: Conference Series 890 (September 2017): 012106. http://dx.doi.org/10.1088/1742-6596/890/1/012106.

14

Deng, N. Y., and Z. F. Li. "Some global convergence properties of a conic-variable metric algorithm for minimization with inexact line searches." Optimization Methods and Software 5, no. 1 (1995): 105–22. http://dx.doi.org/10.1080/10556789508805604.

15

Ariyawansa, K. A., and N. Begashaw. "Global Convergence of a Class of Collinear Scaling Algorithms with Inexact Line Searches on Convex Functions." Computing 63, no. 2 (1999): 145–69. http://dx.doi.org/10.1007/s006070050056.

16

Li, Pengyuan, Junyu Lu, and Haishan Feng. "The Global Convergence of a Modified BFGS Method under Inexact Line Search for Nonconvex Functions." Mathematical Problems in Engineering 2021 (May 4, 2021): 1–9. http://dx.doi.org/10.1155/2021/8342536.

Abstract:
Among the quasi-Newton algorithms, the BFGS method is often discussed by related scholars. However, in the case of inexact Wolfe line searches or even exact line search, the global convergence of the BFGS method for nonconvex functions has still not been proven. Based on the aforementioned issues, we propose a new quasi-Newton algorithm to obtain a better convergence property; it is designed according to the following essentials: (1) a modified BFGS formula is designed to guarantee that B_{k+1} inherits the positive definiteness of B_k; (2) a modified weak Wolfe–Powell line search is recommended; …
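Entry 16 above, like many papers in this list, hinges on Wolfe-Powell line search conditions. As an illustrative aid, the generic textbook conditions (not the modified rule proposed in the cited paper; the function name and default constants are our own) can be checked as follows:

```python
import numpy as np

def satisfies_wolfe(f, grad_f, x, d, a, c1=1e-4, c2=0.9, strong=True):
    """Check the Wolfe conditions for a trial step length a along direction d:
      Armijo (sufficient decrease): f(x + a d) <= f(x) + c1 * a * g0
      Weak curvature:               grad_f(x + a d).d >= c2 * g0
      Strong curvature:             |grad_f(x + a d).d| <= c2 * |g0|
    where g0 = grad_f(x).d (negative for a descent direction)."""
    g0 = np.dot(grad_f(x), d)
    armijo = f(x + a * d) <= f(x) + c1 * a * g0
    ga = np.dot(grad_f(x + a * d), d)
    curvature = abs(ga) <= c2 * abs(g0) if strong else ga >= c2 * g0
    return bool(armijo and curvature)

# Usage on f(x) = 0.5 ||x||^2 with the steepest descent direction d = -x:
f = lambda x: 0.5 * float(np.dot(x, x))
g = lambda x: x
x = np.array([2.0])
d = -g(x)
print(satisfies_wolfe(f, g, x, d, 1.0))   # exact minimizing step: both conditions hold
print(satisfies_wolfe(f, g, x, d, 0.01))  # tiny step: Armijo holds but curvature fails
```

The curvature condition is what rules out the uselessly small steps that the Armijo test alone would accept.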
17

Beltracchi, T. J., and G. A. Gabriele. "A Hybrid Variable Metric Update for the Recursive Quadratic Programming Method." Journal of Mechanical Design 113, no. 3 (1991): 280–85. http://dx.doi.org/10.1115/1.2912780.

Abstract:
The Recursive Quadratic Programming (RQP) method has become known as one of the most effective and efficient algorithms for solving engineering optimization problems. The RQP method uses variable metric updates to build approximations of the Hessian of the Lagrangian. If the approximation of the Hessian of the Lagrangian converges to the true Hessian of the Lagrangian, then the RQP method converges quadratically. The choice of a variable metric update has a direct effect on the convergence of the Hessian approximation. Most of the research performed with the RQP method uses some modification of …
18

Ivan, Latif S., and Mohammed J. Lajan. "Global convergence of new modified CG method with inexact line search." Journal of Zankoy Sulaimani - Part A 16, no. 2 (2014): 17–26. http://dx.doi.org/10.17656/jzs.10290.

19

He, Yang Jun, and Gui Jun Zhang. "Global Optimization of Tersoff Clusters Using Differential Evolution with Inexact Line Search." Applied Mechanics and Materials 48-49 (February 2011): 565–68. http://dx.doi.org/10.4028/www.scientific.net/amm.48-49.565.

Abstract:
Differential Evolution with Inexact Line Search (DEILS) is proposed for determining the ground-state geometry of atom clusters. The DEILS algorithm adopts a probabilistic inexact line search method in the acceptance rule of differential evolution to accelerate convergence as the region of the global minimum is approached. More realistic many-body potential energy functions, namely the Tersoff and Tersoff-like semi-empirical potentials for silicon, are considered. Numerical studies indicate that the new algorithm is considerably faster and more reliable than the original differential evolution algorithm …
20

Liu, Meiling, Xueqian Li, and Qinmin Wu. "A Filter Algorithm with Inexact Line Search." Mathematical Problems in Engineering 2012 (2012): 1–20. http://dx.doi.org/10.1155/2012/349178.

Abstract:
A filter algorithm with inexact line search is proposed for solving nonlinear programming problems. The filter is constructed by employing the norm of the gradient of the Lagrangian function to the infeasibility measure. Transition to superlinear local convergence is shown for the proposed filter algorithm without second-order correction. Under mild conditions, the global convergence can also be derived. Numerical experiments show the efficiency of the algorithm.
21

Shi, Z. J., and J. Shen. "New Inexact Line Search Method for Unconstrained Optimization." Journal of Optimization Theory and Applications 127, no. 2 (2005): 425–46. http://dx.doi.org/10.1007/s10957-005-6553-6.

22

Wang, Zhujun, and Li Cai. "A Class of Inexact Secant Algorithms with Line Search Filter Method for Nonlinear Programming." Mathematical Problems in Engineering 2021 (November 10, 2021): 1–9. http://dx.doi.org/10.1155/2021/6253424.

Abstract:
We propose a class of inexact secant methods in association with the line search filter technique for solving nonlinear equality constrained optimization. Compared with other filter methods that combine the line search method, applied in most large-scale optimization problems, the inexact line search filter algorithm is more flexible and realizable. In this paper, we focus on the analysis of the local superlinear convergence rate of the algorithms, while their global convergence properties can be obtained by analogy with our previous work. These methods have been implemented in a Matlab …
23

Ishak, M. I., S. M. Marjugi, and L. W. June. "A new modified conjugate gradient method under the strong Wolfe line search for solving unconstrained optimization problems." Mathematical Modeling and Computing 9, no. 1 (2022): 111–18. http://dx.doi.org/10.23939/mmc2022.01.111.

Abstract:
The conjugate gradient (CG) method is well known for its efficiency in solving unconstrained optimization problems, owing to its convergence properties and low computational cost. Nowadays, the method is widely developed to compete with existing methods in terms of efficiency. In this paper, a modification of the CG method is proposed under the strong Wolfe line search. A new CG coefficient is presented, based on the idea of making use of some parts of previously existing CG methods to retain their advantages. The proposed method guarantees that the sufficient descent condition holds and is globally c…
24

Al-Bayati, Abbas, and Ivan Latif. "A New Preconditioned Inexact Line-Search Technique for Unconstrained Optimization." AL-Rafidain Journal of Computer Sciences and Mathematics 9, no. 2 (2012): 25–39. http://dx.doi.org/10.33899/csmj.2012.163698.

25

Su, Kaikai, and Anhua Guo. "Hardware implementation of the DFP algorithm using inexact line search." Journal of Physics: Conference Series 2010, no. 1 (2021): 012081. http://dx.doi.org/10.1088/1742-6596/2010/1/012081.

26

Bonettini, S., I. Loris, F. Porta, and M. Prato. "Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization." SIAM Journal on Optimization 26, no. 2 (2016): 891–921. http://dx.doi.org/10.1137/15m1019325.

27

Shi, Zhen-Jun. "Convergence of quasi-Newton method with new inexact line search." Journal of Mathematical Analysis and Applications 315, no. 1 (2006): 120–31. http://dx.doi.org/10.1016/j.jmaa.2005.05.077.

28

Fischer, Andreas, and Ana Friedlander. "A new line search inexact restoration approach for nonlinear programming." Computational Optimization and Applications 46, no. 2 (2009): 333–46. http://dx.doi.org/10.1007/s10589-009-9267-0.

29

Chergui, Ahmed, and Tahar Bouali. "Global convergence of new conjugate gradient method with inexact line search." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 2 (2021): 1469–75. http://dx.doi.org/10.11591/ijece.v11i2.pp1469-1475.

Abstract:
In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies a sufficient descent condition and achieves global convergence under the inexact strong Wolfe-Powell line search. Our numerical experiments show the efficiency of the new method in solving a set of problems from the CUTEst package; the proposed formula gives excellent numerical results in CPU time, number of iterations, and number of gradient evaluations when compared with the WYL, DY, PRP, and FR methods.
31

Alkhouli, Talat. "An Efficient Hybrid Conjugate Gradient Coefficient under Inexact Line Search." International Journal of Advanced Trends in Computer Science and Engineering 9, no. 1 (2020): 784–88. http://dx.doi.org/10.30534/ijatcse/2020/112912020.

32

Ning, Ma, Sun Juhe, and Xu Liang. "Fatigue S-N Curve Fitting with Inexact Line Search Newton’s Method." Information Technology Journal 12, no. 18 (2013): 4404–7. http://dx.doi.org/10.3923/itj.2013.4404.4407.

33

Hamoda, Mohamed, Mohd Rivaie, Mustafa Mamat, and Zabidin Salleh. "A conjugate gradient method with inexact line search for unconstrained optimization." Applied Mathematical Sciences 9 (2015): 1823–32. http://dx.doi.org/10.12988/ams.2015.411995.

34

von Heusinger, A., and C. Kanzow. "Relaxation Methods for Generalized Nash Equilibrium Problems with Inexact Line Search." Journal of Optimization Theory and Applications 143, no. 1 (2009): 159–83. http://dx.doi.org/10.1007/s10957-009-9553-0.

35

Wang, Zhujun, Li Cai, and Detong Zhu. "Line search filter inexact secant methods for nonlinear equality constrained optimization." Applied Mathematics and Computation 263 (July 2015): 47–58. http://dx.doi.org/10.1016/j.amc.2015.04.016.

36

Yang, Xiaoguang. "On the convergence of multiplicative iterative algorithms with inexact line search." Acta Mathematicae Applicatae Sinica 13, no. 4 (1997): 337–41. http://dx.doi.org/10.1007/bf02009541.

37

Fadhilah, Nurul Hafawati, Mohd Rivaie, Fuziyah Ishak, and Nur Idalisa. "New Three-Term Conjugate Gradient Method with Exact Line Search." MATEMATIKA 36, no. 3 (2020): 197–207. http://dx.doi.org/10.11113/matematika.v36.n3.1214.

Abstract:
Conjugate Gradient (CG) methods have an important role in solving large-scale unconstrained optimization problems. Nowadays, the Three-Term CG method has become a research trend of the CG methods. However, the existing Three-Term CG methods could only be used with the inexact line search. When the exact line search is applied, this Three-Term CG method will be reduced to the standard CG method. Hence in this paper, a new Three-Term CG method that could be used with the exact line search is proposed. This new Three-Term CG method satisfies the descent condition using the exact line search. Performance …
38

Cai, Shang-Rong, and Feng-Nan Hwang. "A hybrid-line-and-curve search globalization technique for inexact Newton methods." Applied Numerical Mathematics 173 (March 2022): 79–93. http://dx.doi.org/10.1016/j.apnum.2021.11.011.

39

Long, Xianjun, Xiaoting Wang, and Gaoxi Li. "Inexact Proximal Quasi-Newton Algorithms with Line Search for Fractional Optimization Problems." Pacific Journal of Optimization 2024 (2024): 21. http://dx.doi.org/10.61208/pjo-2024-021.

40

Amini, Keyvan, Masoud Ahookhosh, and Hadi Nosratipour. "An inexact line search approach using modified nonmonotone strategy for unconstrained optimization." Numerical Algorithms 66, no. 1 (2013): 49–78. http://dx.doi.org/10.1007/s11075-013-9723-x.

41

Cai, Li, and Detong Zhu. "A line search filter inexact SQP method for nonlinear equality constrained optimization." Journal of Systems Science and Complexity 25, no. 5 (2012): 950–63. http://dx.doi.org/10.1007/s11424-012-0018-4.

42

Hu, X. D. "A counterexample of the convergence of Rosen's algorithm with inexact line search." Operations Research Letters 13, no. 2 (1993): 95–97. http://dx.doi.org/10.1016/0167-6377(93)90035-f.

43

Halilu, Abubakar, Mohammed Waziri, and Ibrahim Yusuf. "Efficient matrix-free direction method with line search for solving large-scale system of nonlinear equations." Yugoslav Journal of Operations Research 30, no. 4 (2020): 399–412. http://dx.doi.org/10.2298/yjor160515005h.

Abstract:
We propose a matrix-free direction with an inexact line search technique to solve systems of nonlinear equations by using a double direction approach. In this article, we approximate the Jacobian matrix by an appropriately constructed matrix-free method via an acceleration parameter. The global convergence of our method is established under mild conditions. Numerical comparisons reported in this paper are based on a set of large-scale test problems and show that the proposed method is efficient for large-scale problems.
44

Li, Xiangrong, Songhua Wang, Zhongzhou Jin, and Hongtruong Pham. "A Conjugate Gradient Algorithm under Yuan-Wei-Lu Line Search Technique for Large-Scale Minimization Optimization Models." Mathematical Problems in Engineering 2018 (2018): 1–11. http://dx.doi.org/10.1155/2018/4729318.

Abstract:
This paper gives a modified Hestenes and Stiefel (HS) conjugate gradient algorithm under the Yuan-Wei-Lu inexact line search technique for large-scale unconstrained optimization problems, where the proposed algorithm has the following properties: (1) the new search direction possesses not only a sufficient descent property but also a trust region feature; (2) the presented algorithm has global convergence for nonconvex functions; (3) the numerical experiment showed that the new algorithm is more effective than similar algorithms.
45

Al-Naemi, Ghada Moayid. "A new modified HS algorithm with strong Powell-Wolfe line search for unconstrained optimization." Eastern-European Journal of Enterprise Technologies 2, no. 4 (116) (2022): 14–21. http://dx.doi.org/10.15587/1729-4061.2022.254017.

Abstract:
Optimization is now considered a branch of computational science. This ethos seeks to answer the question "what is best?" by looking at problems where the quality of any answer can be expressed numerically. One of the most well-known methods for solving nonlinear, unrestricted optimization problems is the conjugate gradient (CG) method. The Hestenes and Stiefel (HS-CG) formula is one of the century's oldest and most effective formulas. When using an exact line search, the HS method achieves global convergence; however, this is not guaranteed when using an inexact line search (ILS). Furthermore, …
47

Fakhri, Muhammad Imza, Mohd Rivaie Mohd Ali, and Ibrahim Jusoh. "An n-th section line search in conjugate gradient method for small-scale unconstrained optimization." Malaysian Journal of Fundamental and Applied Sciences 13, no. 4 (2017): 588–92. http://dx.doi.org/10.11113/mjfas.v0n0.579.

Abstract:
Conjugate Gradient (CG) methods are well-known methods for solving unconstrained optimization problems and are popular for their low memory requirements. A lot of research and effort has been devoted to improving the efficiency of the CG method. In this paper, a new inexact line search is proposed based on the bisection line search. The bisection method is the simplest method for finding the root of a function; thus, it is an ideal method to employ in the CG method. This new modification is named the n-th section. In a nutshell, the proposed method is promising and more efficient compared to the original …
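Entry 47's n-th section method generalizes the bisection line search its abstract describes. For orientation, a plain bisection line search on the directional derivative, the idea that the n-th section approach builds on, can be sketched as follows (an illustrative sketch under our own naming and bracketing assumptions, not the authors' method):

```python
import numpy as np

def bisection_line_search(grad_f, x, d, lo=0.0, hi=1.0, tol=1e-8, max_iter=60):
    """Bisection on phi'(a) = grad_f(x + a*d).d: given a bracket [lo, hi] with
    phi'(lo) < 0 < phi'(hi), repeatedly halve it to locate a step where the
    directional derivative vanishes (the minimizer of f along d)."""
    dphi = lambda a: float(np.dot(grad_f(x + a * d), d))
    assert dphi(lo) < 0 < dphi(hi), "bracket must straddle the minimizer"
    while hi - lo > tol and max_iter > 0:
        mid = 0.5 * (lo + hi)
        if dphi(mid) < 0:
            lo = mid   # minimizer lies to the right
        else:
            hi = mid   # minimizer lies to the left
        max_iter -= 1
    return 0.5 * (lo + hi)

# Usage: f(x) = x1^2 + 4*x2^2, steepest descent from (2, 1).
g = lambda x: np.array([2 * x[0], 8 * x[1]])
x = np.array([2.0, 1.0])
d = -g(x)  # (-4, -8)
a = bisection_line_search(g, x, d)
```

Each halving evaluates the gradient once, which is exactly the cost that inexact rules such as Armijo backtracking try to avoid paying in full.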
48

严, 小快. "A Full Waveform Inversion Based on Inexact Monotone and Non-Monotone Line Search." Operations Research and Fuzziology 11, no. 01 (2021): 19–28. http://dx.doi.org/10.12677/orf.2021.111004.

49

Neto, Elias Salomão Helou, and Álvaro Rodolfo De Pierro. "On perturbed steepest descent methods with inexact line search for bilevel convex optimization." Optimization 60, no. 8-9 (2011): 991–1008. http://dx.doi.org/10.1080/02331934.2010.536231.

50

Liu, Jinghui, and Changfeng Ma. "A nonmonotone trust region method with new inexact line search for unconstrained optimization." Numerical Algorithms 64, no. 1 (2012): 1–20. http://dx.doi.org/10.1007/s11075-012-9652-0.
