Academic literature on the topic 'Convex minimization'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Convex minimization.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of each publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Convex minimization"
Li, Duan, Zhi-You Wu, Heung-Wing Joseph Lee, Xin-Min Yang, and Lian-Sheng Zhang. "Hidden Convex Minimization." Journal of Global Optimization 31, no. 2 (February 2005): 211–33. http://dx.doi.org/10.1007/s10898-004-5697-5.
Mayeli, Azita. "Non-convex Optimization via Strongly Convex Majorization-minimization." Canadian Mathematical Bulletin 63, no. 4 (December 10, 2019): 726–37. http://dx.doi.org/10.4153/s0008439519000730.
Scarpa, Luca, and Ulisse Stefanelli. "Stochastic PDEs via convex minimization." Communications in Partial Differential Equations 46, no. 1 (October 14, 2020): 66–97. http://dx.doi.org/10.1080/03605302.2020.1831017.
Thach, P. T. "Convex minimization under Lipschitz constraints." Journal of Optimization Theory and Applications 64, no. 3 (March 1990): 595–614. http://dx.doi.org/10.1007/bf00939426.
Mifflin, Robert, and Claudia Sagastizábal. "A VU-algorithm for convex minimization." Mathematical Programming 104, no. 2-3 (July 14, 2005): 583–608. http://dx.doi.org/10.1007/s10107-005-0630-3.
Shioura, Akiyoshi. "Minimization of an M-convex function." Discrete Applied Mathematics 84, no. 1-3 (May 1998): 215–20. http://dx.doi.org/10.1016/s0166-218x(97)00140-6.
O'Hara, John G., Paranjothi Pillay, and Hong-Kun Xu. "Iterative Approaches to Convex Minimization Problems." Numerical Functional Analysis and Optimization 25, no. 5-6 (January 2004): 531–46. http://dx.doi.org/10.1081/nfa-200041707.
Ye, Qiaolin, Chunxia Zhao, Ning Ye, and Xiaobo Chen. "Localized twin SVM via convex minimization." Neurocomputing 74, no. 4 (January 2011): 580–87. http://dx.doi.org/10.1016/j.neucom.2010.09.015.
Akagi, Goro, and Ulisse Stefanelli. "Doubly Nonlinear Equations as Convex Minimization." SIAM Journal on Mathematical Analysis 46, no. 3 (January 2014): 1922–45. http://dx.doi.org/10.1137/13091909x.
Stefanov, Stefan M. "Convex separable minimization with box constraints." PAMM 7, no. 1 (December 2007): 2060045–46. http://dx.doi.org/10.1002/pamm.200700535.
Full textDissertations / Theses on the topic "Convex minimization"
Nedić, Angelia. "Subgradient methods for convex minimization." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/16843.
Includes bibliographical references (p. 169-174).
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Many optimization problems arising in various applications require minimization of an objective cost function that is convex but not differentiable. Such a minimization arises, for example, in model construction, system identification, neural networks, pattern classification, and various assignment, scheduling, and allocation problems. To solve convex but not differentiable problems, we have to employ special methods that can work in the absence of differentiability, while taking advantage of convexity and possibly other special structure that our minimization problem may possess. In this thesis, we propose and analyze some new methods that can solve convex (not necessarily differentiable) problems. In particular, we consider two classes of methods: incremental and variable metric.
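The nondifferentiable convex setting described in this abstract can be illustrated with a basic subgradient method (a minimal sketch, not the thesis's incremental or variable-metric variants; the problem data below are made up): move along the negative of any subgradient with a diminishing step size, and keep the best iterate seen, since subgradient steps need not decrease the objective monotonically.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=5000):
    """Minimize a convex, possibly nondifferentiable f.

    Uses diminishing step sizes t_k = 0.5/sqrt(k+1) and tracks the best
    iterate, since individual subgradient steps need not decrease f.
    """
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(steps):
        g = subgrad(x)                    # any element of the subdifferential
        x = x - 0.5 / np.sqrt(k + 1) * g
        fx = f(x)
        if fx < f_best:
            x_best, f_best = x.copy(), fx
    return x_best, f_best

# Illustrative problem: f(x) = |2*x1 - 2| + |x2 - 1|, minimized at (1, 1).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 1.0])
f = lambda x: np.abs(A @ x - b).sum()
subgrad = lambda x: A.T @ np.sign(A @ x - b)

x_best, f_best = subgradient_method(f, subgrad, np.zeros(2))
```

With these step sizes the best objective gap decays at the standard O(1/sqrt(k)) rate for subgradient schemes.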
by Angelia Nedić.
Ph.D.
Apidopoulos, Vasileios. "Inertial Gradient-Descent algorithms for convex minimization." Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0175/document.
This thesis focuses on the study of inertial methods for solving composite convex minimization problems. Since the early works of Polyak and Nesterov, inertial methods have become very popular, thanks to their acceleration effects. Here, we study a family of Nesterov-type inertial proximal-gradient algorithms with a particular over-relaxation sequence. We give a unified presentation of the different convergence properties of this family of algorithms, depending on the over-relaxation parameter. In addition, we address this issue in the case of a smooth function with additional geometrical structure, such as the growth (or Łojasiewicz) condition. We show that by combining the growth condition with a flatness-type condition on the geometry of the function to minimize, we are able to obtain some new convergence rates. Our analysis follows a continuous-to-discrete trail, passing from continuous-time dynamical systems to discrete schemes. In particular, the family of inertial algorithms that interests us can be identified with a finite-difference scheme of a differential equation/inclusion. This approach provides a useful guideline that permits us to transpose the different results and their proofs from the continuous system to the discrete one. This opens the way for new possible inertial schemes derived from the same dynamical system.
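A Nesterov-type inertial proximal-gradient iteration of the kind studied in this thesis can be sketched on a small lasso problem (a hedged illustration: the inertia sequence (k-1)/(k-1+α) with α = 3 and the problem data are illustrative choices, not taken from the thesis):

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (componentwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_proximal_gradient(A, b, lam, alpha=3.0, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with inertia (k-1)/(k-1+alpha)."""
    L = np.linalg.eigvalsh(A.T @ A).max()   # Lipschitz constant of the gradient
    x = x_prev = np.zeros(A.shape[1])
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k - 1 + alpha) * (x - x_prev)   # inertial extrapolation
        grad = A.T @ (A @ y - b)
        x_prev, x = x, soft_threshold(y - grad / L, lam / L)  # proximal step
    return x

# Diagonal example so the minimizer can be checked coordinate-wise:
# coordinate 1 solves 4x - 4 + 0.5 = 0  ->  x = 0.875,
# coordinate 2 solves  x - 1 + 0.5 = 0  ->  x = 0.5.
A = np.diag([2.0, 1.0])
b = np.array([2.0, 1.0])
x_star = inertial_proximal_gradient(A, b, lam=0.5)
```

Setting α = 3 recovers the classical FISTA-style over-relaxation; larger α values are precisely the regime for which the thesis discusses additional convergence properties of the iterates.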
Gräser, Carsten [author]. "Convex minimization and phase field models / Carsten Gräser." Berlin: Freie Universität Berlin, 2011. http://d-nb.info/1026174848/34.
Full textEl, Gheche Mireille. "Proximal methods for convex minimization of Phi-divergences : application to computer vision." Thesis, Paris Est, 2014. http://www.theses.fr/2014PEST1018/document.
Convex optimization aims at searching for the minimum of a convex function over a convex set. While the theory of convex optimization has been largely explored for about a century, several related developments have stimulated a new interest in the topic. The first one is the emergence of efficient optimization algorithms, such as proximal methods, which allow one to easily solve large-size nonsmooth convex problems in a parallel manner. The second development is the discovery that convex optimization problems are more ubiquitous in practice than previously thought. In this thesis, we address two different problems within the framework of convex optimization. The first one is an application to computer stereo vision, where the goal is to recover the depth information of a scene from a pair of images taken from the left and right positions. The second one is the introduction of new mathematical tools to deal with convex optimization problems involving information measures, where the objective is to minimize the divergence between two statistical objects such as random variables or probability distributions. We propose a convex approach to address the problem of dense disparity estimation under varying illumination conditions. A convex energy function is derived for jointly estimating the disparity and the illumination variation. The resulting problem is tackled in a set-theoretic framework and solved using proximal tools. It is worth emphasizing the ability of this method to process multicomponent images under illumination variation. The conducted experiments indicate that this approach can effectively deal with local illumination changes and yields better results compared with existing methods. We then extend the previous approach to the problem of multi-view disparity estimation. Rather than estimating a single depth map, we estimate a sequence of disparity maps, one for each input image. We address this problem by adopting a discrete reformulation that can be efficiently solved through a convex relaxation. This approach offers the advantage of handling both convex and nonconvex similarity measures within the same framework. We have shown that the additional complexity required by the application of our method to the multi-view case is small with respect to the stereo case. Finally, we have proposed a novel approach to handle a broad class of statistical distances, called φ-divergences, within the framework of proximal algorithms. In particular, we have developed the expression of the proximity operators of several φ-divergences, such as the Kullback-Leibler, Jeffreys-Kullback, Hellinger, chi-square, Iα, and Rényi divergences. This allows proximal algorithms to deal with problems involving such divergences, thus overcoming the limitations of current state-of-the-art approaches for similar problems. The proposed approach is validated in two different contexts. The first is an application to image restoration that illustrates how to employ divergences as a regularization term, while the second is an application to image registration that employs divergences as a data fidelity term.
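To give one hedged illustration of what a proximity operator of a divergence looks like, consider the scalar Kullback-Leibler term x ↦ x log(x/y) − x + y with y > 0 fixed (a toy sketch, not the thesis's general derivations, which handle the full multivariate φ-divergence family). Its prox at v reduces to solving the strictly increasing scalar equation log x + x/γ = v/γ + log y, which a bisection handles safely:

```python
import math

def prox_kl(v, y, gamma):
    """Prox of gamma*(x*log(x/y) - x + y) at v, over x > 0, for fixed y > 0.

    Setting the derivative of the prox objective to zero gives
    log(x) + x/gamma = v/gamma + log(y); the left side is strictly
    increasing in x, so bisection on a bracketing interval is safe.
    """
    c = v / gamma + math.log(y)
    g = lambda x: math.log(x) + x / gamma - c
    lo, hi = 1e-12, max(1.0, gamma * c)   # g(lo) < 0 <= g(hi)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative values (made up): prox of the KL term at v = 0.3.
v, y, gamma = 0.3, 2.0, 0.5
x = prox_kl(v, y, gamma)
```

The same reduction-to-a-scalar-equation pattern is why closed or semi-closed prox formulas (e.g. via the Lambert W function for this KL term) are possible for several divergences.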
Doto, James William. "Conditional uniform convexity in Orlicz spaces and minimization problems." Thesis, Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/27352.
Croxton, Keely L., Bernard Gendron, and Thomas L. Magnanti. "A Comparison of Mixed-Integer Programming Models for Non-Convex Piecewise Linear Cost Minimization Problems." Massachusetts Institute of Technology, Operations Research Center, 2002. http://hdl.handle.net/1721.1/5233.
He, Niao. "Saddle point techniques in convex composite and error-in-measurement optimization." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/54400.
Ghebremariam, Samuel. "Energy Production Cost and PAR Minimization in Multi-Source Power Networks." University of Akron / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=akron1336517757.
Sinha, Arunesh. "Audit Games." Research Showcase @ CMU, 2014. http://repository.cmu.edu/dissertations/487.
Caillaud, Corentin. "Asymptotical estimates for some algorithms for data and image processing: a study of the Sinkhorn algorithm and a numerical analysis of total variation minimization." Thesis, Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAX023.
This thesis deals with discrete optimization problems and investigates estimates of their convergence rates. It is divided into two independent parts. The first part addresses the convergence rate of the Sinkhorn algorithm and of some of its variants. This algorithm appears in the context of Optimal Transportation (OT) through entropic regularization. Its iterations, and those of the Sinkhorn-like variants, are written as componentwise products of nonnegative vectors and matrices. We propose a new approach to analyze them, based on simple convex inequalities and leading to the linear convergence rate that is observed in practice. We extend this result to a particular type of variant of the algorithm that we call 1D balanced Sinkhorn-like algorithms. In addition, we present some numerical techniques dealing with the convergence towards zero of the regularizing parameter of the OT problems. Lastly, we conduct the complete analysis of the convergence rate in dimension 2. In the second part, we establish error estimates for two discretizations of the total variation (TV) in the Rudin-Osher-Fatemi (ROF) model. This image denoising problem, which is solved by computing the proximal operator of the total variation, enjoys isotropy properties ensuring the preservation of sharp discontinuities in the denoised images in every direction. When the problem is discretized on a square mesh of size h and one uses a standard discrete total variation -- the so-called isotropic TV -- this property is lost. We show that in a particular direction the error in the energy is of order h^{2/3}, which is relatively large with respect to what one can expect from better discretizations. Our proof relies on the analysis of an equivalent 1D denoising problem and of the perturbed TV it involves. The second discrete total variation we consider mimics the definition of the continuous total variation, replacing the usual dual fields by discrete Raviart-Thomas fields. Doing so, we recover an isotropic behavior of the discrete ROF model. Finally, we prove a O(h) error estimate for this variant under standard hypotheses.
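The Sinkhorn iteration analyzed in the first part of this thesis can be sketched in a few lines (a minimal version for entropy-regularized optimal transport between two discrete histograms; the cost matrix, histograms, and regularization parameter below are illustrative, not from the thesis):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, iters=1000):
    """Entropy-regularized optimal transport between histograms a and b.

    Alternately rescales the rows and columns of K = exp(-C/eps) so that
    the transport plan P = diag(u) K diag(v) matches both marginals --
    exactly the componentwise vector/matrix products whose convergence
    rate the thesis studies.
    """
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)   # match column marginals
        u = a / (K @ v)     # match row marginals
    return u[:, None] * K * v[None, :]

# Two small histograms and a squared-distance cost on a 1D grid.
x = np.linspace(0.0, 1.0, 4)
C = (x[:, None] - x[None, :]) ** 2
a = np.array([0.1, 0.4, 0.4, 0.1])
b = np.array([0.25, 0.25, 0.25, 0.25])
P = sinkhorn(a, b, C)
```

The linear convergence observed in practice is exactly what the thesis's convex-inequality analysis explains; driving eps toward zero (to approach unregularized OT) is the delicate regime addressed by its numerical techniques.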
Books on the topic "Convex minimization"
Hiriart-Urruty, Jean-Baptiste. Convex analysis and minimization algorithms. Berlin: Springer-Verlag, 1993.
Hiriart-Urruty, Jean-Baptiste. Convex analysis and minimization algorithms. 2nd ed. Berlin: Springer-Verlag, 1996.
Hiriart-Urruty, Jean-Baptiste, and Claude Lemaréchal. Convex Analysis and Minimization Algorithms I. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-662-02796-7.
Hiriart-Urruty, Jean-Baptiste, and Claude Lemaréchal. Convex Analysis and Minimization Algorithms II. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-662-06409-2.
Lemaréchal, Claude, and Jean-Baptiste Hiriart-Urruty. Convex Analysis and Minimization Algorithms I: Fundamentals. Springer, 2011.
Lemaréchal, Claude, and Jean-Baptiste Hiriart-Urruty. Convex Analysis and Minimization Algorithms I: Fundamentals (Grundlehren der mathematischen Wissenschaften Book 305). Springer, 2011.
Lemaréchal, Claude, and Jean-Baptiste Hiriart-Urruty. Convex Analysis and Minimization Algorithms: Part 2: Advanced Theory and Bundle Methods (Grundlehren der mathematischen Wissenschaften). Springer, 2001.
Find full textBook chapters on the topic "Convex minimization"
Lange, Kenneth. "Convex Minimization Algorithms." In Springer Texts in Statistics, 415–44. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-5838-8_16.
Bauschke, Heinz H., and Patrick L. Combettes. "Convex Minimization Problems." In CMS Books in Mathematics, 189–201. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-48311-5_11.
Troutman, John L. "Minimization of Convex Functions." In Variational Calculus and Optimal Control, 53–96. New York, NY: Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-0737-5_4.
Zaslavski, Alexander J. "Minimization of Quasiconvex Functions." In Convex Optimization with Computational Errors, 287–93. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-37822-6_10.
Zaslavski, Alexander J. "Minimization of Sharp Weakly Convex Functions." In Convex Optimization with Computational Errors, 295–320. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-37822-6_11.
Falb, Peter. "Minimization of Functionals: Uniformly Convex Spaces." In Direct Methods in Control Problems, 31–39. New York, NY: Springer New York, 2019. http://dx.doi.org/10.1007/978-0-8176-4723-0_5.
Li, S. Z., Y. H. Huang, J. S. Fu, and K. L. Chan. "Edge-preserving smoothing by convex minimization." In Computer Vision — ACCV'98, 746–53. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997. http://dx.doi.org/10.1007/3-540-63930-6_190.
Souza de Cursi, Eduardo, Rubens Sampaio, and Piotr Breitkopf. "Minimization of a Non-Convex Function." In Modeling and Convexity, 61–68. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118622438.ch4.
Kiwiel, K. C. "Descent Methods for Nonsmooth Convex Constrained Minimization." In Nondifferentiable Optimization: Motivations and Applications, 203–14. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-662-12603-5_19.
Tuzikov, Alexander V., and Stanislav A. Sheynin. "Minkowski Sum Volume Minimization for Convex Polyhedra." In Mathematical Morphology and its Applications to Image and Signal Processing, 33–40. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/0-306-47025-x_5.
Full textConference papers on the topic "Convex minimization"
Dvijotham, Krishnamurthy, Emanuel Todorov, and Maryam Fazel. "Convex control design via covariance minimization." In 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton). IEEE, 2013. http://dx.doi.org/10.1109/allerton.2013.6736510.
Margellos, Kostas, Alessandro Falsone, Simone Garatti, and Maria Prandini. "Proximal minimization based distributed convex optimization." In 2016 American Control Conference (ACC). IEEE, 2016. http://dx.doi.org/10.1109/acc.2016.7525287.
Souiai, Mohamed, Martin R. Oswald, Youngwook Kee, Junmo Kim, Marc Pollefeys, and Daniel Cremers. "Entropy Minimization for Convex Relaxation Approaches." In 2015 IEEE International Conference on Computer Vision (ICCV). IEEE, 2015. http://dx.doi.org/10.1109/iccv.2015.207.
Tran-Dinh, Quoc, Yen-Huan Li, and Volkan Cevher. "Barrier smoothing for nonsmooth convex minimization." In ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014. http://dx.doi.org/10.1109/icassp.2014.6853848.
Rothvoss, Thomas. "Constructive Discrepancy Minimization for Convex Sets." In 2014 IEEE 55th Annual Symposium on Foundations of Computer Science (FOCS). IEEE, 2014. http://dx.doi.org/10.1109/focs.2014.23.
Combettes, Patrick L., and Jean-Christophe Pesquet. "Split convex minimization algorithm for signal recovery." In ICASSP 2009 - 2009 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2009. http://dx.doi.org/10.1109/icassp.2009.4959676.
Slavakis, Konstantinos. "Stochastic Composite Convex Minimization with Affine Constraints." In 2018 52nd Asilomar Conference on Signals, Systems, and Computers. IEEE, 2018. http://dx.doi.org/10.1109/acssc.2018.8645298.
Zhang, Hu, Pan Zhou, Yi Yang, and Jiashi Feng. "Generalized Majorization-Minimization for Non-Convex Optimization." In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/591.
Bliek, Laurens, Michel Verhaegen, and Sander Wahls. "Online function minimization with convex random relu expansions." In 2017 IEEE 27th International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2017. http://dx.doi.org/10.1109/mlsp.2017.8168109.
Yuan, Gonglin, Zengxin Wei, and Guangjun Zhu. "A spectral gradient algorithm for nonsmooth convex minimization." In 2012 4th Electronic System-Integration Technology Conference (ESTC). IEEE, 2012. http://dx.doi.org/10.1109/estc.2012.6485724.
Full textReports on the topic "Convex minimization"
Giles, Daniel. The Majorization Minimization Principle and Some Applications in Convex Optimization. Portland State University Library, January 2015. http://dx.doi.org/10.15760/honors.175.