Academic literature on the topic "Convex minimization"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic "Convex minimization".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a pdf and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Convex minimization"
Li, Duan, Zhi-You Wu, Heung-Wing Joseph Lee, Xin-Min Yang and Lian-Sheng Zhang. "Hidden Convex Minimization". Journal of Global Optimization 31, no. 2 (February 2005): 211–33. http://dx.doi.org/10.1007/s10898-004-5697-5.
Mayeli, Azita. "Non-convex Optimization via Strongly Convex Majorization-minimization". Canadian Mathematical Bulletin 63, no. 4 (December 10, 2019): 726–37. http://dx.doi.org/10.4153/s0008439519000730.
Scarpa, Luca and Ulisse Stefanelli. "Stochastic PDEs via convex minimization". Communications in Partial Differential Equations 46, no. 1 (October 14, 2020): 66–97. http://dx.doi.org/10.1080/03605302.2020.1831017.
Thach, P. T. "Convex minimization under Lipschitz constraints". Journal of Optimization Theory and Applications 64, no. 3 (March 1990): 595–614. http://dx.doi.org/10.1007/bf00939426.
Mifflin, Robert and Claudia Sagastizábal. "A VU-algorithm for convex minimization". Mathematical Programming 104, no. 2-3 (July 14, 2005): 583–608. http://dx.doi.org/10.1007/s10107-005-0630-3.
Shioura, Akiyoshi. "Minimization of an M-convex function". Discrete Applied Mathematics 84, no. 1-3 (May 1998): 215–20. http://dx.doi.org/10.1016/s0166-218x(97)00140-6.
O'Hara, John G., Paranjothi Pillay and Hong-Kun Xu. "Iterative Approaches to Convex Minimization Problems". Numerical Functional Analysis and Optimization 25, no. 5-6 (January 2004): 531–46. http://dx.doi.org/10.1081/nfa-200041707.
Ye, Qiaolin, Chunxia Zhao, Ning Ye and Xiaobo Chen. "Localized twin SVM via convex minimization". Neurocomputing 74, no. 4 (January 2011): 580–87. http://dx.doi.org/10.1016/j.neucom.2010.09.015.
Akagi, Goro and Ulisse Stefanelli. "Doubly Nonlinear Equations as Convex Minimization". SIAM Journal on Mathematical Analysis 46, no. 3 (January 2014): 1922–45. http://dx.doi.org/10.1137/13091909x.
Stefanov, Stefan M. "Convex separable minimization with box constraints". PAMM 7, no. 1 (December 2007): 2060045–46. http://dx.doi.org/10.1002/pamm.200700535.
Texto completoTesis sobre el tema "Convex minimization"
Nedić, Angelia. "Subgradient methods for convex minimization". Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/16843.
Includes bibliographical references (p. 169-174).
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Many optimization problems arising in various applications require minimization of an objective cost function that is convex but not differentiable. Such minimization arises, for example, in model construction, system identification, neural networks, pattern classification, and various assignment, scheduling, and allocation problems. To solve convex but nondifferentiable problems, we have to employ special methods that can work in the absence of differentiability, while taking advantage of convexity and possibly other special structures that our minimization problem may possess. In this thesis, we propose and analyze some new methods that can solve convex (not necessarily differentiable) problems. In particular, we consider two classes of methods: incremental and variable metric.
by Angelia Nedić.
Ph.D.
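The abstract above centers on subgradient methods for nondifferentiable convex minimization. As a hedged, generic sketch of the idea (a textbook subgradient scheme with diminishing step sizes, not the incremental or variable-metric methods proposed in the thesis):

```python
import numpy as np

def subgradient_method(subgrad, x0, steps):
    """Basic subgradient method with diminishing step sizes a_k = 1/sqrt(k+1).

    Works for convex objectives that need not be differentiable: at each
    iterate we step along the negative of any subgradient.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        x = x - subgrad(x) / np.sqrt(k + 1)  # diminishing, non-summable steps
    return x

# Example: minimize f(x) = |x - 3|, convex but nondifferentiable at x = 3.
# sign(x - 3) is a valid subgradient everywhere.
x_star = subgradient_method(lambda x: np.sign(x - 3.0), x0=[0.0], steps=5000)
```

With non-summable but diminishing steps, the iterates oscillate around the minimizer with shrinking amplitude rather than converging monotonically, which is the typical behavior these methods trade for their generality.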
Apidopoulos, Vasileios. "Inertial Gradient-Descent algorithms for convex minimization". Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0175/document.
This thesis focuses on the study of inertial methods for solving composite convex minimization problems. Since the early works of Polyak and Nesterov, inertial methods have become very popular thanks to their acceleration effects. Here, we study a family of Nesterov-type inertial proximal-gradient algorithms with a particular over-relaxation sequence. We give a unified presentation of the different convergence properties of this family of algorithms, depending on the over-relaxation parameter. In addition, we address this issue in the case of a smooth function with additional geometrical structure, such as the growth (or Łojasiewicz) condition. We show that by combining a growth condition with a flatness-type condition on the geometry of the minimizing function, we are able to obtain some new convergence rates. Our analysis follows a continuous-to-discrete trail, passing from continuous-in-time dynamical systems to discrete schemes. In particular, the family of inertial algorithms that interests us can be identified as a finite-difference scheme of a differential equation/inclusion. This approach provides a useful guideline that permits transposing the different results and their proofs from the continuous system to the discrete one. This opens the way for new possible inertial schemes derived from the same dynamical system.
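For readers unfamiliar with this family of methods, a Nesterov-type inertial proximal-gradient iteration can be sketched as below. This FISTA-style toy implementation, applied to a lasso-type composite objective 0.5*||Ax - b||^2 + lam*||x||_1, is our own illustrative instance (names and objective assumed), not the thesis's exact scheme:

```python
import numpy as np

def inertial_proximal_gradient(A, b, lam, steps):
    """FISTA-style inertial proximal-gradient method for the composite
    convex problem min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the smooth gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(steps):
        z = y - A.T @ (A @ y - b) / L          # gradient step on the smooth part
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox of lam*||.||_1
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # inertial (over-relaxation) step
        x, t = x_new, t_new
    return x

# With A = I, the solution is soft-thresholding of b.
x_hat = inertial_proximal_gradient(np.eye(3), np.array([3.0, 0.1, -2.0]), lam=1.0, steps=50)
```

The sequence (t - 1)/t_new plays the role of the over-relaxation parameter discussed in the abstract; varying it is exactly what distinguishes the members of this family.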
Gräser, Carsten. "Convex minimization and phase field models". Berlin: Freie Universität Berlin, 2011. http://d-nb.info/1026174848/34.
El Gheche, Mireille. "Proximal methods for convex minimization of Phi-divergences: application to computer vision". Thesis, Paris Est, 2014. http://www.theses.fr/2014PEST1018/document.
Convex optimization aims at searching for the minimum of a convex function over a convex set. While the theory of convex optimization has been largely explored for about a century, several related developments have stimulated a new interest in the topic. The first is the emergence of efficient optimization algorithms, such as proximal methods, which allow one to easily solve large nonsmooth convex problems in a parallel manner. The second is the discovery that convex optimization problems are more ubiquitous in practice than previously thought. In this thesis, we address two different problems within the framework of convex optimization. The first is an application to computer stereo vision, where the goal is to recover the depth information of a scene from a pair of images taken from the left and right positions. The second is the proposal of new mathematical tools to deal with convex optimization problems involving information measures, where the objective is to minimize the divergence between two statistical objects such as random variables or probability distributions. We propose a convex approach to the problem of dense disparity estimation under varying illumination conditions. A convex energy function is derived for jointly estimating the disparity and the illumination variation. The resulting problem is tackled in a set-theoretic framework and solved using proximal tools. It is worth emphasizing the ability of this method to process multicomponent images under illumination variation. The conducted experiments indicate that this approach can effectively deal with local illumination changes and yields better results than existing methods. We then extend the previous approach to the problem of multi-view disparity estimation. Rather than estimating a single depth map, we estimate a sequence of disparity maps, one for each input image.
We address this problem by adopting a discrete reformulation that can be efficiently solved through a convex relaxation. This approach offers the advantage of handling both convex and nonconvex similarity measures within the same framework. We have shown that the additional complexity required by the application of our method to the multi-view case is small with respect to the stereo case. Finally, we have proposed a novel approach to handle a broad class of statistical distances, called φ-divergences, within the framework of proximal algorithms. In particular, we have developed the expression of the proximity operators of several φ-divergences, such as the Kullback-Leibler, Jeffreys-Kullback, Hellinger, Chi-square, Iα, and Rényi divergences. This allows proximal algorithms to deal with problems involving such divergences, thus overcoming the limitations of current state-of-the-art approaches for similar problems. The proposed approach is validated in two different contexts. The first is an application to image restoration that illustrates how to employ divergences as a regularization term, while the second is an application to image registration that employs divergences as a data fidelity term.
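This thesis derives proximity operators for several φ-divergences. As a simpler hedged illustration of how such closed forms arise, here is the proximity operator of the negative logarithm, a building block of divergences such as Itakura-Saito; this particular example is ours, not one taken from the thesis:

```python
import numpy as np

def prox_neg_log(x, gamma):
    """Proximity operator of u -> -gamma*log(u), i.e.
    argmin_{u > 0} 0.5*(u - x)**2 - gamma*log(u).

    First-order optimality, u - x - gamma/u = 0, is the quadratic
    u**2 - x*u - gamma = 0, whose unique positive root is returned.
    """
    x = np.asarray(x, dtype=float)
    return 0.5 * (x + np.sqrt(x * x + 4.0 * gamma))

u = prox_neg_log(2.0, 1.0)  # satisfies u - 2 - 1/u = 0
```

The pattern generalizes: for many divergence integrands, setting the derivative of the prox objective to zero yields a low-degree polynomial equation with a closed-form positive root, which is what makes these operators cheap to evaluate inside proximal algorithms.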
Doto, James William. "Conditional uniform convexity in Orlicz spaces and minimization problems". Thesis, Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/27352.
Croxton, Keely L., Bernard Gendron and Thomas L. Magnanti. "A Comparison of Mixed-Integer Programming Models for Non-Convex Piecewise Linear Cost Minimization Problems". Massachusetts Institute of Technology, Operations Research Center, 2002. http://hdl.handle.net/1721.1/5233.
He, Niao. "Saddle point techniques in convex composite and error-in-measurement optimization". Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/54400.
Ghebremariam, Samuel. "Energy Production Cost and PAR Minimization in Multi-Source Power Networks". University of Akron / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=akron1336517757.
Sinha, Arunesh. "Audit Games". Research Showcase @ CMU, 2014. http://repository.cmu.edu/dissertations/487.
Caillaud, Corentin. "Asymptotical estimates for some algorithms for data and image processing: a study of the Sinkhorn algorithm and a numerical analysis of total variation minimization". Thesis, Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAX023.
This thesis deals with discrete optimization problems and investigates estimates of their convergence rates. It is divided into two independent parts. The first part addresses the convergence rate of the Sinkhorn algorithm and of some of its variants. This algorithm appears in the context of optimal transportation (OT) through entropic regularization. Its iterations, like those of the Sinkhorn-like variants, are written as componentwise products of nonnegative vectors and matrices. We propose a new approach to analyze them, based on simple convex inequalities, which leads to the linear convergence rate that is observed in practice. We extend this result to a particular type of variant that we call 1D balanced Sinkhorn-like algorithms. In addition, we present some numerical techniques for dealing with the convergence towards zero of the regularizing parameter of the OT problems. Lastly, we conduct the complete analysis of the convergence rate in dimension 2. In the second part, we establish error estimates for two discretizations of the total variation (TV) in the Rudin-Osher-Fatemi (ROF) model. This image denoising problem, which is solved by computing the proximal operator of the total variation, enjoys isotropy properties ensuring the preservation of sharp discontinuities in the denoised images in every direction. When the problem is discretized on a square mesh of size h and one uses a standard discrete total variation, the so-called isotropic TV, this property is lost. We show that in a particular direction the error in the energy is of order h^{2/3}, which is relatively large with respect to what one can expect from better discretizations. Our proof relies on the analysis of an equivalent 1D denoising problem and of the perturbed TV it involves. The second discrete total variation we consider mimics the definition of the continuous total variation, replacing the usual dual fields by discrete Raviart-Thomas fields.
Doing so, we recover an isotropic behavior of the discrete ROF model. Finally, we prove an O(h) error estimate for this variant under standard hypotheses.
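The Sinkhorn iterations analyzed in the first part of this thesis are, at their core, alternating diagonal rescalings of a kernel matrix. A minimal sketch, with our own illustrative variable names (not code from the thesis):

```python
import numpy as np

def sinkhorn(C, mu, nu, eps, iters):
    """Sinkhorn algorithm for entropically regularized optimal transport.

    Alternately rescales rows and columns of K = exp(-C/eps) so that the
    plan P = diag(u) K diag(v) has marginals mu (rows) and nu (columns).
    """
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(iters):
        v = nu / (K.T @ u)   # fit the column marginals
        u = mu / (K @ v)     # fit the row marginals
    return u[:, None] * K * v[None, :]

mu = nu = np.full(3, 1.0 / 3.0)
C = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
P = sinkhorn(C, mu, nu, eps=0.5, iters=500)
```

Each iteration is a componentwise product of nonnegative vectors and matrices, exactly the structure the thesis exploits; the linear convergence rate it establishes is the geometric speed at which the column marginals of P approach nu.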
Books on the topic "Convex minimization"
Hiriart-Urruty, Jean-Baptiste. Convex analysis and minimization algorithms. Berlin: Springer-Verlag, 1993.
Hiriart-Urruty, Jean-Baptiste. Convex analysis and minimization algorithms. 2nd ed. Berlin: Springer-Verlag, 1996.
Buscar texto completoHiriart-Urruty, Jean-Baptiste y Claude Lemaréchal. Convex Analysis and Minimization Algorithms I. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-662-02796-7.
Hiriart-Urruty, Jean-Baptiste and Claude Lemaréchal. Convex Analysis and Minimization Algorithms II. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-662-06409-2.
Lemaréchal, Claude and Jean-Baptiste Hiriart-Urruty. Convex Analysis and Minimization Algorithms I: Fundamentals. Springer, 2011.
Lemaréchal, Claude and Jean-Baptiste Hiriart-Urruty. Convex Analysis and Minimization Algorithms I: Fundamentals (Grundlehren der mathematischen Wissenschaften Book 305). Springer, 2011.
Lemaréchal, Claude and Jean-Baptiste Hiriart-Urruty. Convex Analysis and Minimization Algorithms: Part 2: Advanced Theory and Bundle Methods (Grundlehren der mathematischen Wissenschaften). Springer, 2001.
Buscar texto completoCapítulos de libros sobre el tema "Convex minimization"
Lange, Kenneth. "Convex Minimization Algorithms". In Springer Texts in Statistics, 415–44. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-5838-8_16.
Bauschke, Heinz H. and Patrick L. Combettes. "Convex Minimization Problems". In CMS Books in Mathematics, 189–201. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-48311-5_11.
Troutman, John L. "Minimization of Convex Functions". In Variational Calculus and Optimal Control, 53–96. New York, NY: Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-0737-5_4.
Zaslavski, Alexander J. "Minimization of Quasiconvex Functions". In Convex Optimization with Computational Errors, 287–93. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-37822-6_10.
Zaslavski, Alexander J. "Minimization of Sharp Weakly Convex Functions". In Convex Optimization with Computational Errors, 295–320. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-37822-6_11.
Falb, Peter. "Minimization of Functionals: Uniformly Convex Spaces". In Direct Methods in Control Problems, 31–39. New York, NY: Springer New York, 2019. http://dx.doi.org/10.1007/978-0-8176-4723-0_5.
Li, S. Z., Y. H. Huang, J. S. Fu and K. L. Chan. "Edge-preserving smoothing by convex minimization". In Computer Vision — ACCV'98, 746–53. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997. http://dx.doi.org/10.1007/3-540-63930-6_190.
Souza de Cursi, Eduardo, Rubens Sampaio and Piotr Breitkopf. "Minimization of a Non-Convex Function". In Modeling and Convexity, 61–68. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118622438.ch4.
Kiwiel, K. C. "Descent Methods for Nonsmooth Convex Constrained Minimization". In Nondifferentiable Optimization: Motivations and Applications, 203–14. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-662-12603-5_19.
Tuzikov, Alexander V. and Stanislav A. Sheynin. "Minkowski Sum Volume Minimization for Convex Polyhedra". In Mathematical Morphology and its Applications to Image and Signal Processing, 33–40. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/0-306-47025-x_5.
Texto completoActas de conferencias sobre el tema "Convex minimization"
Dvijotham, Krishnamurthy, Emanuel Todorov and Maryam Fazel. "Convex control design via covariance minimization". In 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton). IEEE, 2013. http://dx.doi.org/10.1109/allerton.2013.6736510.
Margellos, Kostas, Alessandro Falsone, Simone Garatti and Maria Prandini. "Proximal minimization based distributed convex optimization". In 2016 American Control Conference (ACC). IEEE, 2016. http://dx.doi.org/10.1109/acc.2016.7525287.
Souiai, Mohamed, Martin R. Oswald, Youngwook Kee, Junmo Kim, Marc Pollefeys and Daniel Cremers. "Entropy Minimization for Convex Relaxation Approaches". In 2015 IEEE International Conference on Computer Vision (ICCV). IEEE, 2015. http://dx.doi.org/10.1109/iccv.2015.207.
Tran-Dinh, Quoc, Yen-Huan Li and Volkan Cevher. "Barrier smoothing for nonsmooth convex minimization". In ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014. http://dx.doi.org/10.1109/icassp.2014.6853848.
Rothvoss, Thomas. "Constructive Discrepancy Minimization for Convex Sets". In 2014 IEEE 55th Annual Symposium on Foundations of Computer Science (FOCS). IEEE, 2014. http://dx.doi.org/10.1109/focs.2014.23.
Combettes, Patrick L. and Jean-Christophe Pesquet. "Split convex minimization algorithm for signal recovery". In ICASSP 2009 - 2009 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2009. http://dx.doi.org/10.1109/icassp.2009.4959676.
Slavakis, Konstantinos. "Stochastic Composite Convex Minimization with Affine Constraints". In 2018 52nd Asilomar Conference on Signals, Systems, and Computers. IEEE, 2018. http://dx.doi.org/10.1109/acssc.2018.8645298.
Zhang, Hu, Pan Zhou, Yi Yang and Jiashi Feng. "Generalized Majorization-Minimization for Non-Convex Optimization". In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/591.
Bliek, Laurens, Michel Verhaegen and Sander Wahls. "Online function minimization with convex random relu expansions". In 2017 IEEE 27th International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2017. http://dx.doi.org/10.1109/mlsp.2017.8168109.
Yuan, Gonglin, Zengxin Wei and Guangjun Zhu. "A spectral gradient algorithm for nonsmooth convex minimization". In 2012 4th Electronic System-Integration Technology Conference (ESTC). IEEE, 2012. http://dx.doi.org/10.1109/estc.2012.6485724.
Texto completoInformes sobre el tema "Convex minimization"
Giles, Daniel. The Majorization Minimization Principle and Some Applications in Convex Optimization. Portland State University Library, January 2015. http://dx.doi.org/10.15760/honors.175.