Academic literature on the topic 'Sinkhorn's algorithm'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sinkhorn's algorithm.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Sinkhorn's algorithm"

1. Yuille, A. L., and Anand Rangarajan. "The Concave-Convex Procedure." Neural Computation 15, no. 4 (April 1, 2003): 915–36. http://dx.doi.org/10.1162/08997660360581958.

Abstract:
The concave-convex procedure (CCCP) is a way to construct discrete-time iterative dynamical systems that are guaranteed to decrease global optimization/energy functions monotonically. This procedure can be applied to almost any optimization problem, and many existing algorithms can be interpreted in terms of it. In particular, we prove that all expectation-maximization algorithms and classes of Legendre minimization and variational bounding algorithms can be reexpressed in terms of CCCP. We show that many existing neural network and mean-field theory algorithms are also examples of CCCP. The generalized iterative scaling algorithm and Sinkhorn's algorithm can also be expressed as CCCP by changing variables. CCCP can be used both as a new way to understand, and prove the convergence of, existing optimization algorithms and as a procedure for generating new algorithms.
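As a quick orientation for readers, the CCCP iteration summarized in this abstract takes the following form (a minimal sketch using the standard convex-concave splitting; the notation below is ours, not the paper's):

\[
E(\mathbf{x}) = E_{\mathrm{vex}}(\mathbf{x}) + E_{\mathrm{cave}}(\mathbf{x}),
\qquad
\nabla E_{\mathrm{vex}}\big(\mathbf{x}^{t+1}\big) = -\,\nabla E_{\mathrm{cave}}\big(\mathbf{x}^{t}\big),
\]

where E_vex is convex and E_cave is concave, so that each update is guaranteed to satisfy E(x^{t+1}) <= E(x^{t}).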
2. Thibault, Alexis, Lénaïc Chizat, Charles Dossal, and Nicolas Papadakis. "Overrelaxed Sinkhorn–Knopp Algorithm for Regularized Optimal Transport." Algorithms 14, no. 5 (April 30, 2021): 143. http://dx.doi.org/10.3390/a14050143.

Abstract:
This article describes a set of methods for quickly computing the solution to the regularized optimal transport problem. It generalizes and improves upon the widely used iterative Bregman projections algorithm (or Sinkhorn–Knopp algorithm). We first proposed to rely on regularized nonlinear acceleration schemes. In practice, such approaches lead to fast algorithms, but their global convergence is not ensured. Hence, we next proposed a new algorithm with convergence guarantees. The idea is to overrelax the Bregman projection operators, allowing for faster convergence. We proposed a simple method for establishing global convergence by ensuring the decrease of a Lyapunov function at each step. An adaptive choice of the overrelaxation parameter based on the Lyapunov function was constructed. We also suggested a heuristic to choose a suitable asymptotic overrelaxation parameter, based on a local convergence analysis. Our numerical experiments showed a gain in convergence speed by an order of magnitude in certain regimes.
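To make the overrelaxation idea concrete, here is a minimal Python sketch of a Sinkhorn–Knopp iteration with a fixed overrelaxation parameter (the variable names, the fixed choice of omega, and the stopping test are illustrative assumptions; the article's adaptive, Lyapunov-based parameter choice is not reproduced here):

import numpy as np

def overrelaxed_sinkhorn(a, b, C, eps=0.05, omega=1.5, n_iter=1000, tol=1e-9):
    """Entropy-regularized OT between histograms a and b with cost matrix C.

    omega = 1 recovers the classical Sinkhorn-Knopp iteration;
    1 < omega < 2 overrelaxes the two Bregman projections.
    """
    K = np.exp(-C / eps)                      # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        # Overrelaxed scaling updates: geometric interpolation between the
        # current scaling and the classical update a / (K v) (resp. b / (K^T u)).
        u = u ** (1.0 - omega) * (a / (K @ v)) ** omega
        v = v ** (1.0 - omega) * (b / (K.T @ u)) ** omega
        P = u[:, None] * K * v[None, :]       # current transport plan
        if np.abs(P.sum(axis=1) - a).max() < tol:
            break
    return P

With omega = 1 this is the usual iterative Bregman projection scheme; values between 1 and 2 are where the speed-up discussed in the abstract comes from.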
3. Kushinsky, Yam, Haggai Maron, Nadav Dym, and Yaron Lipman. "Sinkhorn Algorithm for Lifted Assignment Problems." SIAM Journal on Imaging Sciences 12, no. 2 (January 2019): 716–35. http://dx.doi.org/10.1137/18m1196480.
4. He, Chu, Qingyi Zhang, Tao Qu, Dingwen Wang, and Mingsheng Liao. "Remote Sensing and Texture Image Classification Network Based on Deep Learning Integrated with Binary Coding and Sinkhorn Distance." Remote Sensing 11, no. 23 (December 3, 2019): 2870. http://dx.doi.org/10.3390/rs11232870.

Abstract:
In the past two decades, traditional hand-crafted-feature-based methods and deep-feature-based methods have successively played the most important role in image classification. In some cases, hand-crafted features still provide better performance than deep features. This paper proposes an innovative network based on deep learning integrated with binary coding and Sinkhorn distance (DBSNet) for remote sensing and texture image classification. The statistical texture features of the image extracted by uniform local binary pattern (ULBP) are introduced as a supplement for deep features extracted by ResNet-50 to enhance the discriminability of features. After the feature fusion, both the diversity and the redundancy of the features increase; thus, we propose the Sinkhorn loss, in which an entropy regularization term plays a key role in removing redundant information and training the model quickly and efficiently. Image classification experiments are performed on two texture datasets and five remote sensing datasets. The results show that the statistical texture features of the image extracted by ULBP complement the deep features, and the new Sinkhorn loss performs better than the commonly used softmax loss. The performance of the proposed algorithm DBSNet ranks in the top three on the remote sensing datasets compared with other state-of-the-art algorithms.
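For context, the entropy-regularized optimal transport problem that underlies a Sinkhorn-type loss such as the one proposed here can be written as follows (a standard formulation in our notation, not necessarily the paper's):

\[
S_{\varepsilon}(\mathbf{a}, \mathbf{b}) = \min_{P \in U(\mathbf{a}, \mathbf{b})} \ \langle P, C \rangle + \varepsilon \sum_{i,j} P_{ij} \log P_{ij},
\qquad
U(\mathbf{a}, \mathbf{b}) = \{ P \ge 0 : P\mathbf{1} = \mathbf{a},\ P^{\top}\mathbf{1} = \mathbf{b} \},
\]

where C is the ground cost between the predicted and target distributions and the entropic term with weight ε is what makes the loss fast to evaluate with Sinkhorn iterations.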
5. Knight, Philip A. "The Sinkhorn–Knopp Algorithm: Convergence and Applications." SIAM Journal on Matrix Analysis and Applications 30, no. 1 (January 2008): 261–75. http://dx.doi.org/10.1137/060659624.
6. Peyré, Gabriel, Lénaïc Chizat, François-Xavier Vialard, and Justin Solomon. "Quantum Entropic Regularization of Matrix-Valued Optimal Transport." European Journal of Applied Mathematics 30, no. 6 (September 28, 2017): 1079–102. http://dx.doi.org/10.1017/s0956792517000274.

Abstract:
This article introduces a new notion of optimal transport (OT) between tensor fields, which are measures whose values are positive semidefinite (PSD) matrices. This “quantum” formulation of optimal transport (Q-OT) corresponds to a relaxed version of the classical Kantorovich transport problem, where the fidelity between the input PSD-valued measures is captured using the geometry of the Von-Neumann quantum entropy. We propose a quantum-entropic regularization of the resulting convex optimization problem, which can be solved efficiently using an iterative scaling algorithm. This method is a generalization of the celebrated Sinkhorn algorithm to the quantum setting of PSD matrices. We extend this formulation and the quantum Sinkhorn algorithm to compute barycentres within a collection of input tensor fields. We illustrate the usefulness of the proposed approach on applications to procedural noise generation, anisotropic meshing, diffusion tensor imaging and spectral texture synthesis.
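For orientation, the quantum (von Neumann) relative entropy that plays the role of the scalar Kullback-Leibler divergence in this matrix-valued setting is, for positive semidefinite matrices P and Q (a standard definition, stated here in our notation):

\[
\mathrm{KL}(P \,\|\, Q) = \operatorname{tr}\big( P \log P - P \log Q - P + Q \big).
\]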
7. Benamou, Jean-David, Guillaume Carlier, Simone Di Marino, and Luca Nenna. "An Entropy Minimization Approach to Second-Order Variational Mean-Field Games." Mathematical Models and Methods in Applied Sciences 29, no. 8 (July 2019): 1553–83. http://dx.doi.org/10.1142/s0218202519500283.

Abstract:
We propose an entropy minimization viewpoint on variational mean-field games with diffusion and quadratic Hamiltonian. We carefully analyze the time discretization of such problems, establish Γ-convergence results as the time step vanishes and propose an efficient algorithm relying on this entropic interpretation as well as on the Sinkhorn scaling algorithm.
8. Benamou, Jean-David, Guillaume Carlier, and Luca Nenna. "Generalized Incompressible Flows, Multi-Marginal Transport and Sinkhorn Algorithm." Numerische Mathematik 142, no. 1 (September 25, 2018): 33–54. http://dx.doi.org/10.1007/s00211-018-0995-x.
9. Tardy, Benjamin, Jordi Inglada, and Julien Michel. "Assessment of Optimal Transport for Operational Land-Cover Mapping Using High-Resolution Satellite Images Time Series without Reference Data of the Mapping Period." Remote Sensing 11, no. 9 (May 3, 2019): 1047. http://dx.doi.org/10.3390/rs11091047.

Abstract:
Land-cover map production using remote-sensing imagery is governed by data availability. In our case, data sources are two-fold: on one hand, optical data provided regularly by satellites such as Sentinel-2, and on the other hand, reference data which allow calibrating mapping methods or validating the results. The lengthy delays due to reference data collection and cleansing are one of the main issues for applications. In this work, the use of Optimal Transport (OT) is proposed. OT is a Domain Adaptation method that uses past data, both images and reference data, to produce the land-cover map of the current period without updated reference data. Seven years of Formosat-2 image time series and the corresponding reference data are used to evaluate two OT algorithms: conventional EMD transport and regularized transport based on the Sinkhorn distance. The contribution of OT to a classification fusion strategy is also evaluated. The results show that with a 17-class nomenclature the problem is too complex for the Sinkhorn algorithm, which provides maps with an Overall Accuracy (OA) of 30%. In contrast, with the EMD algorithm, an OA close to 70% is obtained. One limitation of OT is the number of classes that can be considered at the same time. Simplification schemes are proposed to reduce the number of classes to be transported. Cases of improvement are shown when the problem is simplified, with an improvement in OA varying from 5% to 20%, producing maps with an OA near 79%. As several years are available, the OT approaches are compared to standard fusion schemes, like majority voting. The gain in voting strategies with OT use is lower than the gain obtained with standard majority voting (around 5%).
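To illustrate the kind of OT-based domain adaptation evaluated in this study, here is a minimal sketch assuming the POT (Python Optimal Transport) package; the toy data, the regularization value, and the barycentric mapping step are illustrative assumptions rather than the authors' operational pipeline:

import numpy as np
import ot  # POT: Python Optimal Transport

# Toy feature sets: source samples (past period, with reference data) and
# target samples (mapping period, without reference data).
Xs = np.random.randn(100, 4)
Xt = np.random.randn(120, 4) + 0.5

a = ot.unif(Xs.shape[0])                     # uniform weights on source samples
b = ot.unif(Xt.shape[0])                     # uniform weights on target samples
M = ot.dist(Xs, Xt)                          # pairwise squared Euclidean costs

G_emd = ot.emd(a, b, M)                      # exact (EMD) transport plan
G_sinkhorn = ot.sinkhorn(a, b, M, reg=0.1)   # entropy-regularized (Sinkhorn) plan

# Barycentric mapping of the source samples onto the target domain, e.g. so a
# classifier trained on past reference data can be applied to the new period.
Xs_mapped = G_emd @ Xt / G_emd.sum(axis=1, keepdims=True)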
10. Berman, Robert J. "The Sinkhorn Algorithm, Parabolic Optimal Transport and Geometric Monge–Ampère Equations." Numerische Mathematik 145, no. 4 (June 27, 2020): 771–836. http://dx.doi.org/10.1007/s00211-020-01127-x.

Dissertations / Theses on the topic "Sinkhorn's algorithm"

1. Chizat, Lénaïc. "Transport optimal de mesures positives : modèles, méthodes numériques, applications." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLED063/document.

Abstract:
This thesis generalizes optimal transport beyond the classical "balanced" setting of probability distributions. We define unbalanced optimal transport models between nonnegative measures, based either on the notion of interpolation or the notion of coupling of measures, and we show relationships between these approaches. One of the outcomes of this framework is a generalization of the p-Wasserstein metrics. Secondly, we build numerical methods to solve the interpolation- and coupling-based models. We study, in particular, a new family of scaling algorithms that generalize Sinkhorn's algorithm. The third part deals with applications. It contains a theoretical and numerical study of a Hele-Shaw type gradient flow in the space of nonnegative measures. It also addresses the case of measures taking values in the cone of positive semidefinite matrices, for which we introduce a model that achieves a balance between geometrical accuracy and algorithmic efficiency.
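As a concrete instance of the scaling algorithms mentioned in this abstract, here is a minimal Python sketch of an unbalanced Sinkhorn-type iteration in which the marginal constraints are relaxed by Kullback-Leibler penalties (a common special case; parameter names and default values are illustrative):

import numpy as np

def unbalanced_sinkhorn(a, b, C, eps=0.05, rho=1.0, n_iter=500):
    """Entropic transport between nonnegative measures a and b whose marginal
    constraints are relaxed by KL penalties of strength rho; letting rho grow
    recovers the balanced, classical Sinkhorn iteration."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    exponent = rho / (rho + eps)         # exponent induced by the KL relaxation
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = (a / (K @ v)) ** exponent
        v = (b / (K.T @ u)) ** exponent
    return u[:, None] * K * v[None, :]   # transport plan with softened marginals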
2. Caillaud, Corentin. "Asymptotical estimates for some algorithms for data and image processing : a study of the Sinkhorn algorithm and a numerical analysis of total variation minimization." Thesis, Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAX023.

Abstract:
This thesis deals with discrete convex optimization problems and investigates estimates of their convergence rates. It is divided into two independent parts. The first part addresses the convergence rate of the Sinkhorn algorithm and of some of its variants. This algorithm appears in the context of Optimal Transportation (OT) through entropic regularization. Its iterations, and those of its Sinkhorn-like variants, are written as componentwise products of nonnegative vectors and matrices. We propose a new approach to analyze them, based on simple convex inequalities and leading to the linear convergence rate that is observed in practice. We extend this result to a particular class of variants of the algorithm that we call 1D balanced Sinkhorn-like algorithms. In addition, we present numerical techniques dealing with the convergence towards zero of the regularizing parameter of the OT problems. Lastly, we conduct the complete analysis of the convergence rate in dimension 2. In the second part, we establish error estimates for two discretizations of the total variation (TV) in the Rudin-Osher-Fatemi (ROF) model. This image denoising problem, which amounts to computing the proximal operator of the total variation, enjoys isotropy properties ensuring the preservation of sharp discontinuities in the denoised images in every direction. When the problem is discretized on a square mesh of size h and one uses a standard discrete total variation -- the so-called isotropic TV -- this property is lost. We show that in a particular direction the error in the energy is of order h^{2/3}, which is relatively large with respect to what one can expect from better discretizations. Our proof relies on the analysis of an equivalent 1D denoising problem and of the perturbed TV it involves. The second discrete total variation we consider mimics the definition of the continuous total variation, replacing the usual dual fields by discrete Raviart-Thomas fields. Doing so, we recover an isotropic behavior of the discrete ROF model. Finally, we prove an O(h) error estimate for this variant under standard hypotheses.
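For reference, the ROF denoising problem analyzed in the second part can be written as follows (standard formulation; f is the noisy image and λ a regularization weight):

\[
u^{\star} = \operatorname*{arg\,min}_{u} \ \frac{1}{2}\,\| u - f \|_{2}^{2} + \lambda\, \mathrm{TV}(u),
\]

so computing u^* amounts to evaluating the proximal operator of λ·TV at f, and the error estimates above quantify how well the two discrete total variations approximate this energy.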

Book chapters on the topic "Sinkhorn's algorithm"

1. Dvurechensky, Pavel, Alexander Gasnikov, Sergey Omelchenko, and Alexander Tiurin. "A Stable Alternative to Sinkhorn’s Algorithm for Regularized Optimal Transport." In Mathematical Optimization Theory and Operations Research, 406–23. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-49988-4_28.

Conference papers on the topic "Sinkhorn's algorithm"

1. Luo, Lei, Jian Pei, and Heng Huang. "Sinkhorn Regression." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/360.

Abstract:
This paper introduces a novel Robust Regression (RR) model, named Sinkhorn regression, which imposes Sinkhorn distances on both the loss function and the regularization. Traditional RR methods search for an element-wise loss function (e.g., an Lp-norm) to characterize the errors so that outlying data have a relatively smaller influence on the regression estimator. Because they neglect geometric information, they often lead to suboptimal results in practical applications. To address this problem, we use a cross-bin distance function, i.e., Sinkhorn distances, to capture the geometric structure of real data. Sinkhorn distances are invariant under movement, rotation, and zoom. Thus, our method is more robust to variations of data than traditional regression models. Meanwhile, we leverage the Kullback-Leibler divergence to relax the proposed model with marginal constraints into its unbalanced formulation to accommodate more types of features. In addition, we propose an efficient algorithm to solve the relaxed model and establish complete statistical guarantees for it under mild conditions. Experiments on five publicly available microarray data sets and one mass spectrometry data set demonstrate the effectiveness and robustness of our method.
2. Tachibana, Hideyuki. "Towards Listening to 10 People Simultaneously: An Efficient Permutation Invariant Training of Audio Source Separation Using Sinkhorn’s Algorithm." In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9414508.
