
Journal articles on the topic 'Sinkhorn's algorithm'


Listed below are the top 17 journal articles on the topic 'Sinkhorn's algorithm.'


1

Yuille, A. L., and Anand Rangarajan. "The Concave-Convex Procedure." Neural Computation 15, no. 4 (April 1, 2003): 915–36. http://dx.doi.org/10.1162/08997660360581958.

Abstract:
The concave-convex procedure (CCCP) is a way to construct discrete-time iterative dynamical systems that are guaranteed to decrease global optimization and energy functions monotonically. This procedure can be applied to almost any optimization problem, and many existing algorithms can be interpreted in terms of it. In particular, we prove that all expectation-maximization algorithms and classes of Legendre minimization and variational bounding algorithms can be reexpressed in terms of CCCP. We show that many existing neural network and mean-field theory algorithms are also examples of CCCP. The generalized iterative scaling algorithm and Sinkhorn's algorithm can also be expressed as CCCP by changing variables. CCCP can be used both as a new way to understand, and prove the convergence of, existing optimization algorithms and as a procedure for generating new algorithms.
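As a quick numeric illustration of the CCCP update described above, here is a minimal sketch on a toy double-well energy (the function, its convex/concave split, and all names are illustrative choices, not taken from the paper):

```python
import numpy as np

# CCCP sketch on E(x) = x**4 - 2*x**2, split as
#   E_vex(x)  = x**4       (convex part)
#   E_cave(x) = -2*x**2    (concave part)
# Each CCCP step solves grad E_vex(x_next) = -grad E_cave(x_cur),
# which here reduces to 4*x_next**3 = 4*x_cur, i.e. a cube root.

def E(x):
    return x**4 - 2 * x**2

x = 0.5
energies = [E(x)]
for _ in range(50):
    x = np.cbrt(x)          # closed-form CCCP update for this split
    energies.append(E(x))

# The iterates approach the minimizer x = 1, and the energy
# decreases monotonically, as the CCCP guarantee predicts.
```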
2

Thibault, Alexis, Lénaïc Chizat, Charles Dossal, and Nicolas Papadakis. "Overrelaxed Sinkhorn–Knopp Algorithm for Regularized Optimal Transport." Algorithms 14, no. 5 (April 30, 2021): 143. http://dx.doi.org/10.3390/a14050143.

Abstract:
This article describes a set of methods for quickly computing the solution to the regularized optimal transport problem. It generalizes and improves upon the widely used iterative Bregman projections algorithm (or Sinkhorn–Knopp algorithm). We first proposed to rely on regularized nonlinear acceleration schemes. In practice, such approaches lead to fast algorithms, but their global convergence is not ensured. Hence, we next proposed a new algorithm with convergence guarantees. The idea is to overrelax the Bregman projection operators, allowing for faster convergence. We proposed a simple method for establishing global convergence by ensuring the decrease of a Lyapunov function at each step. An adaptive choice of the overrelaxation parameter based on the Lyapunov function was constructed. We also suggested a heuristic to choose a suitable asymptotic overrelaxation parameter, based on a local convergence analysis. Our numerical experiments showed a gain in convergence speed by an order of magnitude in certain regimes.
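The multiplicative overrelaxation idea can be sketched in a few lines of NumPy (a rough illustration under assumed names and a fixed omega; the paper's contribution includes an adaptive, Lyapunov-based choice of the parameter):

```python
import numpy as np

def overrelaxed_sinkhorn(a, b, C, eps=0.1, omega=1.5, n_iter=500):
    """Overrelaxed Sinkhorn iterations for entropic optimal transport
    between histograms a and b with ground cost C.
    omega = 1 recovers the standard Sinkhorn-Knopp updates."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        # Multiplicative overrelaxation of the Bregman projections.
        u = u ** (1.0 - omega) * (a / (K @ v)) ** omega
        v = v ** (1.0 - omega) * (b / (K.T @ u)) ** omega
    return u[:, None] * K * v[None, :]   # approximate transport plan
```

For omega between 1 and 2 this typically converges faster than the plain iteration; the plan's row and column sums approach a and b.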
3

Kushinsky, Yam, Haggai Maron, Nadav Dym, and Yaron Lipman. "Sinkhorn Algorithm for Lifted Assignment Problems." SIAM Journal on Imaging Sciences 12, no. 2 (January 2019): 716–35. http://dx.doi.org/10.1137/18m1196480.

4

He, Chu, Qingyi Zhang, Tao Qu, Dingwen Wang, and Mingsheng Liao. "Remote Sensing and Texture Image Classification Network Based on Deep Learning Integrated with Binary Coding and Sinkhorn Distance." Remote Sensing 11, no. 23 (December 3, 2019): 2870. http://dx.doi.org/10.3390/rs11232870.

Abstract:
In the past two decades, traditional hand-crafted feature based methods and deep feature based methods have successively played the most important role in image classification. In some cases, hand-crafted features still provide better performance than deep features. This paper proposes an innovative network based on deep learning integrated with binary coding and Sinkhorn distance (DBSNet) for remote sensing and texture image classification. The statistical texture features of the image extracted by uniform local binary pattern (ULBP) are introduced as a supplement for deep features extracted by ResNet-50 to enhance the discriminability of features. After the feature fusion, both diversity and redundancy of the features have increased, thus we propose the Sinkhorn loss where an entropy regularization term plays a key role in removing redundant information and training the model quickly and efficiently. Image classification experiments are performed on two texture datasets and five remote sensing datasets. The results show that the statistical texture features of the image extracted by ULBP complement the deep features, and the new Sinkhorn loss performs better than the commonly used softmax loss. The performance of the proposed algorithm DBSNet ranks in the top three on the remote sensing datasets compared with other state-of-the-art algorithms.
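A generic entropy-regularized Sinkhorn cost of the kind underlying such a loss can be sketched as follows (a plain NumPy illustration, not the DBSNet training code; all names and parameters are assumptions):

```python
import numpy as np

def sinkhorn_loss(a, b, C, eps=0.05, n_iter=300):
    """Entropy-regularized optimal transport cost between histograms
    a and b with ground cost matrix C (Cuturi-style iteration)."""
    K = np.exp(-C / eps)     # entropic regularization enters via this kernel
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)      # match row marginals
        v = b / (K.T @ u)    # match column marginals
    P = u[:, None] * K * v[None, :]   # transport plan
    return float(np.sum(P * C))       # transport cost of the plan
```

The entropy term (through eps) smooths the problem, which is what makes the loss fast to evaluate and differentiable for training.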
5

Knight, Philip A. "The Sinkhorn–Knopp Algorithm: Convergence and Applications." SIAM Journal on Matrix Analysis and Applications 30, no. 1 (January 2008): 261–75. http://dx.doi.org/10.1137/060659624.

6

Peyré, Gabriel, Lénaïc Chizat, François-Xavier Vialard, and Justin Solomon. "Quantum entropic regularization of matrix-valued optimal transport." European Journal of Applied Mathematics 30, no. 6 (September 28, 2017): 1079–102. http://dx.doi.org/10.1017/s0956792517000274.

Abstract:
This article introduces a new notion of optimal transport (OT) between tensor fields, which are measures whose values are positive semidefinite (PSD) matrices. This “quantum” formulation of optimal transport (Q-OT) corresponds to a relaxed version of the classical Kantorovich transport problem, where the fidelity between the input PSD-valued measures is captured using the geometry of the Von-Neumann quantum entropy. We propose a quantum-entropic regularization of the resulting convex optimization problem, which can be solved efficiently using an iterative scaling algorithm. This method is a generalization of the celebrated Sinkhorn algorithm to the quantum setting of PSD matrices. We extend this formulation and the quantum Sinkhorn algorithm to compute barycentres within a collection of input tensor fields. We illustrate the usefulness of the proposed approach on applications to procedural noise generation, anisotropic meshing, diffusion tensor imaging and spectral texture synthesis.
7

Benamou, Jean-David, Guillaume Carlier, Simone Di Marino, and Luca Nenna. "An entropy minimization approach to second-order variational mean-field games." Mathematical Models and Methods in Applied Sciences 29, no. 08 (July 2019): 1553–83. http://dx.doi.org/10.1142/s0218202519500283.

Abstract:
We propose an entropy minimization viewpoint on variational mean-field games with diffusion and quadratic Hamiltonian. We carefully analyze the time discretization of such problems, establish Γ-convergence results as the time step vanishes and propose an efficient algorithm relying on this entropic interpretation as well as on the Sinkhorn scaling algorithm.
8

Benamou, Jean-David, Guillaume Carlier, and Luca Nenna. "Generalized incompressible flows, multi-marginal transport and Sinkhorn algorithm." Numerische Mathematik 142, no. 1 (September 25, 2018): 33–54. http://dx.doi.org/10.1007/s00211-018-0995-x.

9

Tardy, Benjamin, Jordi Inglada, and Julien Michel. "Assessment of Optimal Transport for Operational Land-Cover Mapping Using High-Resolution Satellite Images Time Series without Reference Data of the Mapping Period." Remote Sensing 11, no. 9 (May 3, 2019): 1047. http://dx.doi.org/10.3390/rs11091047.

Abstract:
Land-cover map production using remote-sensing imagery is governed by data availability. In our case, data sources are two-fold: on one hand, optical data provided regularly by satellites such as Sentinel-2, and on the other hand, reference data which allow calibrating mapping methods or validating the results. The lengthy delays due to reference data collection and cleansing are one of the main issues for applications. In this work, the use of Optimal Transport (OT) is proposed. OT is a Domain Adaptation method that uses past data, both images and reference data, to produce the land-cover map of the current period without updated reference data. Seven years of Formosat-2 image time series and the corresponding reference data are used to evaluate two OT algorithms: conventional EMD transport and regularized transport based on the Sinkhorn distance. The contribution of OT to a classification fusion strategy is also evaluated. The results show that with a 17-class nomenclature the problem is too complex for the Sinkhorn algorithm, which provides maps with an Overall Accuracy (OA) of 30%. In contrast, with the EMD algorithm, an OA close to 70% is obtained. One limitation of OT is the number of classes that can be considered at the same time. Simplification schemes are proposed to reduce the number of classes to be transported. Cases of improvement are shown when the problem is simplified, with an improvement in OA varying from 5% to 20%, producing maps with an OA near 79%. As several years are available, the OT approaches are compared to standard fusion schemes, like majority voting. The gain in voting strategies with OT use is lower than the gain obtained with standard majority voting (around 5%).
10

Berman, Robert J. "The Sinkhorn algorithm, parabolic optimal transport and geometric Monge–Ampère equations." Numerische Mathematik 145, no. 4 (June 27, 2020): 771–836. http://dx.doi.org/10.1007/s00211-020-01127-x.

11

Vos, Alexis De, and Stijn De Baerdemacker. "Scaling a Unitary Matrix." Open Systems & Information Dynamics 21, no. 04 (December 2014): 1450013. http://dx.doi.org/10.1142/s1230161214500139.

Abstract:
The iterative method of Sinkhorn allows, starting from an arbitrary real matrix with non-negative entries, to find a so-called ‘scaled matrix’ which is doubly stochastic, i.e. a matrix with all entries in the interval (0, 1) and with all line sums equal to 1. We conjecture that a similar procedure exists, which allows, starting from an arbitrary unitary matrix, to find a scaled matrix which is unitary and has all line sums equal to 1. The existence of such an algorithm guarantees a powerful decomposition of an arbitrary quantum circuit.
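The classical non-negative scaling procedure described here can be sketched in NumPy (an illustrative implementation, not code from the paper):

```python
import numpy as np

def sinkhorn_knopp(A, n_iter=1000):
    """Alternately rescale rows and columns of a positive matrix A
    until the scaled matrix diag(r) @ A @ diag(c) is (nearly)
    doubly stochastic: all line sums equal to 1."""
    A = np.asarray(A, dtype=float)
    c = np.ones(A.shape[1])
    for _ in range(n_iter):
        r = 1.0 / (A @ c)       # row scaling factors
        c = 1.0 / (A.T @ r)     # column scaling factors
    return A * np.outer(r, c)

rng = np.random.default_rng(0)
S = sinkhorn_knopp(rng.random((4, 4)) + 0.1)
# Row and column sums of S are all close to 1.
```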
12

March, Hadrien De, and Pierre Henry-Labordere. "Building Arbitrage-Free Implied Volatility: Sinkhorn's Algorithm and Variants." SSRN Electronic Journal, 2019. http://dx.doi.org/10.2139/ssrn.3326486.

13

Nechita, Ion, Simon Schmidt, and Moritz Weber. "Sinkhorn Algorithm for Quantum Permutation Groups." Experimental Mathematics, July 1, 2021, 1–13. http://dx.doi.org/10.1080/10586458.2021.1926005.

14

Pichler, Alois, and Michael Weinhardt. "The nested Sinkhorn divergence to learn the nested distance." Computational Management Science, September 27, 2021. http://dx.doi.org/10.1007/s10287-021-00415-7.

Abstract:
The nested distance builds on the Wasserstein distance to quantify the difference of stochastic processes, including also the evolution of information modelled by filtrations. The Sinkhorn divergence is a relaxation of the Wasserstein distance, which can be computed considerably faster. For this reason we employ the Sinkhorn divergence and take advantage of the related (fixed point) iteration algorithm. Furthermore, we investigate the transition of the entropy throughout the stages of the stochastic process and provide an entropy-regularized nested distance formulation, including a characterization of its dual. Numerical experiments affirm the computational advantage and supremacy.
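A common debiased form of the Sinkhorn divergence, built from the fixed-point iteration the abstract mentions, can be sketched as follows (an illustrative convention using the plain transport cost of the entropic plan; names and parameters are assumptions, not the paper's nested construction):

```python
import numpy as np

def ot_eps(a, b, C, eps=0.1, n_iter=500):
    """Entropic transport cost via the Sinkhorn fixed-point iteration."""
    K = np.exp(-C / eps)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]
    return np.sum(P * C)

def sinkhorn_divergence(a, b, C_ab, C_aa, C_bb, eps=0.1):
    """Debiased divergence: S(a, b) = OT(a, b) - (OT(a, a) + OT(b, b)) / 2,
    so that S(a, a) = 0."""
    return ot_eps(a, b, C_ab, eps) - 0.5 * (
        ot_eps(a, a, C_aa, eps) + ot_eps(b, b, C_bb, eps))
```

The debiasing terms remove the entropic blur at a = b, which is what makes the divergence usable as a distance-like quantity.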
15

Di Marino, Simone, and Augusto Gerolin. "An Optimal Transport Approach for the Schrödinger Bridge Problem and Convergence of Sinkhorn Algorithm." Journal of Scientific Computing 85, no. 2 (October 19, 2020). http://dx.doi.org/10.1007/s10915-020-01325-7.

Abstract:
This paper exploits the equivalence between the Schrödinger Bridge problem (Léonard in J Funct Anal 262:1879–1920, 2012; Nelson in Phys Rev 150:1079, 1966; Schrödinger in Über die umkehrung der naturgesetze. Verlag Akademie der wissenschaften in kommission bei Walter de Gruyter u, Company, 1931) and the entropy penalized optimal transport (Cuturi in: Advances in neural information processing systems, pp 2292–2300, 2013; Galichon and Salanié in: Matching with trade-offs: revealed preferences over competing characteristics. CEPR discussion paper no. DP7858, 2010) in order to find a different approach to the duality, in the spirit of optimal transport. This approach results in a priori estimates which are consistent in the limit when the regularization parameter goes to zero. In particular, we find a new proof of the existence of maximizing entropic-potentials and therefore, the existence of a solution of the Schrödinger system. Our method also extends to the case of more than two marginals: the main new result is the proof that the Sinkhorn algorithm converges even in the continuous multi-marginal case. This also provides an alternative proof of the convergence of the Sinkhorn algorithm in two marginals.
16

Mallasto, Anton, Augusto Gerolin, and Hà Quang Minh. "Entropy-regularized 2-Wasserstein distance between Gaussian measures." Information Geometry, August 16, 2021. http://dx.doi.org/10.1007/s41884-021-00052-8.

Abstract:
Gaussian distributions are plentiful in applications dealing in uncertainty quantification and diffusivity. They furthermore stand as important special cases for frameworks providing geometries for probability measures, as the resulting geometry on Gaussians is often expressible in closed-form under the frameworks. In this work, we study the Gaussian geometry under the entropy-regularized 2-Wasserstein distance, by providing closed-form solutions for the distance and interpolations between elements. Furthermore, we provide a fixed-point characterization of a population barycenter when restricted to the manifold of Gaussians, which allows computations through the fixed-point iteration algorithm. As a consequence, the results yield closed-form expressions for the 2-Sinkhorn divergence. As the geometries change by varying the regularization magnitude, we study the limiting cases of vanishing and infinite magnitudes, reconfirming well-known results on the limits of the Sinkhorn divergence. Finally, we illustrate the resulting geometries with a numerical study.
17

Chakrabarty, Deeparnab, and Sanjeev Khanna. "Better and simpler error analysis of the Sinkhorn–Knopp algorithm for matrix scaling." Mathematical Programming, April 15, 2020. http://dx.doi.org/10.1007/s10107-020-01503-3.
