
Journal articles on the topic 'Kullback-Leibler Distance'

Consult the top 50 journal articles for your research on the topic 'Kullback-Leibler Distance.'

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1. Kim, Sang-Hyun. "Video Content Indexing using Kullback-Leibler Distance." International Journal of Contents 5, no. 4 (2009): 51–54. http://dx.doi.org/10.5392/ijoc.2009.5.4.051.

2. Rahmani, Somayeh, Vahid Khajehvand, and Mohsen Torabian. "Kullback-Leibler distance criterion consolidation in cloud." Journal of Network and Computer Applications 170 (November 2020): 102789. http://dx.doi.org/10.1016/j.jnca.2020.102789.

3. Luan, Yu, Hong Zuo Li, and Ya Fei Wang. "Acoustic Features Selection of Speaker Verification Based on Average KL Distance." Applied Mechanics and Materials 373-375 (August 2013): 629–33. http://dx.doi.org/10.4028/www.scientific.net/amm.373-375.629.

Abstract:
This paper proposes a new average Kullback-Leibler distance for optimal feature selection in the matching-score fusion of speaker verification. The novel distance overcomes the asymmetry of the conventional Kullback-Leibler distance, ensuring accurate and robust computation of the information content between the matching scores of two acoustic features. Experimental results over a variety of fusion schemes show that the matching-score fusion of MFCC and residual phase gains the most information content.

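The averaging described in this abstract is the standard way to symmetrize the KL distance. A minimal sketch for discrete distributions, assuming the plain arithmetic average of the two directed terms (the paper's exact construction may differ):

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler distance KL(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def average_kl(p, q):
    """Symmetrized (average) KL distance: 0.5 * (KL(p||q) + KL(q||p))."""
    return 0.5 * (kl(p, q) + kl(q, p))

p, q = [0.7, 0.2, 0.1], [0.5, 0.3, 0.2]
print(average_kl(p, q) == average_kl(q, p))  # True: symmetry restored
```
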
4. Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds." Entropy 22, no. 7 (2020): 713. http://dx.doi.org/10.3390/e22070713.

Abstract:
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher-Rao distance, the Kullback-Leibler divergence, the chi-square divergence, and a flat divergence derived from Tsallis entropy related to the conformal flattening of the Fisher-Rao geometry. We prove that the Voronoi diagrams of the Fisher-Rao distance, the chi-square divergence, and the Kullback-Leibler divergence all coincide with a hyperbolic Voronoi diagram on the corresponding Cauchy location-scale parameters, and that the dual …

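A useful background fact for this entry: for Cauchy location-scale families, the Kullback-Leibler divergence admits a closed form that is symmetric in its arguments, which is part of what makes the hyperbolic-Voronoi correspondence possible. A sketch, assuming the closed form reported in the information-geometry literature on Cauchy manifolds:

```python
import math

def kl_cauchy(l1, s1, l2, s2):
    """KL divergence between Cauchy(l1, s1) and Cauchy(l2, s2);
    the closed form is symmetric in the two distributions."""
    return math.log(((s1 + s2) ** 2 + (l1 - l2) ** 2) / (4.0 * s1 * s2))

print(kl_cauchy(0.0, 1.0, 1.0, 2.0), kl_cauchy(1.0, 2.0, 0.0, 1.0))  # equal
```
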
5. Qian, Dongyun, and Huifeng Jin. "Distance Metric with Kullback–Leibler Divergence for Classification." International Journal of Signal Processing, Image Processing and Pattern Recognition 10, no. 7 (2017): 157–66. http://dx.doi.org/10.14257/ijsip.2017.10.7.14.

6. Kulhavý, Rudolf. "A Kullback-Leibler distance approach to system identification." Annual Reviews in Control 20 (January 1996): 119–30. http://dx.doi.org/10.1016/s1367-5788(97)00010-2.

7. Coetzee, Frans M. "Correcting the Kullback–Leibler distance for feature selection." Pattern Recognition Letters 26, no. 11 (2005): 1675–83. http://dx.doi.org/10.1016/j.patrec.2005.01.014.

8. Kulhavý, R. "A Kullback-Leibler distance approach to system identification." Annual Review in Automatic Programming 20 (1996): 119–30. http://dx.doi.org/10.1016/s0066-4138(97)00010-4.

9. Jabari, Shabnam, Mohammad Rezaee, Fatemeh Fathollahi, and Yun Zhang. "Multispectral change detection using multivariate Kullback-Leibler distance." ISPRS Journal of Photogrammetry and Remote Sensing 147 (January 2019): 163–77. http://dx.doi.org/10.1016/j.isprsjprs.2018.11.014.

10. Veldhuis, R. "The centroid of the symmetrical Kullback-Leibler distance." IEEE Signal Processing Letters 9, no. 3 (2002): 96–99. http://dx.doi.org/10.1109/97.995827.

11. Kulhavý, Rudolf. "A Kullback-Leibler Distance Approach to System Identification." IFAC Proceedings Volumes 28, no. 13 (1995): 55–66. http://dx.doi.org/10.1016/s1474-6670(17)45326-2.

12. Zhou, P., H. Li, and D. Clelland. "Pattern Recognition on Diesel Engine Working Conditions by Wavelet Kullback-Leibler Distance Method." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 219, no. 9 (2005): 879–87. http://dx.doi.org/10.1243/095440605x31706.

Abstract:
This article introduces a novel pattern recognition and fault diagnosis method for diesel engines. The method is developed from engine vibration signal analysis in combination with wavelet and Kullback-Leibler distance (KLD) approaches. The new approach is termed the wavelet Kullback-Leibler distance (WKLD). Experimental data relating to piston and cylinder liner wear, obtained from a production diesel engine, are used to evaluate the newly developed method. Good agreement between the experimental data and the WKLD estimation is found. The results of this article suggest that WKLD is an advancement …

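The WKLD idea pairs a wavelet decomposition with a KL comparison of the coefficient statistics in each band. A hedged sketch using PyWavelets and histogram estimates; the statistic below is illustrative, not the paper's exact formulation:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_kl(sig_a, sig_b, wavelet="db4", level=3, bins=32, eps=1e-12):
    """Sum KL distances between per-band wavelet-coefficient histograms."""
    total = 0.0
    for ca, cb in zip(pywt.wavedec(sig_a, wavelet, level=level),
                      pywt.wavedec(sig_b, wavelet, level=level)):
        lo, hi = min(ca.min(), cb.min()), max(ca.max(), cb.max())
        p, _ = np.histogram(ca, bins=bins, range=(lo, hi))
        q, _ = np.histogram(cb, bins=bins, range=(lo, hi))
        p, q = p / p.sum(), q / q.sum()
        total += float(np.sum(p * np.log((p + eps) / (q + eps))))
    return total
```
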
13. Barmalzan, G., and T. Payandeh Najafabad. "Model Confidence Set Based on Kullback-Leibler Divergence Distance." Journal of Statistical Research of Iran 9, no. 2 (2013): 179–93. http://dx.doi.org/10.18869/acadpub.jsri.9.2.179.

14. Harrou, Fouzi, Ying Sun, and Muddu Madakyaru. "Kullback-Leibler distance-based enhanced detection of incipient anomalies." Journal of Loss Prevention in the Process Industries 44 (November 2016): 73–87. http://dx.doi.org/10.1016/j.jlp.2016.08.020.

15. Nielsen, Frank. "On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means." Entropy 21, no. 5 (2019): 485. http://dx.doi.org/10.3390/e21050485.

Abstract:
The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence which measures the total Kullback–Leibler divergence to the average mixture distribution. However, the Jensen–Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen–Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using …

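For reference, the unweighted Jensen–Shannon divergence that this paper generalizes is the total KL divergence to the average mixture; with natural logarithms it is bounded by log 2. A minimal discrete-case sketch:

```python
import numpy as np

def kl(p, q, eps=1e-12):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def js(p, q):
    """Jensen-Shannon divergence: average KL to the mixture m = (p + q) / 2."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

print(js([1.0, 0.0], [0.0, 1.0]))  # ~log 2, the upper bound
```
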
16. Kharazmi, Omid, and Narayanaswamy Balakrishnan. "Informational properties of transmuted distributions." Filomat 35, no. 13 (2021): 4287–303. http://dx.doi.org/10.2298/fil2113287k.

Abstract:
In this paper, we establish some informational properties of transmuted distributions. Specifically, we derive the Shannon entropy, Gini's mean difference, and Fisher and Bayes Fisher information measures of a transmuted distribution. Two extensions of the Shannon entropy and Gini's mean difference information measures are also provided. Finally, the distances between a transmuted distribution and its components based on the Kullback-Leibler, chi-square and energy distance divergences are all derived.

17. Bilal, Muhammad, Khuram Ali Khan, Ammara Nosheen, and Josip Pečarić. "Bounds of some divergence measures on time scales via Abel–Gontscharoff interpolation." Mathematica Slovaca 74, no. 2 (2024): 417–36. http://dx.doi.org/10.1515/ms-2024-0032.

Abstract:
In this paper, bounds of some divergence measures on time scales are constructed via Abel–Gontscharoff interpolation. Inequalities involving Shannon entropy, Kullback–Leibler discrimination, triangle distance and Jeffrey distance are studied as particular instances by using various types of convex functions. Several new bounds of certain divergence measures for some specified time scales are also discussed.

18. Obradovic, D., and G. Deco. "Information Maximization and Independent Component Analysis: Is There a Difference?" Neural Computation 10, no. 8 (1998): 2085–101. http://dx.doi.org/10.1162/089976698300016972.

Abstract:
This article provides a detailed and rigorous analysis of the two commonly used methods for redundancy reduction: linear independent component analysis (ICA) posed as a direct minimization of a suitably chosen redundancy measure and information maximization (InfoMax) of a continuous stochastic signal transmitted through an appropriate nonlinear network. The article shows analytically that ICA based on the Kullback-Leibler information as a redundancy measure and InfoMax lead to the same solution if the parameterization of the output nonlinear functions in the latter method is sufficiently rich.

19. Jain, K., and Ram Saraswat. "A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures." Journal of Applied Mathematics, Statistics and Informatics 8, no. 1 (2012): 17–32. http://dx.doi.org/10.2478/v10294-012-0002-6.

Abstract:
An information inequality is established in terms of Csiszár f-divergence measures by using convexity arguments and the Jensen inequality. This inequality is applied in comparing particular divergences which play a fundamental role in information theory, such as the Kullback-Leibler distance, Hellinger discrimination, chi-square distance, and J-divergences.

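The divergences compared in this paper are instances of the Csiszár f-divergence for different generator functions f. A sketch for discrete distributions (the Hellinger scaling convention varies across references):

```python
import numpy as np

def f_divergence(p, q, f, eps=1e-12):
    """Csiszar f-divergence: D_f(p || q) = sum_i q_i * f(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * f((p + eps) / (q + eps))))

kl   = lambda p, q: f_divergence(p, q, lambda t: t * np.log(t))          # Kullback-Leibler
chi2 = lambda p, q: f_divergence(p, q, lambda t: (t - 1.0) ** 2)         # chi-square
hell = lambda p, q: f_divergence(p, q, lambda t: (np.sqrt(t) - 1) ** 2)  # Hellinger-type

p, q = [0.6, 0.3, 0.1], [0.4, 0.4, 0.2]
print(kl(p, q), chi2(p, q), hell(p, q))
```
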
20. Li, Kai, Jian Zhang, and Di Mao. "The uniform asymptotic normality of a matrix-T distribution." Filomat 35, no. 15 (2021): 5253–61. http://dx.doi.org/10.2298/fil2115253l.

Abstract:
Using the Kullback-Leibler distance between the density functions of a matrix-T distribution and a matrix normal distribution, we obtain a Berry-Esseen bound for the matrix-T distribution. Further, we give the condition under which a matrix-T distribution is uniformly asymptotically matrix normal, and point out the convergence rate.

21. Li, X. L., and J. S. Chen. "Region-Based Fuzzy Clustering Image Segmentation Algorithm with Kullback-Leibler Distance." ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences V-4-2020 (August 3, 2020): 27–31. http://dx.doi.org/10.5194/isprs-annals-v-4-2020-27-2020.

Abstract:
To effectively describe the uncertainty of remote sensing image segmentation, a novel region-based algorithm using fuzzy clustering and Kullback-Leibler (KL) distance is proposed. By regular tessellation, the image domain is completely divided into several sub-blocks to overcome the complex noise present in high-resolution remote sensing images. Taking the blocks as the basic processing units, KL divergence is used to model the distance between blocks and clusters, which enables the model to describe the uncertainty of the non-similarity relationship. Besides, based on the theory of …

22. Kanazawa, Yuichiro, Atsuyuki Kogure, and Sang-Gil Lee. "On the Asymptotic Equivalence of Hellinger Distance and Kullback-Leibler Loss." Journal of the Japan Statistical Society 29, no. 1 (1999): 1–21. http://dx.doi.org/10.14490/jjss1995.29.1.

23. Al-Khafaji, Karim, Shripad Tuljapurkar, Carol Horvitz, and Anthony Koop. "Detecting variability in demographic rates: randomization with the Kullback–Leibler distance." Journal of Ecology 95, no. 6 (2007): 1370–80. http://dx.doi.org/10.1111/j.1365-2745.2007.01296.x.

24. Shahriar, Hossain, Sarah M. North, YoonJi Lee, and Roger Hu. "Server-side code injection attack detection based on Kullback-Leibler distance." International Journal of Internet Technology and Secured Transactions 5, no. 3 (2014): 240. http://dx.doi.org/10.1504/ijitst.2014.065184.

25. Lin, Shi-Woei. "Jackknife evaluation of uncertainty judgments aggregated by the Kullback–Leibler distance." Applied Mathematics and Computation 218, no. 2 (2011): 469–79. http://dx.doi.org/10.1016/j.amc.2011.05.087.

26. Kanazawa, Yuichiro. "Hellinger distance and Kullback-Leibler loss for the kernel density estimator." Statistics & Probability Letters 18, no. 4 (1993): 315–21. http://dx.doi.org/10.1016/0167-7152(93)90022-b.

27. Yuan, Li, Hongwei Liu, and Zheng Bao. "Radar HRRP recognition based on the minimum Kullback-Leibler Distance criterion." Journal of Electronics (China) 24, no. 2 (2007): 199–203. http://dx.doi.org/10.1007/s11767-005-0167-x.

28. Entezami, Alireza, Hashem Shariatmadar, and Abbas Karamodin. "Data-driven damage diagnosis under environmental and operational variability by novel statistical pattern recognition methods." Structural Health Monitoring 18, no. 5-6 (2018): 1416–43. http://dx.doi.org/10.1177/1475921718800306.

Abstract:
Feature extraction by time-series analysis and decision making through distance-based methods are powerful and efficient statistical pattern recognition techniques for data-driven structural health monitoring. The motivation of this article is to propose an innovative residual-based feature extraction approach based on AutoRegressive modeling and a novel statistical distance method named Partition-based Kullback–Leibler Divergence for damage detection and localization, using randomly high-dimensional damage-sensitive features under environmental and operational variability. The key novel …

29. Adams, Johnathan A., Gentry White, and Robyn P. Araujo. "Mathematical measures of societal polarisation." PLOS ONE 17, no. 10 (2022): e0275283. http://dx.doi.org/10.1371/journal.pone.0275283.

Abstract:
In opinion dynamics, as in general usage, polarisation is subjective. To understand polarisation, we need to develop more precise methods to measure the agreement in society. This paper presents four mathematical measures of polarisation derived from graph and network representations of societies and information-theoretic divergences or distance metrics. Two of the methods, min-max flow and spectral radius, rely on graph theory and define polarisation in terms of the structural characteristics of networks. The other two methods represent opinions as probability density functions and use the Kullback–Leibler …

30. Xiong, Jun, Joon Wayn Cheong, Zhi Xiong, et al. "Fault-tolerant relative navigation based on Kullback–Leibler divergence." International Journal of Advanced Robotic Systems 17, no. 6 (2020): 172988142097912. http://dx.doi.org/10.1177/1729881420979125.

Abstract:
A fault-detection method for relative navigation based on the Kullback–Leibler divergence (KLD) is proposed. Unlike the traditional χ²-based approaches, the KLD for a filter follows a hybrid distribution that combines the χ² distribution and the F-distribution. Using an extended Kalman filter (EKF) as the estimator, the distance between the a priori and a posteriori data of the EKF is calculated to detect abnormal measurements. After the fault detection step, a fault exclusion method is applied to remove the erroneous observations from the fusion procedure. The proposed method is suitable for the Kalman filter …

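Consistency tests of this kind ultimately compare Gaussian beliefs produced by the filter. A minimal sketch of the closed-form KL between two multivariate Gaussians, which such a detector could threshold; the function name is illustrative, not from the paper:

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) between multivariate Gaussians."""
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    k = mu0.size
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * float(np.trace(S1_inv @ S0) + d @ S1_inv @ d - k + logdet1 - logdet0)

print(kl_gaussian([0.0, 0.0], np.eye(2), [1.0, 0.0], 2.0 * np.eye(2)))
```
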
31. He, Zhenya, Luxi Yang, Ju Liu, Ziyi Lu, Chen He, and Yuhui Shi. "Approaches for Blind Separation of Sources Based on Multivariate Density Estimation." Journal of Circuits, Systems and Computers 09, no. 03n04 (1999): 243–53. http://dx.doi.org/10.1142/s0218126699000207.

Abstract:
A class of learning algorithms is developed for the blind separation of independent source signals from their linear mixtures. The algorithms are based on the Kullback–Leibler distance. A multivariate density estimation technique is used in estimating the probability density function of independent components. Simulations using speech signals and images as sources illustrate the performances of the algorithms.

32. Mehdi, Agouzal, Merzouqi Maria, and Moha Arouch. "Reduction of Hyperspectral image based on OSP and a Filter based on Bhattacharyya Distance." International Journal of Emerging Technology and Advanced Engineering 12, no. 4 (2022): 86–93. http://dx.doi.org/10.46338/ijetae0422_12.

Abstract:
This article proposes a new method to reduce the dimensionality of the Pavia hyperspectral image. Reducing hyperspectral images before classification has become essential, and this reduction is achieved by several strategies and approaches in the literature. The high dimensionality of the hyperspectral image was and remains a challenge to overcome, since it contains labelled pixels that belong to the target area and others considered as intruders. For this reason, the proposed method aims to extract the classified pixels by applying the orthogonal projection of …

33. Inokuchi, Ryo, and Sadaaki Miyamoto. "Fuzzy c-Means Algorithms Using Kullback-Leibler Divergence and Hellinger Distance Based on Multinomial Manifold." Journal of Advanced Computational Intelligence and Intelligent Informatics 12, no. 5 (2008): 443–47. http://dx.doi.org/10.20965/jaciii.2008.p0443.

Abstract:
In this paper, we discuss fuzzy clustering algorithms for discrete data. The data space is represented as a statistical manifold of the multinomial distribution, where the Euclidean distance is not adequate. The geodesic distance on the multinomial manifold can be derived analytically, but it is difficult to use directly as a metric. We propose fuzzy c-means algorithms using other metrics instead of the Euclidean distance: the Kullback-Leibler divergence and the Hellinger distance. These two metrics are regarded as approximations of the geodesic distance.

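On the multinomial manifold the geodesic (Fisher-Rao) distance has an analytic form through the Bhattacharyya coefficient, and the Hellinger distance tracks it closely for nearby points, which is the sense in which the paper treats these metrics as approximations. A sketch (scaling conventions vary):

```python
import numpy as np

def fisher_rao(p, q):
    """Geodesic (Fisher-Rao) distance on the multinomial manifold:
    2 * arccos of the Bhattacharyya coefficient."""
    bc = np.sum(np.sqrt(np.asarray(p) * np.asarray(q)))
    return 2.0 * np.arccos(np.clip(bc, 0.0, 1.0))

def hellinger(p, q):
    """Hellinger distance; close to the geodesic (up to scale) for nearby points."""
    return float(np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

p, q = np.array([0.7, 0.2, 0.1]), np.array([0.6, 0.25, 0.15])
print(fisher_rao(p, q), 2.0 * hellinger(p, q))  # nearly equal for nearby p, q
```
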
34. Pham, Tuyen, Hana Dal Poz Kouřimská, and Hubert Wagner. "Bregman–Hausdorff Divergence: Strengthening the Connections Between Computational Geometry and Machine Learning." Machine Learning and Knowledge Extraction 7, no. 2 (2025): 48. https://doi.org/10.3390/make7020048.

Abstract:
The purpose of this paper is twofold. On a technical side, we propose an extension of the Hausdorff distance from metric spaces to spaces equipped with asymmetric distance measures. Specifically, we focus on extending it to the family of Bregman divergences, which includes the popular Kullback–Leibler divergence (also known as relative entropy). The resulting dissimilarity measure is called a Bregman–Hausdorff divergence and compares two collections of vectors—without assuming any pairing or alignment between their elements. We propose new algorithms for computing Bregman–Hausdorff divergences …

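A Bregman divergence is generated by a strictly convex function F, and the negative-entropy generator recovers the KL divergence on probability vectors. The sketch below also shows a one-sided Hausdorff-style aggregation over point sets, a simplified stand-in for the paper's Bregman-Hausdorff construction rather than its actual algorithm:

```python
import numpy as np

def bregman(x, y, F, gradF):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return F(x) - F(y) - float(np.dot(gradF(y), x - y))

# Negative-entropy generator: recovers KL(x || y) for probability vectors.
F     = lambda v: float(np.sum(v * np.log(v)))
gradF = lambda v: np.log(v) + 1.0

def bregman_hausdorff(A, B):
    """One-sided Hausdorff-style aggregation over an asymmetric divergence."""
    return max(min(bregman(a, b, F, gradF) for b in B) for a in A)

A = [np.array([0.6, 0.4]), np.array([0.5, 0.5])]
B = [np.array([0.3, 0.7]), np.array([0.45, 0.55])]
print(bregman_hausdorff(A, B))
```
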
35. Ma, Jinshan. "Generalized Grey Target Decision Method for Mixed Attributes Based on Kullback-Leibler Distance." Entropy 20, no. 7 (2018): 523. http://dx.doi.org/10.3390/e20070523.

Abstract:
A novel generalized grey target decision method for mixed attributes based on Kullback-Leibler (K-L) distance is proposed. The proposed approach involves the following steps: first, all indices are converted into index binary connection number vectors; second, the two-tuple (determinacy, uncertainty) numbers originated from index binary connection number vectors are obtained; third, the positive and negative target centers of two-tuple (determinacy, uncertainty) numbers are calculated; then the K-L distances of all alternatives to their positive and negative target centers are integrated by the …

36. Perduca, V., and G. Nuel. "Measuring the Influence of Observations in HMMs Through the Kullback–Leibler Distance." IEEE Signal Processing Letters 20, no. 2 (2013): 145–48. http://dx.doi.org/10.1109/lsp.2012.2235830.

37. Do, M. N., and M. Vetterli. "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance." IEEE Transactions on Image Processing 11, no. 2 (2002): 146–58. http://dx.doi.org/10.1109/83.982822.

38. Lin, H., Z. Ou, and X. Xiao. "Generalized Time-Series Active Search With Kullback–Leibler Distance for Audio Fingerprinting." IEEE Signal Processing Letters 13, no. 8 (2006): 465–68. http://dx.doi.org/10.1109/lsp.2006.874394.

39. Parveen, Sultan, Sanjay Kumar Singh, Umesh Singh, and Dinesh Kumar. "A Comparative Study of Traditional and Kullback-Leibler Divergence of Survival Functions Estimators for the Parameter of Lindley Distribution." Austrian Journal of Statistics 48, no. 5 (2019): 45–53. http://dx.doi.org/10.17713/ajs.v48i5.772.

Abstract:
A new point estimation method based on Kullback-Leibler divergence of survival functions (KLS), measuring the distance between an empirical and prescribed survival functions, has been used to estimate the parameter of Lindley distribution. The simulation studies have been carried out to compare the performance of the proposed estimator with the corresponding Least square (LS), Maximum likelihood (ML) and Maximum product spacing (MPS) methods of estimation.

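The KLS idea measures a KL-type distance between the empirical survival function and the model survival function, here for the Lindley distribution. The objective below is a hypothetical stand-in for the paper's exact criterion, which is not reproduced here; `kls_objective` and the midpoint plotting positions are assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lindley_survival(x, theta):
    """Survival function of the Lindley distribution."""
    return (1.0 + theta * x / (theta + 1.0)) * np.exp(-theta * x)

def kls_objective(theta, x):
    """Hypothetical KL-type discrepancy between the empirical and model
    survival curves (a stand-in for the paper's exact KLS criterion)."""
    x = np.sort(x)
    n = len(x)
    s_emp = 1.0 - (np.arange(1, n + 1) - 0.5) / n  # midpoint plotting positions
    s_mod = lindley_survival(x, theta)
    # Generalized-KL form: nonnegative, zero iff the curves agree.
    return float(np.sum(s_emp * np.log(s_emp / s_mod) + s_mod - s_emp))

x = np.random.default_rng(0).gamma(2.0, 1.0, size=200)  # toy data
res = minimize_scalar(lambda t: kls_objective(t, x), bounds=(1e-3, 10.0), method="bounded")
print(res.x)  # KLS-type estimate of theta
```
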
40. Rebbouh, Amar. "Clustering Objects Described by Juxtaposition of Binary Data Tables." Journal of Applied Mathematics and Decision Sciences 2008 (January 14, 2008): 1–13. http://dx.doi.org/10.1155/2008/125797.

Abstract:
This paper seeks to develop an allocation of 0/1 data matrices to physical systems based upon a Kullback-Leibler distance between probability distributions. The distributions are estimated from the contents of the data matrices. We discuss an ascending hierarchical classification method and a numerical example, and mention an application with survey data concerning the level of development of the departments of a given territory of a country.

41. Adil Khan, Muhammad, Slavica Ivelić Bradanović, and Haitham Abbas Mahmoud. "New Improvements of the Jensen–Mercer Inequality for Strongly Convex Functions with Applications." Axioms 13, no. 8 (2024): 553. http://dx.doi.org/10.3390/axioms13080553.

Abstract:
In this paper, we use the generalized version of convex functions, known as strongly convex functions, to derive improvements to the Jensen–Mercer inequality. We achieve these improvements through the newly discovered characterizations of strongly convex functions, along with some previously known results about strongly convex functions. We are also focused on important applications of the derived results in information theory, deducing estimates for χ-divergence, Kullback–Leibler divergence, Hellinger distance, Bhattacharya distance, Jeffreys distance, and Jensen–Shannon divergence. Additionally, …

42. Amari, S., and A. Cichocki. "Information geometry of divergence functions." Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.

Abstract:
Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of the distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric and a pair of dually coupled affine connections, which are studied in information geometry. …

43. Portesi, Mariela, Angel L. Plastino, and Flavia Pennini. "Information Geometry and Phase Transitions." International Journal of Modern Physics B 20, no. 30n31 (2006): 5250–53. http://dx.doi.org/10.1142/s0217979206036338.

Abstract:
We present, from an information theoretic viewpoint, an analysis of phase transitions and critical phenomena in quantum systems. Our study is based on geometrical considerations within the Riemannian space of thermodynamic parameters that characterize the system. A metric for the space can be derived from an appropriate definition of distance between quantum states. For this purpose, we consider generalized α-divergences that include the standard Kullback–Leibler relative entropy. The use of other measures of information distance is taken into account, and the thermodynamic stability of the system …

44. Al-Labadi, Luai, Petru Ciur, Milutin Dimovic, and Kyuson Lim. "Assessing Multinomial Distributions with a Bayesian Approach." Mathematics 11, no. 13 (2023): 3007. http://dx.doi.org/10.3390/math11133007.

Abstract:
This paper introduces a unified Bayesian approach for testing various hypotheses related to multinomial distributions. The method calculates the Kullback–Leibler divergence between two specified multinomial distributions, followed by comparing the change in distance from the prior to the posterior through the relative belief ratio. A prior elicitation algorithm is used to specify the prior distributions. To demonstrate the effectiveness and practical application of this approach, it has been applied to several examples.

45. Xie, Li, Liangyan Li, Jun Chen, Lei Yu, and Zhongshan Zhang. "A Constrained Talagrand Transportation Inequality with Applications to Rate-Distortion-Perception Theory." Entropy 27, no. 4 (2025): 441. https://doi.org/10.3390/e27040441.

Abstract:
A constrained version of Talagrand’s transportation inequality is established, which reveals an intrinsic connection between the Gaussian distortion-rate-perception functions with limited common randomness under the Kullback–Leibler divergence-based and squared Wasserstein-2 distance-based perception measures. This connection provides an organizational framework for assessing existing bounds on these functions. In particular, we show that the best-known bounds of Xie et al. are nonredundant when examined through this connection.

46. Cabella, Brenno Caetano Troca, Márcio Júnior Sturzbecher, Walfred Tedeschi, Oswaldo Baffa Filho, Dráulio Barros de Araújo, and Ubiraci Pereira da Costa Neves. "A numerical study of the Kullback-Leibler distance in functional magnetic resonance imaging." Brazilian Journal of Physics 38, no. 1 (2008): 20–25. http://dx.doi.org/10.1590/s0103-97332008000100005.

47. Nait Ali, Amine, Hanene Brahmia, Mohamed Boughazi, Layachi Bennacer, and Toufik Hafs. "Using empirical mode decomposition and Kullback-Leibler distance for online handwritten signature verification." International Journal of Applied Pattern Recognition 6, no. 1 (2019): 15. http://dx.doi.org/10.1504/ijapr.2019.10026037.

48. Hafs, Toufik, Layachi Bennacer, Hanene Brahmia, Mohamed Boughazi, and Amine Nait Ali. "Using empirical mode decomposition and Kullback-Leibler distance for online handwritten signature verification." International Journal of Applied Pattern Recognition 6, no. 1 (2019): 15. http://dx.doi.org/10.1504/ijapr.2019.104279.

49. Sun, Wenyan, and Enqing Dong. "Kullback-Leibler distance and graph cuts based active contour model for local segmentation." Biomedical Signal Processing and Control 52 (July 2019): 120–27. http://dx.doi.org/10.1016/j.bspc.2019.04.008.

50. Hanjri, Adnane El, Aawatif Hayar, and Abdelkrim Haqiq. "Features detection based blind handover using Kullback Leibler distance for 5G HetNets systems." IAES International Journal of Artificial Intelligence (IJ-AI) 9, no. 2 (2020): 193. http://dx.doi.org/10.11591/ijai.v9.i2.pp193-202.

Abstract:
The Fifth Generation of Mobile Networks (5G) is changing the cellular network infrastructure paradigm, and Small Cells are a key piece of this shift. But the high number of Small Cells and their low coverage involve more handovers to provide continuous connectivity, and selecting the appropriate cell, quickly and at low energy cost, from among thousands in the vicinity is also a key problem. In this paper, we propose a new method to achieve an efficient, blind and rapid handover just by analysing the Received Signal probability density function instead of demodulating and analysing the Received Signal …

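The handover criterion here analyses the received-signal probability density function rather than demodulated content. A generic sketch of comparing two received-signal sample sets via histogram densities and the KL distance; the bin count and estimator are assumptions, not from the paper:

```python
import numpy as np

def hist_kl(x, y, bins=64, eps=1e-12):
    """KL distance between histogram estimates of two signal distributions."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    p, _ = np.histogram(x, bins=bins, range=(lo, hi))
    q, _ = np.histogram(y, bins=bins, range=(lo, hi))
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(1)
print(hist_kl(rng.normal(0, 1, 5000), rng.normal(0.5, 1, 5000)))
```
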