
Journal articles on the topic 'Definitions of entropy'


Consult the top 50 journal articles for your research on the topic 'Definitions of entropy.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Yu, Bicheng, Xuejun Zhao, Mingfa Zheng, Xiujiu Yuan, and Bei Hou. "Entropy on Intuitionistic Fuzzy Sets and Hesitant Fuzzy Sets." Journal of Mathematics 2022 (February 7, 2022): 1–10. http://dx.doi.org/10.1155/2022/1585079.

Abstract:
Since the sufficient conditions for the maximum value of intuitionistic fuzzy entropy are not unified, and hesitant fuzzy entropies cannot be compared when the hesitant fuzzy elements have unequal lengths, improved axiomatic definitions of intuitionistic fuzzy entropy and hesitant fuzzy entropy are proposed, and new intuitionistic fuzzy and hesitant fuzzy entropies based on the improved axiomatic definitions are established. This paper defines a fuzzy entropy that satisfies the properties of the axiomatized definition of fuzzy entropy and, based on that fuzzy entropy…
2

Anderson, Neal G. "Irreversible information loss: Fundamental notions and entropy costs." International Journal of Modern Physics: Conference Series 33 (January 2014): 1460354. http://dx.doi.org/10.1142/s2010194514603548.

Abstract:
Landauer's Principle (LP) associates an entropy increase with the irreversible loss of information from a physical system. Clear statement, unambiguous interpretation, and proper application of LP require precise, mutually consistent, and sufficiently general definitions for a set of interlocking fundamental notions and quantities (entropy, information, irreversibility, erasure). In this work, we critically assess some common definitions and quantities used or implied in statements of LP, and reconsider their definition within an alternative “referential” approach to physical information theory…
3

Jia, Lifen. "A New Definition of Cross-Entropy for Uncertain Sets." Journal of Uncertain Systems 14, no. 02 (2021): 2150013. http://dx.doi.org/10.1142/s1752890921500136.

Abstract:
Cross-entropy for uncertain sets is used to measure the divergence between two membership functions. Sine cross-entropy, logarithm cross-entropy and quadratic cross-entropy for uncertain sets have been proposed, but all of these definitions fail to measure the degree of divergence associated with some uncertain sets. Thus, this paper presents a new definition of cross-entropy for uncertain sets as a supplement and investigates its properties. In addition, this paper also proposes a definition of generalized cross-entropy for uncertain sets and discusses its properties.
4

Qing, Ming. "The Unique Representations of Fuzzy Entropy for a Finite Fuzzy Set." Applied Mechanics and Materials 433-435 (October 2013): 766–69. http://dx.doi.org/10.4028/www.scientific.net/amm.433-435.766.

Abstract:
Many methods have been presented to define fuzzy entropy, which measures the fuzzy degree of a fuzzy set, and a variety of fuzzy entropy formulae have been derived and constructed from these definitions. In this paper, a new definition of fuzzy entropy based on a simple order relation is presented, and a computation formula for fuzzy entropy is given. Then, the unique representations of fuzzy entropy are obtained by applying several sets of reasonable conditions to fuzzy entropy.
5

Teixeira, Andreia, André Souto, and Luís Antunes. "On Conditional Tsallis Entropy." Entropy 23, no. 11 (2021): 1427. http://dx.doi.org/10.3390/e23111427.

Abstract:
There is no generally accepted definition of conditional Tsallis entropy. The standard definition of (unconditional) Tsallis entropy depends on a parameter α that converges to the Shannon entropy as α approaches 1. In this paper, we describe three definitions of conditional Tsallis entropy suggested in the literature; their properties are studied and their values, as functions of α, are compared. We also consider another natural proposal for conditional Tsallis entropy and compare it with the existing ones. Lastly, we present an online tool to compute the four conditional Tsallis entropies…
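As background only (an illustrative sketch, not code from the cited paper): the unconditional Tsallis entropy mentioned in the abstract is commonly written S_q(p) = (1 − Σ p_i^q)/(q − 1), and it converges to the Shannon entropy (in nats) as the deformation parameter approaches 1. The abstract writes this parameter as α; q below is the same parameter under its other common name.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum(p_i**q)) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats, the q -> 1 limit of the Tsallis entropy."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
for q in (2.0, 1.5, 1.1, 1.001):
    print(f"S_{q} =", tsallis_entropy(p, q))
print("Shannon:", shannon_entropy(p))  # the values above approach this as q -> 1
```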
6

Gao, Ming Mei, Tao Sun, Rui Ping Xu, Li Song, and Hui Qun Zhang. "An Improved Axiomatic Definition and Structural Formula of Interval-Valued Intuitionistic Fuzzy Entropy." Applied Mechanics and Materials 644-650 (September 2014): 1556–59. http://dx.doi.org/10.4028/www.scientific.net/amm.644-650.1556.

Abstract:
In view of the defects of existing axiomatic definitions of interval-valued intuitionistic fuzzy entropy, an improved axiomatic definition of interval-valued intuitionistic fuzzy entropy is presented and the corresponding formula is constructed in this paper. First, the defects in three kinds of axiomatic systems for the entropy of IVIFSs are analyzed in detail. A new axiomatic entropy definition is then proposed for a more reasonable depiction of the fuzziness of IVIFSs. Based on the new axiomatic definition, a new computational formula is presented. Finally, an example is…
7

Livadiotis, George, and David J. McComas. "Thermodynamic Definitions of Temperature and Kappa and Introduction of the Entropy Defect." Entropy 23, no. 12 (2021): 1683. http://dx.doi.org/10.3390/e23121683.

Abstract:
This paper develops explicit and consistent definitions of the independent thermodynamic properties of temperature and the kappa index within the framework of nonextensive statistical mechanics and shows their connection with the formalism of kappa distributions. By defining the “entropy defect” in the composition of a system, we show how the nonextensive entropy of systems with correlations differs from the sum of the entropies of their constituents. A system is composed extensively when its elementary subsystems are independent, interacting with no correlations; this leads to…
8

Stoyanov, Luchezar. "Topological and metric entropy for group and semigroup actions." Topological Algebra and its Applications 7, no. 1 (2019): 38–47. http://dx.doi.org/10.1515/taa-2019-0004.

Abstract:
It is well known that the classical definition of topological entropy for group and semigroup actions is frequently zero in some rather interesting situations, e.g. smooth actions of ℤ₊ᵏ (k > 1) on manifolds. Different definitions have been considered by several authors. In the present article we describe the one proposed in 1995 by K. H. Hofmann and the author, which produces topological entropy that is not trivially zero for such smooth actions. We discuss this particular approach, some of the main properties of the topological entropy defined in this way, and its advantages and disadvantages…
9

Ben-Naim, Arieh. "Entropy and Time." Entropy 22, no. 4 (2020): 430. http://dx.doi.org/10.3390/e22040430.

Abstract:
The idea that entropy is associated with the “arrow of time” has its roots in Clausius's statement of the Second Law: “Entropy of the Universe always increases.” However, the explicit association of entropy with time's arrow arises from Eddington. In this article, we start with a brief review of the idea that the “increase in entropy” is somehow associated with the direction in which time increases. Then, we examine three different, but equivalent, definitions of entropy. We find that none of these definitions indicates any hint of a relationship between entropy and time. We can, therefore…
10

Grzywacz, Norberto M. "Perceptual Complexity as Normalized Shannon Entropy." Entropy 27, no. 2 (2025): 166. https://doi.org/10.3390/e27020166.

Abstract:
Complexity is one of the most important variables in how the brain performs decision making based on esthetic values. Multiple definitions of perceptual complexity have been proposed, one of the most fruitful being Normalized Shannon Entropy. However, the Normalized Shannon Entropy definition has theoretical gaps that we address in this article. Focusing on visual perception, we first address whether normalization fully corrects for the effects of measurement resolution on entropy. The answer is negative, but the remaining effects are minor, and we propose alternate definitions of…
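As a side note (an illustrative sketch, not code from the article): "normalized Shannon entropy" usually means the Shannon entropy of a distribution divided by its maximum value log(n), so the measure lies in [0, 1] regardless of the number of outcomes n.

```python
import math

def normalized_shannon_entropy(p):
    """Shannon entropy divided by its maximum log(n), giving a value in [0, 1]."""
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    n = len(p)
    return h / math.log(n) if n > 1 else 0.0

print(normalized_shannon_entropy([0.25] * 4))         # uniform: maximal, close to 1.0
print(normalized_shannon_entropy([1.0, 0.0, 0.0, 0.0]))  # deterministic: 0.0
```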
11

Shahsavari, Saeed, and S. M. Ali Boutorabi. "Innovative development of borchers remarks on the second law of thermodynamics using quasi-statistical approach to the entropy." MOJ Applied Bionics and Biomechanics 7, no. 1 (2023): 83–86. http://dx.doi.org/10.15406/mojabb.2023.07.00179.

Abstract:
Borchers' classical remarks raise important aspects of the second law of classical thermodynamics, considering temperature as an integrating denominator and using classes of thermal and mechanical variables in an innovative structure-based statement of the first law of thermodynamics.1 However, he advised that his remarks need to be discussed using other approaches to entropy, and that some further remarks would be very useful. Of course, the possibility of applying Borchers' approach to other entropy definitions involves various mathematical and physical challenges, and cannot…
12

Evans, Denis J., Debra J. Searles, and Stephen R. Williams. "A Derivation of the Gibbs Equation and the Determination of Change in Gibbs Entropy from Calorimetry." Australian Journal of Chemistry 69, no. 12 (2016): 1413. http://dx.doi.org/10.1071/ch16447.

Abstract:
In this paper, we give a succinct derivation of the fundamental equation of classical equilibrium thermodynamics, namely the Gibbs equation. This derivation builds on our equilibrium relaxation theorem for systems in contact with a heat reservoir. We reinforce comments made over a century ago, pointing out that Clausius' strict inequality for a system of interest is, within Clausius' set of definitions, logically undefined. Using a specific definition of temperature that we have recently introduced, and which is valid for both reversible and irreversible processes, we can define a property t…
13

STRZAŁKA, DOMINIK, and FRANCISZEK GRABOWSKI. "A SHORT REVIEW OF ELEMENTARY PROPERTIES AND POSSIBLE APPLICATIONS OF DEFORMED q-ALGEBRA DERIVED FROM NON-EXTENSIVE TSALLIS ENTROPY." Modern Physics Letters B 22, no. 16 (2008): 1525–34. http://dx.doi.org/10.1142/s0217984908016261.

Abstract:
Tsallis entropy, introduced in 1988, is considered to open new possibilities for constructing a generalized thermodynamic foundation for statistical physics, extending classical Boltzmann–Gibbs–Shannon thermodynamics to non-equilibrium states. During the last two decades this q-generalized theory has been successfully applied to a considerable number of physically interesting complex phenomena. The authors present a short review of the elementary properties and possible applications of some operators in the deformed q-algebra derived from Tsallis's definition of non-extensive entropy, based o…
14

Matty, Michael, Lachlan Lancaster, William Griffin, and Robert H. Swendsen. "Comparison of canonical and microcanonical definitions of entropy." Physica A: Statistical Mechanics and its Applications 467 (February 2017): 474–89. http://dx.doi.org/10.1016/j.physa.2016.10.030.

15

Horowitz, Jordan M., and Takahiro Sagawa. "Equivalent Definitions of the Quantum Nonadiabatic Entropy Production." Journal of Statistical Physics 156, no. 1 (2014): 55–65. http://dx.doi.org/10.1007/s10955-014-0991-1.

16

Petroni, Nicola. "Entropy and Its Discontents: A Note on Definitions." Entropy 16, no. 7 (2014): 4044–59. http://dx.doi.org/10.3390/e16074044.

17

Pal, N. R., and S. K. Pal. "Object-background segmentation using new definitions of entropy." IEE Proceedings E Computers and Digital Techniques 136, no. 4 (1989): 284. http://dx.doi.org/10.1049/ip-e.1989.0039.

18

Durt, Thomas. "Competing Definitions of Information Versus Entropy in Physics." Foundations of Science 16, no. 4 (2011): 315–18. http://dx.doi.org/10.1007/s10699-010-9206-7.

19

Liang, Yingjie. "Diffusion entropy method for ultraslow diffusion using inverse Mittag-Leffler function." Fractional Calculus and Applied Analysis 21, no. 1 (2018): 104–17. http://dx.doi.org/10.1515/fca-2018-0007.

Abstract:
This study analyzes the complexity of the ultraslow diffusion process using both the classical Shannon entropy and its generalization with the inverse Mittag-Leffler function, in conjunction with the structural derivative. To further describe the observation process with information loss in ultraslow diffusion, e.g. some defects in the observation process, two definitions of fractional entropy are proposed using the inverse Mittag-Leffler function, in which the Padé approximation technique is employed to numerically estimate the diffusion entropy. The results reveal that the inverse Mittag-Leffler…
20

Guńka, Piotr A., and Janusz Zachara. "Towards a quantitative bond valence description of coordination spheres – the concepts of valence entropy and valence diversity coordination numbers." Acta Crystallographica Section B Structural Science, Crystal Engineering and Materials 75, no. 1 (2019): 86–96. http://dx.doi.org/10.1107/s2052520618017833.

Abstract:
Two novel definitions of chemical coordination numbers – the valence entropy coordination number n_VECN and the valence diversity coordination number n_VDCN – are proposed. Their originality stems from the fact that they are the first definitions based solely on bond valences. The expressions for them are derived from their definitions and their properties are studied. Unexpectedly close relationships of n_VECN to Shannon entropy and of n_VDCN to diversity are revealed, and the names of the new coordination numbers are taken therefrom. Finally, as an example, a study of the arsenic(III) lone electron pair st…
21

Lopes, António M., and José A. Tenreiro Machado. "A Review of Fractional Order Entropies." Entropy 22, no. 12 (2020): 1374. http://dx.doi.org/10.3390/e22121374.

Abstract:
Fractional calculus (FC) is the area of calculus that generalizes the operations of differentiation and integration. FC operators are non-local and capture the history of dynamical effects present in many natural and artificial phenomena. Entropy is a measure of uncertainty, diversity and randomness often adopted for characterizing complex dynamical systems. Stemming from the synergies between the two areas, this paper reviews the concept of entropy in the framework of FC. Several new entropy definitions have been proposed in recent decades, expanding the scope of applicability of this seminal…
22

Palazzo, Pierfrancesco. "Chemical and Mechanical Aspect of Entropy-Exergy Relationship." Entropy 23, no. 8 (2021): 972. http://dx.doi.org/10.3390/e23080972.

Abstract:
The present research focuses on the chemical aspect of the entropy and exergy properties. It complements a previous treatise already published and constitutes a set of concepts and definitions relating to the entropy–exergy relationship overarching thermal, chemical and mechanical aspects. The extended perspective proposed here aims to embrace the physical and chemical disciplines, describing macroscopic or microscopic systems characterized in the domains of industrial engineering and biotechnology. The definition of chemical exergy, based on the Carnot chemical cycle, is c…
23

Purvis, Ben, Yong Mao, and Darren Robinson. "Entropy and its Application to Urban Systems." Entropy 21, no. 1 (2019): 56. http://dx.doi.org/10.3390/e21010056.

Abstract:
Since its conception over 150 years ago, entropy has enlightened and confused scholars and students alike, from its origins in physics and beyond. More recently, it has been considered within the urban context in a rather eclectic range of applications. The entropy maximization approach, as applied by Alan Wilson and others from the 1960s, contrasts with considerations from the 1990s of the city as a thermodynamic dissipative system, in the tradition of Ilya Prigogine. By reviewing the relevant mathematical theory, we draw the distinction among three interrelated definitions of entropy: the th…
24

Pessoa, Pedro, and Bruno Arderucio Costa. "Comment on “Black Hole Entropy: A Closer Look”." Entropy 22, no. 10 (2020): 1110. http://dx.doi.org/10.3390/e22101110.

Abstract:
In a recent paper (Entropy 2020, 22(1), 17), Tsallis states that entropy, as in Shannon's or Kullback–Leibler's definitions, is inadequate to interpret black hole entropy and suggests that a new non-additive functional should take the role of entropy. Here we counterargue by explaining the important distinction between the properties of extensivity and additivity; the latter is fundamental for entropy, while the former is a property of particular thermodynamical systems that is not expected for black holes. We also point out other debatable statements in his analysis of black hole entropy.
25

Li, Lianhuang, and Fuyuan Guo. "Entropy-based definitions of beam parameters for slab waveguide." Journal of Optics 43, no. 4 (2014): 325–29. http://dx.doi.org/10.1007/s12596-014-0212-y.

26

Gurevich, B. M. "Toward the History of Dynamical Entropy: Comparing Two Definitions." Journal of Mathematical Sciences 215, no. 6 (2016): 693–99. http://dx.doi.org/10.1007/s10958-016-2874-2.

27

Huyskens, P. L., and G. G. Siegel. "Fundamental questions about entropy I. Definitions: Clausius or Boltzmann?" Bulletin des Sociétés Chimiques Belges 97, no. 11-12 (2010): 809–14. http://dx.doi.org/10.1002/bscb.19880971102.

28

Deng, Xue, Tao Lin, and Chuangjie Chen. "Comparison and Research on Diversified Portfolios with Several Entropy Measures Based on Different Psychological States." Entropy 22, no. 10 (2020): 1125. http://dx.doi.org/10.3390/e22101125.

Abstract:
Previous studies include few portfolio models involving investors' psychological states, market ambiguity and entropy. Some entropies can give a model the effect of diversifying investment, which is very important. This paper mainly studies four kinds of entropy. First, we obtained four definitions of entropy from the literature, and gave the function of fuzzy entropy in different psychological states through strict mathematical proof. Then, we constructed a fuzzy portfolio entropy decision model based on the investor's psychological states and compared it with the possibilistic mea…
29

Addabbo, Raymond, and Denis Blackmore. "A Dynamical Systems-Based Hierarchy for Shannon, Metric and Topological Entropy." Entropy 21, no. 10 (2019): 938. http://dx.doi.org/10.3390/e21100938.

Abstract:
A rigorous dynamical systems-based hierarchy is established for the definitions of entropy of Shannon (information), Kolmogorov–Sinai (metric) and Adler, Konheim & McAndrew (topological). In particular, metric entropy, with the imposition of some additional properties, is proven to be a special case of topological entropy, and Shannon entropy is shown to be a particular form of metric entropy. This is the first of two papers aimed at establishing a dynamically grounded hierarchy comprising Clausius, Boltzmann, Gibbs, Shannon, metric and topological entropy in which each element is ideally a…
30

Liu, Peide, Xiaohong Zhang, and Zhanyou Wang. "An Extended VIKOR Method for Multiple Attribute Decision Making with Linguistic D Numbers Based on Fuzzy Entropy." International Journal of Information Technology & Decision Making 19, no. 01 (2020): 143–67. http://dx.doi.org/10.1142/s0219622019500433.

Abstract:
Linguistic D numbers (LDNs) can express fuzzy evaluation information more easily and precisely by combining the advantages of linguistic terms (LTs) and D numbers (DNs). Existing research on the fuzzy entropy of LDNs is rare, and most definitions of fuzzy entropy for LDNs are unreasonable. In view of this research gap, this paper improves the definition of the fuzzy entropy of LDNs by simultaneously considering the effects of confidence degrees and LTs on the value of fuzzy entropy in LDNs. Then, the weights of attributes can be calculated by…
31

Ben-Naim, Arieh. "Information, Entropy, Life, and the Universe." Entropy 24, no. 11 (2022): 1636. http://dx.doi.org/10.3390/e24111636.

Abstract:
In 2015, I wrote a book with the same title as this article. The book's subtitle is: “What we know and what we do not know.” On the book's dedication page, I wrote [1]: “This book is dedicated to readers of popular science books who are baffled, perplexed, puzzled, astonished, confused, and discombobulated by reading about Information, Entropy, Life and the Universe.” In the first part of this article, I will present the definitions of two central concepts: the “Shannon measure of information” (SMI) in Information Theory, and “Entropy” in Thermodynamics. Following these definitions, I will…
32

Jürgensen, Helmut, and David Matthews. "Entropy and Higher Moments of Information." JUCS - Journal of Universal Computer Science 16, no. 5 (2010): 749–94. https://doi.org/10.3217/jucs-016-05-0749.

Abstract:
The entropy of a finite probability space or, equivalently, a memoryless source is the average information content of an event. The fact that entropy is an expectation suggests that it could be quite important in certain applications to take into account higher moments of information and parameters derived from these, like the variance or skewness. In this paper we initiate a study of the higher moments of information for sources without memory and sources with memory. We derive properties of these moments for information defined in the sense of Shannon and indicate how these considerations can…
33

Atenas, Boris, and Sergio Curilef. "Complexity Measures in the Tight-Binding Model." Journal of Physics: Conference Series 2839, no. 1 (2024): 012010. http://dx.doi.org/10.1088/1742-6596/2839/1/012010.

Abstract:
The deformation of a wave packet is a significant topic in classical and quantum mechanics. Understanding this phenomenon is relevant to the study of various physical systems. In this work, we characterize the evolution of a highly localized wave packet in a tight-binding lattice. We investigate the behavior of the probability distribution associated with the wave packet and the accompanying complexity measures. We use information entropy, disequilibrium, disorder, and complexity measures to describe the localization-delocalization process from a highly localized initial pulse, showing…
34

HU, QINGHUA, and DAREN YU. "ENTROPIES OF FUZZY INDISCERNIBILITY RELATION AND ITS OPERATIONS." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 12, no. 05 (2004): 575–89. http://dx.doi.org/10.1142/s0218488504003089.

Abstract:
Yager's entropy was proposed to compute the information of a fuzzy indiscernibility relation. In this paper we present a novel interpretation of Yager's entropy from the point of view of the discernibility power of a relation. Some basic definitions in Shannon's information theory are then generalized based on Yager's entropy. We introduce joint entropy, conditional entropy, mutual information and relative entropy to compute the information changes for fuzzy indiscernibility relation operations. Conditional entropy and relative conditional entropy are proposed to measure the information increment, which is in…
35

Taghiyev, Murad R. "Problems associated with the determination of risks in the financial markets." Caucasus Journal of Social Sciences 16, no. 1 (2023): 128–35. http://dx.doi.org/10.62343/cjss.2023.233.

Abstract:
The article is devoted to the problem of measuring risks in financial markets. From various complex scientific definitions, it synthesizes an intelligent definition of entropy. Furthermore, risk is considered a probabilistic criterion, described in connection with the normal probability distribution. The next step suggests a method for calculating the risk index for practitioners not burdened with deep mathematical knowledge, using the covariance index and the determination of the Sharpe ratio. Recommendations are provided at the end of the article.
36

Li, Zhiming. "Remarks on Topological Entropy of Nonautonomous Dynamical Systems." International Journal of Bifurcation and Chaos 25, no. 12 (2015): 1550158. http://dx.doi.org/10.1142/s0218127415501588.

Abstract:
In this paper, we give several classical definitions of topological entropy (on a noncompact and noninvariant subset) for nonautonomous dynamical systems, and we establish the relationships among them.
37

NAUDTS, JAN. "CONTINUITY OF A CLASS OF ENTROPIES AND RELATIVE ENTROPIES." Reviews in Mathematical Physics 16, no. 06 (2004): 809–22. http://dx.doi.org/10.1142/s0129055x04002151.

Abstract:
The present paper studies continuity of generalized entropy functions and relative entropies defined using the notion of a deformed logarithmic function. In particular, two distinct definitions of relative entropy are discussed. As an application, all considered entropies are shown to satisfy Lesche's stability condition. The entropies of Tsallis' non-extensive thermostatistics are taken as examples.
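For context (a sketch based on the standard notion, not taken from the paper): a commonly used deformed logarithm is ln_q(x) = (x^(1−q) − 1)/(1 − q), which recovers the natural logarithm as q → 1; Tsallis-type entropies and relative entropies are built from such functions.

```python
import math

def deformed_log(x, q):
    """q-deformed logarithm ln_q(x) = (x**(1-q) - 1) / (1 - q), for q != 1.
    Approaches the natural logarithm in the limit q -> 1."""
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

for q in (0.5, 0.9, 0.99, 0.999):
    print(q, deformed_log(2.0, q))
print("ln 2 =", math.log(2.0))  # the values above approach this as q -> 1
```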
38

Imanian, Anahita, and Mohammad Modarres. "Thermodynamics as a fundamental science of reliability." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 230, no. 6 (2016): 598–608. http://dx.doi.org/10.1177/1748006x16679578.

Abstract:
Cumulative hazard and cumulative damage are important models for reliability and structural integrity assessment. This article reviews a previously developed thermodynamic entropy-based damage model and derives and demonstrates an equivalent reliability function. As such, a thermodynamically inspired approach to developing new definitions of cumulative hazard, cumulative damage, and life models of structures and components based on the second law of thermodynamics is presented. The article defines a new unified measure of damage in terms of energy dissipation associated with multiple interacti…
39

Ellerman, David. "Introduction to Quantum Logical Information Theory: Talk." EPJ Web of Conferences 182 (2018): 02039. http://dx.doi.org/10.1051/epjconf/201818202039.

Abstract:
Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional, and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose…
40

Ellerman, David. "Logical Entropy: Introduction to Classical and Quantum Logical Information Theory." Entropy 20, no. 9 (2018): 679. http://dx.doi.org/10.3390/e20090679.

Abstract:
Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability, and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose…
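As a minimal illustration (not taken from the paper): for a finite probability distribution p, the logical entropy discussed here is computed as h(p) = 1 − Σ p_i², i.e. the probability that two independent draws from p yield distinct outcomes.

```python
def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum(p_i**2): the probability that two
    independent draws from p land in different outcomes (distinct "dits")."""
    return 1.0 - sum(pi * pi for pi in p)

print(logical_entropy([0.5, 0.5]))  # two equiprobable outcomes -> 0.5
print(logical_entropy([1.0]))       # a certain outcome -> 0.0
```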
41

Manis, George, Dimitrios Bakalis, and Roberto Sassi. "A Multithreaded Algorithm for the Computation of Sample Entropy." Algorithms 16, no. 6 (2023): 299. http://dx.doi.org/10.3390/a16060299.

Abstract:
Many popular entropy definitions for signals, including approximate and sample entropy, are based on the idea of embedding the time series into an m-dimensional space, aiming to detect complex, deeper and more informative relationships among samples. However, for both approximate and sample entropy, the high computational cost is a severe limitation, especially when large amounts of data are processed or when parameter tuning requires a large number of executions, making fast computation algorithms a necessity. In the past, our research team proposed fast algorithms…
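For orientation (a simplified baseline, not the paper's multithreaded algorithm): sample entropy compares how often templates of length m and length m+1 match within a tolerance r, and reports −ln(A/B) of the two match counts. A minimal quadratic-time sketch:

```python
import math

def sample_entropy(x, m=2, r=0.5):
    """Plain O(n^2) sample entropy (SampEn): -ln(A/B), where B counts pairs of
    length-m templates and A counts pairs of length-(m+1) templates whose
    Chebyshev distance is <= r; self-matches are excluded. This simplified
    convention uses all available templates of each length, and r is an
    absolute tolerance (in practice r is often chosen relative to the
    signal's standard deviation)."""
    n = len(x)

    def count_pairs(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_pairs(m)       # matches at template length m
    a = count_pairs(m + 1)   # matches at template length m + 1
    return -math.log(a / b)  # low values indicate a more regular signal

# A perfectly alternating signal is highly regular, so its SampEn is small.
print(sample_entropy([1.0, 2.0] * 5, m=2, r=0.5))
```

Fast algorithms such as those referenced in the abstract target the pairwise counting above, which dominates the running time.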
42

Tamir, Boaz, Ismael Lucas De Paiva, Zohar Schwartzman-Nowik, and Eliahu Cohen. "Quantum logical entropy: fundamentals and general properties." 4open 5 (2022): 2. http://dx.doi.org/10.1051/fopen/2021005.

Abstract:
Logical entropy gives a measure, in the sense of measure theory, of the distinctions of a given partition of a set, an idea that can be naturally generalized to classical probability distributions. Here, we analyze how this fundamental concept and other related definitions can be applied to the study of quantum systems through quantum logical entropy. Moreover, we prove several properties of this entropy for generic density matrices that may be relevant to various areas of quantum mechanics and quantum information. Furthermore, we extend the notion of quantum logical entropy to post-se…
43

Cholewa, Marcin, and Bartłomiej Płaczek. "Application of Positional Entropy to Fast Shannon Entropy Estimation for Samples of Digital Signals." Entropy 22, no. 10 (2020): 1173. http://dx.doi.org/10.3390/e22101173.

Full text
Abstract:
This paper introduces a new method of estimating Shannon entropy. The proposed method can be successfully used for large data samples and enables fast computations to rank the data samples according to their Shannon entropy. Original definitions of positional entropy and integer entropy are discussed in detail to explain the theoretical concepts that underpin the proposed approach. Relations between positional entropy, integer entropy and Shannon entropy were demonstrated through computational experiments. The usefulness of the introduced method was experimentally verified for various data sa
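Positional entropy and integer entropy are the authors' own constructions; for context, the quantity being estimated is the Shannon entropy of a data sample, for which the standard plug-in estimator can be computed directly (an illustrative baseline, not the proposed fast method):

```python
from collections import Counter
from math import log2

def plugin_shannon_entropy(sample):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in bits:
    relative frequencies stand in for the unknown probabilities."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(plugin_shannon_entropy("abab"))  # 1.0 bit (two equiprobable symbols)
```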
44

Petersen, Karl. "Chains, entropy, coding." Ergodic Theory and Dynamical Systems 6, no. 3 (1986): 415–48. http://dx.doi.org/10.1017/s014338570000359x.

Full text
Abstract:
Various definitions of the entropy for countable-state topological Markov chains are considered. Concrete examples show that these quantities do not coincide in general and can behave badly under nice maps. Certain restricted random walks which arise in a problem in magnetic recording provide interesting examples of chains. Factors of some of these chains have entropy equal to the growth rate of the number of periodic orbits, even though they contain no subshifts of finite type with positive entropy; others are almost sofic – they contain subshifts of finite type with entropy arbitrari
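For finite-state chains the competing definitions agree, which makes the countable-state subtleties stand out: the topological entropy of a subshift of finite type is the log of the Perron (spectral) value of its 0-1 transition matrix. A quick illustration with the golden-mean shift (our example, not from the paper):

```python
import numpy as np

# Golden-mean shift: binary sequences with no two consecutive 1s.
A = np.array([[1, 1],
              [1, 0]])

# Topological entropy of a subshift of finite type = log of the
# spectral radius (Perron value) of its transition matrix.
h = np.log(np.max(np.abs(np.linalg.eigvals(A))))
print(h)  # log((1 + sqrt(5)) / 2) ≈ 0.4812
```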
45

Chothia, Tom, Chris Novakovic, and Rajiv Ranjan Singh. "Calculating Quantitative Integrity and Secrecy for Imperative Programs." International Journal of Secure Software Engineering 6, no. 2 (2015): 23–46. http://dx.doi.org/10.4018/ijsse.2015040102.

Full text
Abstract:
This paper presents a framework for calculating measures of data integrity for programs in a small imperative language. The authors develop a Markov chain semantics for their language which calculates Clarkson and Schneider's definitions of data contamination, data suppression, program suppression and program transmission. The authors then propose their own definition of program integrity for probabilistic specifications. These definitions are based on conditional mutual information and entropy; they present a result relating them to mutual information, which can be calculated by a number of e
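Since the integrity measures here reduce to (conditional) mutual information and entropy, a small self-contained mutual-information computation may help fix ideas (a generic illustration, not the authors' Markov chain semantics):

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ),
    with the joint distribution given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A noiseless 1-bit channel transmits exactly 1 bit:
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```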
46

Cates, Michael E., and Vinothan N. Manoharan. "Celebrating Soft Matter's 10th anniversary: Testing the foundations of classical entropy: colloid experiments." Soft Matter 11, no. 33 (2015): 6538–46. http://dx.doi.org/10.1039/c5sm01014d.

Full text
47

Anttila, Roope. "Local Entropy and Lq-Dimensions of Measures in Doubling Metric Spaces." PUMP Journal of Undergraduate Research 3 (November 4, 2020): 226–43. http://dx.doi.org/10.46787/pump.v3i0.2434.

Full text
Abstract:
We define restricted entropy and Lq-dimensions of measures in doubling metric spaces and show that these definitions are consistent with the monotonicity of Lq-dimensions. This provides a correct proof for a theorem considering the relationships between local entropy and Lq-dimensions in a paper by Käenmäki, Rajala and Suomala, the original proof of which makes use of a slightly erroneous proposition.
48

Vershynina, Anna. "Quantum coherence, discord and correlation measures based on Tsallis relative entropy." Quantum Information and Computation 20, no. 7&8 (2020): 553–69. http://dx.doi.org/10.26421/qic20.7-8-2.

Full text
Abstract:
Several ways have been proposed in the literature to define a coherence measure based on Tsallis relative entropy. One of them is defined as a distance between a state and a set of incoherent states with Tsallis relative entropy taken as a distance measure. Unfortunately, this measure does not satisfy the required strong monotonicity, but a modification of this coherence has been proposed that does. We introduce three new Tsallis coherence measures coming from a more general definition that also satisfy the strong monotonicity, and compare all five definitions between each other. Using three c
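For orientation, the classical Tsallis relative entropy underlying these quantum measures can be sketched as follows (a standard textbook form used for illustration, not any of the five coherence measures compared in the paper):

```python
def tsallis_relative_entropy(p, r, q):
    """Classical Tsallis relative entropy for q != 1:
    D_q(p||r) = (sum_i p_i^q * r_i^(1-q) - 1) / (q - 1).
    It is non-negative, vanishes iff p == r, and recovers the
    Kullback-Leibler divergence in the limit q -> 1."""
    assert q != 1
    s = sum(pi ** q * ri ** (1 - q) for pi, ri in zip(p, r) if pi > 0)
    return (s - 1) / (q - 1)

p = [0.5, 0.5]
r = [0.9, 0.1]
print(tsallis_relative_entropy(p, p, 2.0))  # 0.0
print(tsallis_relative_entropy(p, r, 2.0))  # > 0
```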
49

Barón-Hernández, José Alejandro, José Alfonso Baños-Francia, Peter Rijnaldus Wilhelmus Gerritsen, and Sandra Quijas. "Towards a Conceptual Approach on the Connections of Urban Metabolism and Entropy with the Human Habitat." World 5, no. 4 (2024): 1101–19. http://dx.doi.org/10.3390/world5040055.

Full text
Abstract:
The complexity of urban areas has motivated the search for integrative approaches. This paper addresses three topics—human habitat, urban metabolism, and urban entropy—to explore their links within the context of urban territory and sustainability. The lack of approaches, outlooks, and synergies motivates the search for an integrated conceptual framework, which originated as a review of published works contributing an interdisciplinary and multiscale outlook. From reviewing 41 articles, published from 1960 to 2020, definitions were extracted, original concepts were identified, synthetic defi
50

KOČAN, ZDENĚK, VERONIKA KORNECKÁ-KURKOVÁ, and MICHAL MÁLEK. "Entropy, horseshoes and homoclinic trajectories on trees, graphs and dendrites." Ergodic Theory and Dynamical Systems 31, no. 1 (2010): 165–75. http://dx.doi.org/10.1017/s0143385709001011.

Full text
Abstract:
It is known that the positiveness of topological entropy, the existence of a horseshoe and the existence of a homoclinic trajectory are mutually equivalent, for interval maps. The aim of the paper is to investigate the relations between the properties for continuous maps of trees, graphs and dendrites. We consider three different definitions of a horseshoe and two different definitions of a homoclinic trajectory. All the properties are mutually equivalent for tree maps, whereas not for maps on graphs and dendrites. For example, positive topological entropy and the existence of a homocl