Entropy defect: Algebra and thermodynamics

We investigate how the entropy of a system can be partitioned into the entropies of its constituents in a manner consistent with thermodynamics. This partitioning is described through the concept of the entropy defect, which measures the missing entropy between the sum of the entropies of a system's constituents and the entropy of the combined system; this decrease of entropy corresponds to the order induced by the additional long-range correlations developed among the constituents of the combined system. We conclude that the most generalized addition rule is the one characterizing the kappa entropy; when the system resides in stationary states, the kappa entropy becomes the one associated with kappa distributions, while, in general, this entropy applies more broadly, in stationary or nonstationary states. Moreover, we develop the specific algebra of the addition rule with entropy defect. The addition rule forms a mathematical group on the set of any measurable physical quantity (e.g., entropy). Finally, we use these algebraic properties to restate the generalized zeroth law of thermodynamics so that it applies to nonstationary as well as stationary states: if a body C measures the entropies of two other bodies, A and B, then their combined entropy is measured as the connected A and B entropy, where the entropy defect is involved in all measurements.

Introduction.-Analogous to the mass defect that arises when nuclear particle systems are assembled, the entropy defect measures the decrease of the entropy of a system compared to the sum of its constituents' entropies, caused by the order induced in the system through the additional correlations among its constituents [1-5].
The entropy of a system composed of two subsystems A and B is given as a function of their entropies, S_A and S_B, respectively. This property is called composability and is fundamental to the foundation of thermodynamics. The composed entropy must be a function of (macroscopic) thermodynamic properties of the constituents. These properties include the entropy and thermal variables. The latter are well defined and determined only at stationary states. Then, for describing the general case of stationary or nonstationary states, we are left only with the entropy. Thus, the entropy of the composed system, denoted by A ⊕κ B, is S_A⊕κB = f(S_A, S_B). The symbol ⊕κ refers to the κ-addition (see next section). The kappa entropy is related to the special function

f(x, y) = x + y − (1/κ)·x·y,    (1)

where the involved parameter kappa characterizes the kappa entropy; in the case of stationary states, this kappa entropy is associated with kappa distributions [6-9]. As shown in [10-12], the only physically meaningful entropic form associated with the addition rule in eq. (1) is the kappa entropy.

(a) E-mail: glivadiotis@princeton.edu (corresponding author)

Kappa distributions emerge within the framework of statistical mechanics by maximizing the associated kappa entropy under the constraints of the canonical ensemble (e.g., [7,13]). Nevertheless, the entropy maximization cannot be considered the thermodynamic origin of kappa distributions. Both the entropic and distribution functions can be equivalently derived from each other. There is a variety of mechanisms that can generate kappa distributions in particle systems [14]. Some examples are: superstatistics [15-25], shock waves [26], turbulence [27-30], pickup ions that decrease the entropy of the system and stabilize it at lower kappa [4,31,32], velocity diffusion and other acceleration mechanisms [33], colloidal particles [34], radiation in plasmas [35], polytropic behaviour [4,9,36,37], etc. Even though a number of mechanisms exist with the potential to generate kappa distributions, and thus explain the presence of these distributions in systems such as space plasmas, there is a unique thermodynamic origin for all of them: the existence of correlations among the particles. First, independent analyses confirmed the existence of correlations induced by the presence of kappa distributions in the characterization of particle velocities [9,13,38-41]; second, the reverse and by far more difficult analyses confirmed that the existence of correlations can be connected only with the special formulation of kappa distributions [2,42,43]. The question still remains: What is the thermodynamic origin of the kappa distribution, or, equivalently, of its associated entropy?
The origin of the most generalized entropy capable of describing particle systems must be based on first principles of thermodynamics. In particular, it is interwoven with the possible ways the entropy of a system partitions into the entropies of the system's constituents, that is, with finding the most general, thermodynamically consistent function f(x, y) involved in the addition rule of eq. (1).
The most general form can be developed in at least three different and independent ways:

a) Existence of a stationary state [12,44]. The existence of stationary states is only possible when the entropy addition is most generally formulated as

S_A⊕κB = S_A + S_B − (1/κ)·H(S_A)·H(S_B),    (2)

where we call the involved function, H = H(S), the "partitioning function". This function is not uniquely determined. However, it must be characterized by the following properties: i) positive; ii) monotonically increasing; iii) the reduction in entropy (S_D) is positive, i.e., S_D(S_A, S_B) > 0, for any S_A and S_B; and iv) if H is kappa independent, then it coincides with the identity function, H(S) = S; if it is kappa dependent, i.e., H = H(S; κ), then, at the classical limit κ → ∞, it must restore the identity function H(S) = S (as shown in [12]).

b) Trace-free entropic function of probabilities [11]. Given a probability distribution in a discrete form (with the subscript i = 1, 2, . . ., W counting the energy states), the statistical definition of entropy is given by S(p_1, . . ., p_W). If this is in trace-free form, it can be written as S({p}) = Σ_i Φ(p_i). The derivative of the entropy (with respect to a probability p) then has the same functional form for any state i, i.e., ∂S/∂p_i = Φ′(p_i); only then does the entropy maximization lead to a probability characterized by the same functional form for any energy state, i.e., p_i = F(ε_i), for all i's. Under this assumption, [11] showed that the entropy addition rule follows eq. (1). Consequently, the requirement comes again to the existence of stationarity, where the probability distribution is expressed in terms of energy states.

c) Entropy defect [1,3]. The concept of entropy defect does not require any assumption about the stationarity of the system (as in a)), or about the statistical representation of the entropy formulation with respect to the probability distribution (as in b)). We have shown that the entropy addition rule with entropy defect that follows eq. (1) is consistent with thermodynamics [1-5] (fig. 1).
The concept of entropy defect and its connection with thermodynamics is revisited in this letter. The entropy defect is used to investigate the possible ways the entropy of a system partitions into the entropies of the system's constituents, and is interwoven with the origin of the most generalized entropy formulation, which must be based on first principles of thermodynamics. In this letter, we investigate all the possible addition rules characterizing the most general formulation of the entropy of a system that partitions into the entropies of its constituents. We follow the principles of the entropy defect, because this concept does not require the existence of stationarity or a particular statistical representation of the entropy formulation with respect to the probability distribution. We then determine which of the possible addition rules are consistent with thermodynamics and demonstrate that the most general is the known addition rule that characterizes the kappa entropy, that is, the entropy associated with kappa distributions. Moreover, this letter: i) develops the algebra of this addition rule, which forms a mathematical group on the set of any measurable physical quantity (e.g., entropy); ii) investigates its physical properties with regard to stationary and nonstationary states; and, finally, iii) generalizes the zeroth law of thermodynamics to cover stationary and nonstationary states alike.
Properties of the addition rule with entropy defect, the κ-addition.-The entropy defect S_D involved in the general entropy partition equation, or addition rule, S_A⊕κB = S_A + S_B − S_D(S_A, S_B), is characterized by the following three fundamental properties: 1) separability, i.e., the defect factorizes into a product of functions of the constituents' entropies; 2) symmetry, i.e., S_D(S_A, S_B) = S_D(S_B, S_A); and 3) upper boundedness, i.e., the existence of an upper limit of any entropy value, S ≤ κ, where the limit, κ, defines the magnitude of the interconnectedness (e.g., the particle correlations) between the system's constituents. An example of interconnectedness is the correlation characterizing space plasma particle velocities, where the correlation coefficient equals ρ = κ_min/κ, with the minimum value of kappa, κ_min, determined by half the degrees of freedom, d/2 [39,45].
It was shown that the concept of entropy defect and its exact formulation can be entirely constructed axiomatically, based on these three properties. Indeed, [3] demonstrated that the properties can be inferred from reasonable physical considerations and the connection of entropy with thermodynamics. Alternatively, they can also be used as the basic axioms to define and describe the concept of entropy defect.
The first two properties of the entropy defect, i.e., separability and symmetry, are familiar from the classical understanding of thermodynamics. Indeed, 1) initial independence and 2) interchangeability of systems are required for separability and symmetry, respectively. However, the third property, that is, the existence of an upper bound, adds an entirely new feature to the framework of thermodynamics. The physical rationale is that the larger the entropy, the larger the entropy defect; but the entropy defect is bounded by the system's entropy, and thus the entropy must also be bounded.
Given the existence of an upper limit, s_max, saturation of the addition rule at this limit (S_A = S_B = S_A⊕κB = s_max) implies the proportionality s_max ∝ g(s_max)², redefining the involved s_max-terms as the value of kappa, g(s_max)²/s_max ≡ κ.
Finally, the combination of the subsystems A and B into the composed system with correlations, A ⊕κ B, follows the addition rule denoted by ⊕κ, simply called the κ-addition, recovering the standard addition in the case of no correlations, κ → ∞: S_A⊕∞B = S_A + S_B. Note that similar entropy additions and aspects of their algebra were previously considered [46-49], but here we investigate their physical perspective and consequences in thermodynamics.
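The κ-addition and its classical limit can be illustrated with a short numerical sketch (not from the paper; the function names are ours), using the identity form g(S) = S of eq. (1):

```python
# Sketch: the kappa-addition of eq. (1),
# S_{A⊕κB} = S_A + S_B - (1/kappa) * S_A * S_B, with g the identity.
def kappa_add(sa, sb, kappa):
    """Combine two entropies under the entropy-defect addition rule."""
    return sa + sb - (sa * sb) / kappa

# The entropy defect is the gap between the plain sum and the combined entropy.
def entropy_defect(sa, sb, kappa):
    return sa + sb - kappa_add(sa, sb, kappa)

# Classical limit: as kappa grows, the defect vanishes and the
# standard addition S_A + S_B is recovered.
print(kappa_add(2.0, 3.0, 10.0))   # 4.4 (defect of 0.6)
print(kappa_add(2.0, 3.0, 1e12))   # ~5.0: standard addition
```

The defect (1/κ)·S_A·S_B is always positive for positive entropies, so the combined entropy is always below the plain sum, as required.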
The inverse operation: κ-subtraction.-The inverse operation of the κ-addition, that is, the κ-subtraction, physically means the deduction of some entropy from a system, characterized by correlations of magnitude 1/κ.
We denote the subtraction of S_B from S_A by S_A⊖κB, which physically means the remaining entropy once the entropy S_B has been removed from S_A. The difference of the κ-subtraction, S_A⊖κB, can be determined through the inverse operation, namely, S_A⊖κB equals the amount of entropy that has to be added (κ-addition) to the entropy S_B to get the entropy S_A,

S_(A⊖κB)⊕κB = S_A,    (5)

recovering the standard subtraction, S_A⊖∞B = S_A − S_B, in the case of no correlations, κ → ∞. The entropy S_A⊖κB constitutes the difference of the entropies S_A and S_B through the κ-subtraction, and is implicitly expressed by eq. (5). For instance, in the case where g is the identity function, g(x) = x, we have S_A⊖κB = (S_A − S_B)/(1 − S_B/κ). For a general g(x), S_A⊖κB can be determined by solving eq. (5) (i.e., finding the root via a typical iteration technique).
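A minimal sketch of the κ-subtraction (our own illustration, assuming the identity case g(x) = x, where eq. (5) admits the closed form above):

```python
# Sketch: kappa-subtraction as the inverse of the kappa-addition.
# For g(x) = x, solving x + S_B - (1/kappa)*x*S_B = S_A for x gives
# the closed form below; for a general g, one would instead find the
# root of eq. (5) numerically (e.g., by bisection).
def kappa_add(sa, sb, kappa):
    return sa + sb - (sa * sb) / kappa

def kappa_sub(sa, sb, kappa):
    """Entropy remaining once S_B is removed from S_A (g = identity)."""
    return (sa - sb) / (1.0 - sb / kappa)

# The subtraction cancels the addition: (A ⊕κ B) ⊖κ B = A.
kappa = 10.0
sa, sb = 2.0, 3.0
combined = kappa_add(sa, sb, kappa)
print(kappa_sub(combined, sb, kappa))  # ≈ 2.0, recovering S_A
```

For κ → ∞ the denominator tends to 1 and the ordinary difference S_A − S_B is recovered.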
The κ-subtraction cancels the κ-addition (inverse operation), i.e., S_(A⊕κB)⊖κB = S_A. This physically meaningful property is sufficient for the difference S_A⊖κB to be uniquely defined by eq. (5) (in other words, for eq. (5) to have just one root S_A⊖κB). Indeed, if we substitute S_A with S_A⊕κB in eq. (5), we obtain S_((A⊕κB)⊖κB)⊕κB = S_A⊕κB, which is identical in form to eq. (4); so surely one solution is S_(A⊕κB)⊖κB = S_A, while there can be no other roots, since for those roots it would be S_(A⊕κB)⊖κB ≠ S_A.
Properties of the function g. - We develop the properties of the function g involved in the κ-addition.
1) Zero at zero. No entropy defect for no entropy, i.e., it must be S_A⊕κB = S_A when S_B = 0; thus, g(0) = 0.
2) Slope 1 at zero, g(σ) ≅ σ. We set S_B → σ, S_A → S, and dS ≡ S ⊕κ σ − S; then, the addition rule of these entropies gives S ⊕κ σ = S + σ − (1/κ)·g(S)·g(σ); for small entropy σ, g(σ) ≅ g′(0)·σ, hence

dS = [1 − (g′(0)/κ)·g(S)]·dS_∞,  with dS_∞ ≡ σ,

where S_∞ ≡ S(κ → ∞). Then, by integrating, we find the relationship that connects the entropy of a system S, characterized by correlations and kappa value κ, with the entropy of the system as if there were no correlations, S_∞. In the whole analysis, g′(0) always appears as a fraction with kappa, g′(0)/κ; due to this universality, we can set g′(0)/κ ≡ 1/κ, redefining kappa, or g′(0) = 1.
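The integration above can be checked numerically. In the identity case g(S) = S (identified later in the text; our assumption here), the relation dS = (1 − S/κ) dS_∞ integrates to S = κ·(1 − exp(−S_∞/κ)), which a simple Euler integration reproduces:

```python
import math

# Sketch: integrate dS = (1 - S/kappa) dS_inf from S = 0, assuming
# the identity form g(S) = S, and compare with the closed form
# S = kappa * (1 - exp(-S_inf/kappa)).
def integrate_S(s_inf_total, kappa, steps=200000):
    """Forward-Euler integration of dS = (1 - S/kappa) dS_inf."""
    s, ds_inf = 0.0, s_inf_total / steps
    for _ in range(steps):
        s += (1.0 - s / kappa) * ds_inf
    return s

kappa, s_inf = 5.0, 3.0
closed_form = kappa * (1.0 - math.exp(-s_inf / kappa))
print(integrate_S(s_inf, kappa), closed_form)  # both ~2.256
```

Note that S < S_∞ always, and S saturates at κ as S_∞ grows, consistent with the upper boundedness discussed below.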

3) Monotonically increasing function. There is an "entropic cause-effect": zero entropy causes zero entropy defect, but also, the larger the entropy, the larger the entropy defect; thus, g(S) is a monotonically increasing function, g′(S) > 0.
The above properties help to identify the actual form of g.
Identification of the function g. - Here we determine the function g, which is involved in the entropy defect formulation shown in eq. (4). We will show that it is given by the identity function, g(x) = x. First, [3] showed the form of the function g(x) together with the existence of an entropy upper limit. In summary, given S_A⊕κB ≥ S_A, we find that g(S_A) ≤ κ·S_B/g(S_B). The limit of S_A must be independent of S_B, which is true only for g being the identity function, g(S) = S (see next section).
Interestingly, the form of the function g(x) can also be found from the entropic boundary. The lower limit of an entropy value is half the degrees of freedom, S ≥ (1/2)d; this can be easily deduced from the form of the Sackur-Tetrode equation of entropy for d degrees of freedom [3]. Then, setting S_A = S_B = (1/2)d, the combined system has entropy S_A⊕κB = d in the absence of correlations (κ → ∞), or S_A⊕κB ≤ d in the presence of correlations (κ < ∞); applying eq. (4) and requiring that the combined entropy respect the lower limit, S_A⊕κB = d − (1/κ)·g((1/2)d)² ≥ (1/2)d, we have

g((1/2)d)²/((1/2)d) ≤ κ,

which includes the equality as a limiting case. Therefore, the left-hand function of (1/2)d, i.e., g((1/2)d)²/((1/2)d), represents exactly the minimum value of kappa, which is also (1/2)d (as shown by [39,45]); this is satisfied by the identity function, g(S) = S. The strict proof for the form of the function g comes from the requirement of a consistent constitution of a system with correlations among its constituents. The entropy of combining A and B should be the same when combining A ⊕κ c with B ⊖κ c, with c representing any elementary constituent with infinitesimal entropy σ (see fig. 2).
Two equivalent representations of the whole system are A ⊕κ B and (A ⊕κ c) ⊕κ (B ⊖κ c); the entropy of the system can be derived from either of the two representations:

S_(A⊕κc)⊕κ(B⊖κc) = S_A⊕κB.    (10)

For the involved entropies in eq. (10), we obtain the following expansions to first order in σ. The κ-addition gives S_A⊕κc = S_A + σ − (1/κ)·g(S_A)·g(σ), hence

S_A⊕κc ≅ S_A + σ·[1 − (1/κ)·g(S_A)],    (11a)

while, to the same order, the κ-subtraction gives

S_B⊖κc ≅ S_B − σ·[1 − (1/κ)·g(S_B)].    (11b)

Substituting eqs. (11a) and (11b) in eq. (10), we obtain S_(A⊕κc)⊕κ(B⊖κc) = S_A⊕κB + σ·I(A,B), with

I(A,B) = [1 − (1/κ)·g(S_A)]·[1 − (1/κ)·g′(S_A)·g(S_B)]    (12a)
       − [1 − (1/κ)·g(S_B)]·[1 − (1/κ)·g(S_A)·g′(S_B)].    (12b)

Requiring S_(A⊕κc)⊕κ(B⊖κc) = S_A⊕κB, hence I(A,B) = 0, we obtain J(S_A) = J(S_B), with

J(S) ≡ [1 − g′(S)·(1 − g(S)/κ)]/g(S),    (13)

where the only way eq. (13) may hold for arbitrary S_A and S_B is when the function J(S) is a constant; this is noted by const ≡ c·(1/κ). Hence, we find the 1st-order differential equation

g′(S)·[1 − g(S)/κ] = 1 − (c/κ)·g(S),    (14)

which is consistent with the properties of g when c = 1, i.e., g′(S) = 1, or g(S) = S, i.e., the identity function.
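The constitution argument can be verified numerically. With g the identity, moving an elementary part c from B to A before combining leaves the combined entropy unchanged, and this holds exactly, not only to first order in σ (a minimal sketch of our own, not from the paper):

```python
# Numerical check of eq. (10): with g(S) = S, combining A and B
# directly agrees with first moving a part c (entropy sigma) from B
# to A, for any finite sigma.
def kappa_add(x, y, kappa):
    return x + y - x * y / kappa

def kappa_sub(x, y, kappa):
    return (x - y) / (1.0 - y / kappa)

kappa, sa, sb, sigma = 10.0, 2.0, 3.0, 0.25
direct = kappa_add(sa, sb, kappa)
moved = kappa_add(kappa_add(sa, sigma, kappa),
                  kappa_sub(sb, sigma, kappa), kappa)
print(abs(direct - moved) < 1e-12)  # True
```

The exactness follows from the product form 1 − S_A⊕κB/κ = (1 − S_A/κ)(1 − S_B/κ), under which the moved factors of c cancel identically.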
Boundedness.-The existence of an upper limit of entropy can be shown in the case of an arbitrary function g in the entropy defect, S_D = (1/κ)·g(S_A)·g(S_B). Adding an elementary constituent of a small amount of entropy σ into a system of entropy S (the cause) generates additional correlations and the entropy defect (1/κ)·g(S)·σ (the effect). In entropy units, the effect is less than or equal to the cause, namely, the entropy defect is part of the added entropy, S_D ≤ σ. In general, adding entropy S_B to S_A causes the entropy defect S_D = (1/κ)·g(S_A)·g(S_B) ≤ S_B. This entropic cause-and-effect property is equivalent to the monotonically increasing entropy of systems, i.e., S_A⊕κB ≥ S_A or S_A⊕κB ≥ S_B. This leads to the upper limit of entropy: g(S_A) ≤ κ·S_B/g(S_B), where the limit of S_A must be independent of S_B, which is true only for g(S) = S, and thus, S_A ≤ κ.
The inequality S_A⊕κB ≥ S_A is consistent with the subtraction in eq. (5). In particular, in the subtraction S_C⊖κB, all the involved entropies are non-negative, S_B ≥ 0, S_C ≥ 0, S_C⊖κB ≥ 0, while S_C ≥ S_C⊖κB; then, setting C as A ⊕κ B, we end up with S_A⊕κB ≥ S_A.
There are several other ways to show the upper limit of entropy, once it is established that the function involved in the entropy defect is the identity, g(S) = S. These are as follows. In [3], we provided the summation of the entropies S_A and S_B as S_A⊕κB/(S_A·S_B) = 1/S_A + 1/S_B − 1/κ, and argued that if S_A, for instance, is unbounded, then it may be large enough to have 1/S_A → 0, and then 1/S_B − 1/κ ≥ 0, or S_B ≤ κ, because S_A⊕κB ≥ 0. Hence, even though S_A and S_B are the entropies of originally independent systems, the entropy of the one system has been bounded (S_B ≤ κ) because of the unbounded value of the other system (1/S_A → 0); the only way the systems may have entropies that behave independently is for all to share the same upper limit, h_max; indeed, substituting S_B → h_max and S_A⊕κB → h_max in S_A⊕κB = S_A + S_B − (1/κ)·S_A·S_B, we find h_max = κ. Here, we provide yet another way of showing the existence of an upper limit, and that is S ≤ κ, again for g(S) = S. Any entropy S can be separated into two equal parts, S_1/2, whose addition rule under the entropy defect gives S = 2·S_1/2 − (1/κ)·S_1/2², described by the parabola y = 2x − x², where y ≡ (1/κ)·S and x ≡ (1/κ)·S_1/2. Independently of the value of x, there is an upper limit of the value of y, y ≤ 1, or of the values of the system's entropy, S ≤ κ.
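The boundedness can also be seen dynamically: repeatedly κ-adding entropy approaches, but never exceeds, the limit κ (a small sketch of ours, with g(S) = S):

```python
# Sketch of the boundedness argument: keep injecting one unit of
# entropy via the kappa-addition; the total approaches kappa but
# never exceeds it (here g(S) = S).
def kappa_add(x, y, kappa):
    return x + y - x * y / kappa

kappa, s = 10.0, 0.0
for _ in range(1000):
    s = kappa_add(s, 1.0, kappa)
print(s <= kappa + 1e-9, abs(s - kappa) < 1e-6)  # True True
```

In the product form, each step multiplies 1 − s/κ by the fixed factor 1 − 1/κ < 1, so s converges geometrically to κ from below.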
Algebra of the κ-addition.-A quantity X has the addition rule of entropy (with entropy defect) if

X_A⊕κB = X_A + X_B − (1/κ)·X_A·X_B,    (15)

where the sign ⊕κ denotes the specific addition rule in eq. (15), called the κ-addition. As we will show, this addition, operating on the values of X, constitutes an Abelian group, in which the law of composition is commutative.
The κ-addition of any values of the quantity X, such as X_A and X_B in eq. (15), constitutes another value of the quantity X, that is, the one that characterizes the combined system A ⊕κ B. Here, we derive the algebra characterizing the operation of the κ-addition and its basic algebraic properties. Furthermore, we simplify our approach by i) considering dimensionless quantities, that is, the quantity X normalized to its maximum κ, x ≡ X/κ, and ii) keeping only the systems' symbols, i.e., writing A ⊕κ B for x_A⊕κB.
Hence, we have the following algebraic properties: 1) Consistent range of values: 0 ≤ x_A, x_B ≤ 1 implies 0 ≤ x_A⊕κB ≤ 1. 2) Operation between any two elements (κ-addition): x_A⊕κB = x_A + x_B − x_A·x_B, which can be expressed in the convenient product form 1 − x_A⊕κB = (1 − x_A)·(1 − x_B). 3) Associative property of addition: (A ⊕κ B) ⊕κ C = A ⊕κ (B ⊕κ C). 4) Identity (or zero) element: A ⊕κ 0 = A, where zero entropy also means zero entropy defect. 5) Inverse element: A ⊕κ A⁻¹ = 0, where the κ-addition between an element and its inverse gives zero. Hence, the inverse is given by x_A⁻¹ = −x_A/(1 − x_A). 6) Commutative (or Abelian) property of addition: A ⊕κ B = B ⊕κ A. The above properties of associativity, the existence of an identity element, and an inverse of every element define the group, while the commutativity means that it is Abelian. Hence, the operation of the addition including the entropy defect, acting upon the values of X (e.g., entropies), is an Abelian group. 7) Subtraction, i.e., addition with the inverse element: x_A⊖κB = x_A ⊕κ x_B⁻¹ = (x_A − x_B)/(1 − x_B). 8) Combination of subtraction and addition is the identity: (A ⊖κ B) ⊕κ B = A. 9) Constitution of systems: Take the combined system of A and B, according to eq. (17a). Then, any part C of the combined system A ⊕κ B can be thought of as being taken out of B and added to A before A and B are finally combined: A ⊕κ B = (A ⊕κ C) ⊕κ (B ⊖κ C). (This can be shown by using the previous properties.) 10) Connection between systems A and B: The connection and exchange of information between systems with regard to a physical quantity X requires the comparison of the respective values of this quantity. The connection of A to B means measuring X of B from A, that is, finding X_B⊖κA, or, in short, B ⊖κ A.
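The group axioms above are easy to verify numerically for the normalized κ-addition x ⊕ y = x + y − x·y (a sketch of ours; the check of associativity rests on the product form 1 − (x ⊕ y) = (1 − x)(1 − y)):

```python
# Sketch: checking the Abelian-group properties of the normalized
# kappa-addition x ⊕ y = x + y - x*y.
def oplus(x, y):
    return x + y - x * y

def inverse(x):
    return -x / (1.0 - x)   # satisfies x ⊕ inverse(x) = 0

a, b, c = 0.2, 0.5, 0.7
assoc = abs(oplus(oplus(a, b), c) - oplus(a, oplus(b, c))) < 1e-12
commut = oplus(a, b) == oplus(b, a)
ident = oplus(a, 0.0) == a
inv = abs(oplus(a, inverse(a))) < 1e-12
print(assoc, commut, ident, inv)  # True True True True
```

Note that the inverse element is negative for 0 < x < 1; the group closes on an extended set of values, while physical (positive, bounded) entropies occupy the range 0 ≤ x ≤ 1.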
Properties of the connection measurements: a) Symmetry. The measurements of B from A and of A from B are inverses of each other: (B ⊖κ A) ⊕κ (A ⊖κ B) = 0. b) Transitivity. A measurement of B from A is the sum (κ-addition) of measurements over any transit points between A and B (i.e., path independence). For a transit point C: B ⊖κ A = (B ⊖κ C) ⊕κ (C ⊖κ A).

Zeroth law of thermodynamics: generalization for stationary and nonstationary states.-The zeroth law of thermodynamics states (e.g., [50,51]) that: If a body C is in thermal equilibrium with two other bodies, A and B, then A and B are in thermal equilibrium with one another. This has been broadened to hold for the generalized thermal equilibrium described by kappa distributions [31]: If a body C is in a (thermal) stationary state with two other bodies, A and B, then A and B are in the same (thermal) stationary state with one another.
The zeroth law of thermodynamics is naturally a transitive thermodynamic property of systems (a relation R on a set X is transitive if, for all elements A, B, C in X, whenever R relates A to B and B to C, then R also relates A to C). In addition, the zeroth law of thermodynamics is also a symmetric property: if A is in a (generalized) thermal equilibrium with B, then B is in the same (generalized) thermal equilibrium with A.
The transitive and symmetric properties of the zeroth law of thermodynamics clearly raise the matter of the connection between various systems: the way that C is connected to A and B provides all the information on how A and B are connected. If, for instance, C is in a stationary state with A and with B, then A and B are in a stationary state with each other. Stationarity is a typical example; (thermal) stationary states are obtained when the entropy is maximized and is thus invariant with time.
Here we use the algebra of the addition rule with entropy defect in order to revisit the transitive and symmetric properties, and, finally, to restate the zeroth law of thermodynamics without the limitation of stationarity.
We consider two systems, having entropies S_A and S_B, which can be connected directly or through any number of transit points. The connection of A to B means having measured the value of the entropy of B from A, that is, finding B ⊖κ A, or S_B,A ≡ S_B ⊖κ S_A. The two properties characterizing the connection measurements are 1) symmetry and 2) transitivity (path independence), i.e.,

S_B,A ⊕κ S_A,B = 0  and  S_B,A = S_B,C ⊕κ S_C,A,    (26a, 26b)

respectively, which are expanded, for the identity function g(S) = S, to

S_B,A = (S_B − S_A)/(1 − S_A/κ),

and classically (for κ → ∞) recover the plain entropy difference, S_B,A = S_B − S_A. For stationary states, the zeroth law is thus recovered, securing its validity for any stationary state (with finite or infinite kappa). Most importantly, the properties of symmetry and transitivity hold and connect the entropies of systems for stationary as well as nonstationary states. Therefore, the zeroth law of thermodynamics can be restated in terms of the entropy difference: If a body C is connected with two other bodies, A and B, and measures no entropy difference between A and B, then, if A and B are connected together, there will be no entropy difference between them.
In the above, the case of zero entropy difference applies to stationary states. We can now provide the most generalized version of the zeroth law of thermodynamics, holding in any stationary or nonstationary states (fig. 3): If a body C measures the entropies of two other bodies, A and B, then their combined entropy is measured as the connected A and B entropy, where the entropy defect is involved in all measurements.
Hence, if a body C is connected with two other bodies, A and B, and measures the entropies S_A,C and S_B,C, respectively, then, if A and B are connected together, they will measure an entropy given by S_B,A = S_B,C ⊖κ S_A,C and S_A,B = S_A,C ⊖κ S_B,C. This modern version of the zeroth law is complete, as it i) characterizes constitutions of bodies or systems in connections (thus, information can be transferred), ii) is in accordance with both the symmetry and transitivity properties, iii) involves both stationary and nonstationary states, and iv) stresses the fact that the entropy difference between systems is independent of the transit points and paths connecting them.
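The restated zeroth law can be checked numerically: the direct A-to-B measurement coincides with the one composed from the two C-based measurements (a sketch of ours, using the identity form g(S) = S):

```python
# Sketch of the restated zeroth law: a reference body C measures
# S_{A,C} and S_{B,C}; combining these two measurements reproduces
# the direct A-to-B measurement (path independence).
def kappa_sub(x, y, kappa):          # S_{X,Y} = S_X ⊖κ S_Y
    return (x - y) / (1.0 - y / kappa)

kappa, s_a, s_b, s_c = 10.0, 2.0, 3.0, 1.0
direct = kappa_sub(s_b, s_a, kappa)                     # S_{B,A}
via_c = kappa_sub(kappa_sub(s_b, s_c, kappa),
                  kappa_sub(s_a, s_c, kappa), kappa)    # S_{B,C} ⊖κ S_{A,C}
print(abs(direct - via_c) < 1e-12)  # True
```

The entropy of the transit body C cancels identically, which is exactly the path-independence stated in item iv) above.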
Conclusions.-In this study, we investigated the possible ways the entropy of a system partitions into the entropies of its constituents that are consistent with thermodynamics. For this purpose, we employed the concept of the entropy defect. The entropy defect measures the missing entropy between the sum of the entropies of a system's constituents and the entropy of the combined system. This decrease of entropy corresponds to the order induced by the additional long-range correlations developed among the constituents of the combined system. We followed the path of the entropy defect, since it is unique in its i) simplicity, ii) implementation in fundamental physics, and iii) lack of assumptions; e.g., it does not require the existence of stationarity or special entropy formulations.
The generalized entropy formulation of systems with correlations is interwoven with the addition rule of the entropy partition. We concluded that the most generalized addition rule is the one characterizing the kappa entropy, i.e., the entropy associated with kappa distributions.
We developed the specific algebra of the addition rule with entropy defect. The operation of the addition rule forms a mathematical group on the set of any measurable physical quantity (e.g., entropy). Finally, the algebraic properties have been used to restate the generalized zeroth law of thermodynamics. This is described through both the symmetric and transitive properties that characterize the entropies among systems' connections in both stationary and nonstationary states. A concise expression of this law was given by the entropy difference among connections. This modern adaptation of the zeroth law stresses the fact that the entropy difference between systems is independent of the transit points and paths connecting them.
With the results of this study, the kappa entropy, the one associated with kappa distributions, constitutes the correct formulation to be used for describing particle velocities in space plasmas throughout the heliosphere and the inner heliosheath, and thus for analyzing data from the IBEX and IMAP missions.

Fig. 1: Mass vs. entropy defect. Analogous to the mass defect (MD) that quantifies the missing mass (energy) associated with assembling subatomic particles, the entropy defect (SD) quantifies the missing entropy (order) associated with assembling plasma particles or other elements characterized by correlations.

Fig. 2: Constitution of a system with correlations, assembled either from A and B (A ⊕κ B) or from A ⊕κ c (A plus the additional part c) and B ⊖κ c (B without c).

Fig. 3: The zeroth law of thermodynamics describes the entropy difference between connected systems, based on the two properties of symmetry and transitivity.