Abstract
The lack of knowledge that an observer has about a system limits the amount of work the observer can extract from it. This lack of knowledge is normally quantified using the Gibbs/von Neumann entropy. We show that this standard approach is, surprisingly, correct only in very specific circumstances. In general, one should use the recently developed smooth-entropy approach. For many common physical situations, including large but internally correlated systems, the resulting values for the extractable work can deviate arbitrarily from those suggested by the standard approach.
GENERAL SCIENTIFIC SUMMARY

Introduction and background. The less one knows about a thermodynamical system, the less work one can extract from it. The lack of knowledge is normally quantified by the von Neumann entropy. However, information theory has gone beyond the simplifying assumptions associated with the von Neumann entropy. It has recently been shown that in more general scenarios one should in fact use a different entropy measure, dubbed the smooth entropy. This reduces to the von Neumann entropy in certain limits, but can in general be arbitrarily different from it. As the von Neumann entropy is widely used not only in the information-theoretic setting, but also in thermodynamics, the smooth entropy approach may also have implications for the latter.
Main results. We show that one should in fact replace the von Neumann entropy with the smooth entropy in certain standard thermodynamical expressions. We thereby recover expressions that are more generally valid, in particular for systems with internal correlations. The von Neumann entropy is recovered in the limit of infinitely many systems without internal correlations, but in general it is inadequate for quantifying the relation between work and information and should be replaced by the smooth entropy.
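To make the "arbitrarily different" claim concrete, the following toy calculation (not from the paper; the distribution and function names are our own illustration) compares the Shannon/von Neumann entropy with the unsmoothed min-entropy for a classical distribution with one likely outcome and many unlikely ones. The smooth min-entropy, which governs single-shot work extraction, is close to this min-entropy for small smoothing; here we use the unsmoothed version for simplicity.

```python
import math

def shannon_entropy(p):
    """Shannon/von Neumann entropy (in bits) of a classical distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def min_entropy(p):
    """Min-entropy (in bits): -log2 of the largest probability.
    Illustrative stand-in for the smooth min-entropy at zero smoothing."""
    return -math.log2(max(p))

# Toy distribution: probability 1/2 on one outcome, the remaining 1/2
# spread uniformly over 2**20 - 1 outcomes.
n = 2**20 - 1
p = [0.5] + [0.5 / n] * n

H = shannon_entropy(p)   # ≈ 11 bits
Hmin = min_entropy(p)    # = 1 bit
```

By enlarging the tail (increasing `n`), the gap between the two entropies grows without bound, which is the sense in which the von Neumann entropy can misestimate the single-shot extractable work arbitrarily badly.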
Wider implications. This paper is a first step towards overcoming certain simplifying but ultimately limiting assumptions of statistical mechanics, and it shows that smooth entropies are a powerful tool for achieving this.