
Bounds on current fluctuations in periodically driven systems

Published 18 October 2018 © 2018 The Author(s). Published by IOP Publishing Ltd on behalf of Deutsche Physikalische Gesellschaft
Citation: Andre C Barato et al 2018 New J. Phys. 20 103023. DOI: 10.1088/1367-2630/aae512


Abstract

Small nonequilibrium systems in contact with a heat bath can be analyzed with the framework of stochastic thermodynamics. In such systems, fluctuations, which are not negligible, follow universal relations such as the fluctuation theorem. More recently, it has been found that, for nonequilibrium stationary states, the full spectrum of fluctuations of any thermodynamic current is bounded by the average rate of entropy production and the average current. However, this bound does not apply to periodically driven systems, such as heat engines driven by periodic variation of the temperature and artificial molecular pumps driven by an external protocol. We obtain a universal bound on current fluctuations for periodically driven systems. This bound is a generalization of the known bound for stationary states. In general, the average rate that bounds fluctuations in periodically driven systems is different from the rate of entropy production. We also obtain a local bound on fluctuations that leads to a trade-off relation between speed and precision in periodically driven systems, which constitutes a generalization to periodically driven systems of the so-called thermodynamic uncertainty relation. From a technical perspective, our results are obtained with the use of a recently developed theory of level 2.5 large deviations for Markov jump processes with time-periodic transition rates.

Original content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.

1. Introduction

Thermodynamics [1] is a major branch of physics concerned with the limits of operation of machines that transform heat into other forms of energy. This theory is limited to macroscopic systems such as a steam engine. However, the way heat and temperature relate to other forms of energy is also important for small nonequilibrium systems, such as molecular motors and colloidal heat engines. For such systems, thermal fluctuations are relatively large and cannot be ignored.

Stochastic thermodynamics [2] generalizes thermodynamics to small nonequilibrium systems. A major question that arises within this theoretical framework that takes fluctuations into account is: what are the universal relations that rule fluctuations in small nonequilibrium systems? The fluctuation theorem is one such relation [3–8]: it is a constraint on the probability distribution of entropy production that generalizes the second law of thermodynamics.

A more recent universal relation associated with such fluctuations is the thermodynamic uncertainty relation from [9]. This relation establishes that the precision of a thermodynamic current, such as the number of consumed ATP molecules or the displacement of a molecular motor, has a minimal universal energetic cost. Possible applications of the thermodynamic uncertainty relation include the inference of enzymatic schemes in single molecule experiments [10], a bound on the efficiency of molecular motors that depends only on fluctuations of the displacement of the motor [11], a universal relation between power and efficiency for heat engines in a stationary state [12], and design principles in nonequilibrium self-assembly [13].

The thermodynamic uncertainty relation is a consequence of a more general bound on the full spectrum of current fluctuations [14, 15]. Using large deviation theory [16–19], this bound is expressed as a parabola that lies above the so-called rate function, which quantifies the rate of exponentially rare events. A key feature of this parabolic bound is that it depends solely on the average entropy production and the average current, i.e., knowledge of the average entropy production and the average current implies a bound on arbitrary fluctuations of any thermodynamic current. There has been much recent work related to this universal principle about current fluctuations [20–37].

The parabolic bound applies to stationary states of Markov processes with time-independent transition rates. Physically, this situation corresponds to systems that are driven by fixed thermodynamic forces, e.g., molecular motors driven by the free energy of ATP hydrolysis. Another major class of thermodynamic systems away from equilibrium is that of periodically driven systems, which can be described as Markov processes with time-periodic transition rates. Two experimental realizations of periodically driven systems are Brownian heat engines [38] and artificial molecular pumps [39].

There is a fundamental difference with respect to fluctuations between systems driven by a fixed thermodynamic force and periodically driven systems. As shown in [40], for a periodically driven system, the energetic cost of precision of a thermodynamic current can be arbitrarily small, in stark contrast to systems driven by a fixed thermodynamic force, for which this precision has a minimal universal cost, as determined by the thermodynamic uncertainty relation. Hence, the parabolic bound from [14, 15] that depends on the average rate of entropy production does not apply to periodically driven systems. For the particular case of a time-symmetric protocol, a derivation of a thermodynamic uncertainty relation has been proposed in [29]. The relation between these two classes of nonequilibrium systems is also relevant for the mapping of artificial molecular machines, which are often driven by an external periodic protocol (see [41] for a counter-example), onto biological molecular motors, which are autonomous machines driven by ATP, as discussed in [42, 43].

In this paper, we obtain a universal bound on current fluctuations in periodically driven systems that is also parabolic. For the particular case of a current with increments that do not depend on time, such as internal net motion in a molecular pump, our bound depends on a single average rate. However, this average rate is different from the entropy production. For a constant protocol that leads to time-independent transition rates, our bound becomes an even more general bound than the known bound for stationary states from [14, 15]. A relevant technical aspect of our proof is as follows. The parabolic bound for stationary states has been proved in [15]. This proof uses a remarkable result for large deviations in Markov processes, namely the exact form of the rate function for level 2.5 large deviations for stationary states [44–47]. More recently, the rate function of level 2.5 large deviations for time-periodic transition rates has been obtained in [48]. We use this result to prove our bounds.

Similar to the parabolic bound for stationary states that implies the thermodynamic uncertainty relation, our global bound on large deviations leads to a trade-off relation between speed and precision in periodically driven systems. We obtain a tighter local bound on the rate function that leads to an improved trade-off relation between speed and precision. For the case of stationary states, this bound is also tighter than the bound determined by the thermodynamic uncertainty relation.

We also prove our results for the case of a cyclic stochastic protocol [40, 49, 50]. Such protocols are convenient to perform illustrative calculations with specific models. Furthermore, the proofs for stochastic protocols are a generalization of our results for deterministic protocols, since current fluctuations for a stochastic protocol with an infinitely large number of jumps are equivalent to current fluctuations for a deterministic protocol [50].

The paper is organized in the following way. In section 2 we define the basic mathematical quantities and physical models. In section 3, we introduce and illustrate our main results for the case of currents with time-independent increments. The bounds are derived in section 4. We conclude in section 5. The appendix contains the proofs for the case of a stochastic protocol.

2. Mathematical preliminaries and physical models

2.1. Markov processes with time-periodic transition rates and fluctuating observables

We consider a Markov jump process with a finite number of states Ω. The space of states is written as {1, 2, ..., Ω}. The transition rate from state i to state j at time t is denoted by wij(t). Since we are interested in periodically driven systems, these transition rates have a period τ, i.e., wij(t) = wij(t + τ). Furthermore, we assume that if ${w}_{{ij}}(t)\ne 0$ then ${w}_{{ji}}(t)\ne 0$.

The master equation that governs the time-evolution of Pi(t), the probability to be in state i at time t, reads

Equation (1)

In the long time limit, Pi(t) tends to an invariant time-periodic distribution πi(t) = πi(t + τ). An important quantity in this paper is the average elementary current

Equation (2)
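As a concrete numerical illustration (not part of the original article), the invariant time-periodic distribution πi(t) and the elementary current of equation (2) can be obtained by integrating the master equation (1) over many periods. The sketch below, in Python, assumes the elementary current has the standard form ${{ \mathcal J }}_{{ij}}(t)={\pi }_{i}(t){w}_{{ij}}(t)-{\pi }_{j}(t){w}_{{ji}}(t)$, consistent with the definition used in the appendix, and anticipates the driven-ring rates of section 2.4.1 as an example; the function names are ours.

```python
import numpy as np

def ring_rates(t, Omega=3, k=1.0, F0=2.0, tau=1.0):
    """Rate matrix w[i, j] at time t for the biased ring of section 2.4.1:
    clockwise rate k*exp(F(t)/Omega), counter-clockwise rate k."""
    F = F0 * np.cos(2.0 * np.pi * t / tau)
    w = np.zeros((Omega, Omega))
    for i in range(Omega):
        w[i, (i + 1) % Omega] = k * np.exp(F / Omega)  # clockwise jump
        w[(i + 1) % Omega, i] = k                      # reversed jump
    return w

def periodic_distribution(rates, Omega, tau, periods=300, steps=2000):
    """Euler integration of the master equation (1); returns pi[s, i],
    the distribution over one period after transients have decayed."""
    dt = tau / steps
    P = np.full(Omega, 1.0 / Omega)
    pi = np.zeros((steps, Omega))
    for m in range(periods):
        for s in range(steps):
            w = rates(s * dt)
            P = P + dt * (P @ w - P * w.sum(axis=1))   # gain minus loss terms
            if m == periods - 1:
                pi[s] = P
    return pi

pi = periodic_distribution(ring_rates, Omega=3, tau=1.0)
w0 = ring_rates(0.0)
J_12_t0 = pi[0, 0] * w0[0, 1] - pi[0, 1] * w0[1, 0]    # elementary current at t = 0
```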

Fluctuations can be analyzed if we consider stochastic variables that are defined as functionals of a stochastic trajectory ${({a}_{t})}_{0\leqslant t\leqslant m\tau }$, where mτ is the final time and m is an integer. This trajectory is a sequence of jumps and waiting times. If a jump takes place at time t, the state of the system before and after the jump is denoted by ${a}_{t}^{-}$ and ${a}_{t}^{+}$, respectively. Two basic fluctuating quantities are

Equation (3)

and

Equation (4)

where dt is an infinitesimal time interval and t ∈ [0, τ]. The empirical density ${\rho }_{i}^{(m)}(t)$ counts the fraction of periods with the system in state i at time t. The empirical flow ${C}_{{ij}}^{(m)}(t)$ counts the number of jumps per period from i to j at time t. Even though both quantities are functionals of the stochastic trajectory, to simplify notation, we do not keep the explicit dependence on ${({a}_{t})}_{0\leqslant t\leqslant m\tau }$. The fluctuating empirical current from state i to state j is given by

Equation (5)

The average in equation (2) is

Equation (6)

where the brackets denote an average over stochastic trajectories.

A generic current ${J}_{\alpha }^{(m)}$ is defined by its periodic increments αij(t), which are anti-symmetric, i.e., αij(t) = − αji(t), as

Equation (7)

where ${\sum }_{i\lt j}$ represents a sum over all pairs of states (i, j) with $i\lt j$ and with non-zero transition rates. The current in equation (7) can also be written in the form

Equation (8)

In stochastic thermodynamics, physical observables such as heat fluxes and particle fluxes are expressed as currents ${J}_{\alpha }^{(m)}$. The average rate associated with ${J}_{\alpha }^{(m)}$ in the limit $m\to \infty $ reads

Equation (9)

Furthermore, the diffusion coefficient associated with ${J}_{\alpha }^{(m)}$ is defined as

Equation (10)

An important current in stochastic thermodynamics is the entropy increase of the environment [2], which corresponds to the increments ${\alpha }_{{ij}}(t)=\mathrm{ln}\tfrac{{w}_{{ij}}(t)}{{w}_{{ji}}(t)}$. The average rate of entropy production is then given by

Equation (11)

The second equality follows from πi(t) = πi(t + τ) and from equation (1), which leads to ${\partial }_{t}{\pi }_{i}+{\sum }_{j\ne i}{{ \mathcal J }}_{{ij}}=0$.
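Assuming that the average rate in equation (9) is the period average of ${\sum }_{i\lt j}{\alpha }_{{ij}}(t){{ \mathcal J }}_{{ij}}(t)$, the entropy production rate of equation (11) can be estimated numerically with the increments ${\alpha }_{{ij}}(t)=\mathrm{ln}[{w}_{{ij}}(t)/{w}_{{ji}}(t)]$. The sketch below reuses ring_rates and the array pi from the sketch in section 2.1 and is our own illustration, not taken from the article.

```python
import numpy as np

def entropy_production_rate(rates, pi, tau):
    """sigma = (1/tau) * int_0^tau dt sum_{i<j} J_ij(t) * ln(w_ij(t)/w_ji(t))."""
    steps, Omega = pi.shape
    dt = tau / steps
    sigma = 0.0
    for s in range(steps):
        w = rates(s * dt)
        for i in range(Omega):
            for j in range(i + 1, Omega):
                if w[i, j] > 0.0 and w[j, i] > 0.0:
                    J_ij = pi[s, i] * w[i, j] - pi[s, j] * w[j, i]
                    sigma += J_ij * np.log(w[i, j] / w[j, i]) * dt
    return sigma / tau

sigma = entropy_production_rate(ring_rates, pi, tau=1.0)   # non-negative
```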

2.2. Large deviations

The rate function from large deviation theory quantifies exponentially rare events in the long time limit [16–19]. It is defined through the relation

Equation (12)

where the symbol ∼ means asymptotic equality in the limit $m\to \infty $ and ${J}_{\alpha }^{(m)}\approx x$ means that ${J}_{\alpha }^{(m)}$ lies in an infinitesimal interval around x. Our main result is a parabola that bounds ${I}_{\alpha }(x)$, which is a convex function, from above. This parabola depends on an average rate. For the known parabolic bound for stationary states from [14, 15], this rate is the average rate of entropy production σ in equation (11). In our bound for periodically driven systems, this rate is, in general, different from σ.

Current fluctuations can also be characterized by the scaled cumulant generating function

Equation (13)

where z is a real number. The cumulants associated with ${J}_{\alpha }^{(m)}$ can be obtained as derivatives of λα(z) at z = 0. The scaled cumulant generating function λα(z) is a Legendre–Fenchel transform of the rate function Iα(x), i.e.

Equation (14)

If a parabola bounds ${I}_{\alpha }(x)$ from above then a corresponding parabola, which can be determined from equation (14), bounds λα(z) from below. For illustrations of our results we perform calculations of λα(z) using known methods [40, 50].
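The Legendre–Fenchel transform in equation (14) is straightforward to evaluate numerically once the rate function is known on a grid. The following toy sketch (our own, not from the article) recovers the exact transform of a parabolic rate function, $I(x)=(x-{ \mathcal J })^{2}/(4D)\mapsto \lambda (z)={ \mathcal J }z+{{Dz}}^{2}$, up to grid resolution.

```python
import numpy as np

def legendre_fenchel(x_grid, I_values, z):
    """lambda(z) = sup_x [ z*x - I(x) ], evaluated on the supplied grid."""
    return np.max(z * x_grid - I_values)

x = np.linspace(-10.0, 10.0, 20001)
J, D = 1.0, 0.5
I = (x - J) ** 2 / (4.0 * D)
lam = legendre_fenchel(x, I, z=0.3)   # close to J*0.3 + D*0.3**2 = 0.345
```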

2.3. Stochastic protocol

We also consider the case of an external protocol that is stochastic [40, 49, 50]. In order to mimic a deterministic periodic protocol, this stochastic protocol is cyclic and has N states. The transition rate from state i to state j with the external protocol in state n = 0, 1, ..., N − 1 is denoted by wijn. The transition rate for a change in the external protocol from state n to state $n+1\,\mathrm{mod}\,N$ is γ, whereas the transition rate for the reversed transition is 0. Consider a deterministic periodic protocol characterized by the rates wij(t) and the period τ. If the rates of the model with a stochastic protocol are ${w}_{{ij}}^{n}={w}_{{ij}}(t=n\tau /N)$ and γ = N/τ, then in the limit of $N\to \infty $, current fluctuations for the stochastic protocol become equal to current fluctuations for the deterministic protocol [50]. Hence, the deterministic protocol corresponds to an asymptotic limit of a stochastic protocol. We point out that we do not consider the cost of the external protocol [51].

In the appendix, we derive bounds on current fluctuations for the case of a stochastic protocol. These derivations are similar to the derivation in section 4 for a deterministic periodic protocol. An advantage of models with a stochastic protocol is that they are Markov processes with time-independent transition rates, which can simplify the exact evaluation of the scaled cumulant generating function in equation (13), as explained in [40]. Whereas the expressions in the main text are for the case of a deterministic protocol, the expressions for a stochastic protocol can be obtained from these expressions for a deterministic protocol by making the substitution ${\tau }^{-1}{\int }_{0}^{\tau }{\rm{d}}{t}\to {N}^{-1}{\sum }_{n}$, as explained in the appendix.
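Because the process with a stochastic protocol has time-independent rates on the enlarged space of states (i, n), the scaled cumulant generating function in equation (13) can be obtained as the largest eigenvalue of a tilted generator in which every counted transition is weighted by ${{\rm{e}}}^{z{\alpha }_{{ij}}^{n}}$. The sketch below is a generic version of this standard calculation; the concrete implementation in [40] may differ in detail, and the function name and array layout are ours.

```python
import numpy as np

def scgf(w, gamma, alpha, z):
    """Largest eigenvalue of the tilted generator.
    w[n, i, j]: internal rates with the protocol in state n;
    gamma: rate for the protocol jump n -> n+1 mod N (not counted);
    alpha[n, i, j]: anti-symmetric increments of the counted current."""
    N, Omega = w.shape[0], w.shape[1]
    idx = lambda i, n: n * Omega + i              # flatten (i, n) into one index
    L = np.zeros((N * Omega, N * Omega))
    for n in range(N):
        for i in range(Omega):
            for j in range(Omega):
                if i != j and w[n, i, j] > 0.0:
                    L[idx(j, n), idx(i, n)] += w[n, i, j] * np.exp(z * alpha[n, i, j])
                    L[idx(i, n), idx(i, n)] -= w[n, i, j]
            L[idx(i, (n + 1) % N), idx(i, n)] += gamma   # protocol jump, untilted
            L[idx(i, n), idx(i, n)] -= gamma
    return np.max(np.linalg.eigvals(L).real)
```

The average current and the diffusion coefficient then follow from finite-difference derivatives of scgf at z = 0, as illustrated in section 2.4.3 below.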

2.4. Case studies

2.4.1. Colloidal particle driven by a time-periodic field

The first model in figure 1(a) is a biased random walk on a ring with Ω states driven by a time-periodic force $F(t)\equiv {F}_{0}\cos (2\pi t/\tau )$. A physical realization of this model is a charged colloid on a ring subjected to a time-periodic electrical field. We set the Boltzmann constant kB and the temperature T such that kBT = 1 throughout. The transition rate for a jump in the clockwise direction is ${k}_{+}(t)\equiv k{{\rm{e}}}^{F(t)/{\rm{\Omega }}}$ and the reversed transition rate is ${k}_{-}(t)\equiv k$. These transition rates satisfy the generalized detailed balance relation [2]. The current we consider is the net number of jumps in the clockwise direction per unit time. For this model, the scaled cumulant generating function in equation (13) can be calculated exactly [50].

Figure 1. Case studies. (a) Biased random walk with time-periodic force F(t). (b) Model for a molecular pump. The red square represents the energy E1, the blue hexagon represents the energy E2, and the magenta circle represents the energy E3. The red solid bar represents the energy barrier B1, the blue dashed bar represents the energy barrier B2, and the dotted magenta bar represents the energy barrier B3. The green arrows represent transitions that change the state of the protocol. (c) Representation of the network of states of the model with 4 states and two independent thermodynamic forces that depend on the state of the external protocol n.

2.4.2. Molecular pump

The other two models are driven by a stochastic protocol. The model illustrated in figure 1(b) is a molecular pump with Ω = 3. This model has been introduced in [40]. The external protocol changes energies and energy barriers between states, which can lead to net rotation in the ring with three states. The number of states of the external protocol is N = 3. The states of the external protocol are denoted by 0, 1, 2, which correspond, respectively, to the top left circle, the top right circle and the bottom circle in figure 1(b). In this model, the energies and energy barriers are rotated in the clockwise direction by one step if a jump (with rate γ) that changes the state of the protocol takes place. The energies are denoted by E1, E2, and E3, whereas the energy barriers are denoted by B1, B2, and B3. The internal transition rates are given by

Equation (15)

for j = i + 1, and

Equation (16)

for j = i − 1, where we assume periodic boundary conditions. An important property of molecular pumps is that the thermodynamic force is zero for any state n of the external protocol. This physical condition is manifested in the following restriction on the transition rates

Equation (17)

The current we consider is the net number of jumps in the clockwise direction per unit time. The scaled cumulant generating function in equation (13) associated with this current can be calculated from the eigenvalue of a modified generator, as shown in [40].

2.4.3. Enzymatic reaction with stochastic substrate concentrations

The model illustrated in figure 1(c) is a model with Ω = 4 and two independent thermodynamic forces F1n and F2n, which depend on the state of the external protocol n. This model can be interpreted as an enzyme that can consume two different substrates and produce one product [9]. The two enzymatic cycles are $E+{S}_{1}\to {{ES}}_{1}\to {EP}\to E+P$ and $E+{S}_{2}\to {{ES}}_{2}\to {EP}\to E+P$, where E is the enzyme, P is the product, S1 is one substrate, and S2 is another substrate. State 1 corresponds to the free enzyme E, state 2 corresponds to ES1, state 3 corresponds to ES2, and state 4 corresponds to EP. The external control of the concentrations of the substrates S1 and S2 generates thermodynamic forces that depend on n. The number of states of the external protocol is N = 2. The generalized detailed balance relation for this model reads

Equation (18)

The thermodynamic forces change between two values of the same modulus and different sign stochastically, i.e., F1n is given by ${F}_{1}^{0}={F}_{1}$ and ${F}_{1}^{1}=-{F}_{1}$, whereas F2n is given by ${F}_{2}^{0}={F}_{2}$ and ${F}_{2}^{1}=-{F}_{2}$.

The transition rate for a change of the external protocol is γ. The transition rates are set to ${w}_{12}^{n}=k{{\rm{e}}}^{{F}_{1}^{n}/2},{w}_{13}^{n}=k{{\rm{e}}}^{{F}_{2}^{n}/2},{w}_{14}^{n}=k$, ${w}_{21}^{n}=k,{w}_{24}^{n}=k{{\rm{e}}}^{{F}_{1}^{n}/2},{w}_{31}^{n}=k,{w}_{34}^{n}=k{{\rm{e}}}^{{F}_{2}^{n}/2}$, ${w}_{41}^{n}=k,{w}_{42}^{n}=k$, and ${w}_{43}^{n}=k$. The current we consider is the elementary current from state 1 to state 2, which corresponds to the net number of S1 molecules that have been consumed per unit time. As in the case of the previous model, the scaled cumulant generating function in equation (13) can be calculated with the method explained in [40].
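Since the transition rates of this model are given explicitly above, the model is easy to set up numerically. The sketch below (our own construction) builds the rates for the two protocol states, defines increments that count net jumps from state 1 to state 2, and extracts the average current and diffusion coefficient from finite differences of the function scgf introduced in section 2.3.

```python
import numpy as np

def enzyme_rates(F1=2.0, F2=0.5, k=1.0):
    """w[n, i, j] for n = 0, 1; states 1..4 of the text are indices 0..3."""
    w = np.zeros((2, 4, 4))
    for n, s in enumerate([+1.0, -1.0]):          # F_1^n = +-F1, F_2^n = +-F2
        f1, f2 = s * F1, s * F2
        w[n, 0, 1] = k * np.exp(f1 / 2); w[n, 1, 0] = k
        w[n, 0, 2] = k * np.exp(f2 / 2); w[n, 2, 0] = k
        w[n, 0, 3] = k;                  w[n, 3, 0] = k
        w[n, 1, 3] = k * np.exp(f1 / 2); w[n, 3, 1] = k
        w[n, 2, 3] = k * np.exp(f2 / 2); w[n, 3, 2] = k
    return w

alpha = np.zeros((2, 4, 4))
alpha[:, 0, 1], alpha[:, 1, 0] = 1.0, -1.0        # count net jumps 1 -> 2

w, gamma, h = enzyme_rates(), 0.1, 1e-3
J_alpha = (scgf(w, gamma, alpha, h) - scgf(w, gamma, alpha, -h)) / (2 * h)
D_alpha = (scgf(w, gamma, alpha, h) - 2 * scgf(w, gamma, alpha, 0.0)
           + scgf(w, gamma, alpha, -h)) / (2 * h * h)
```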

3. Main results

In this section we discuss our main results for currents with time-independent increments αij(t) = αij, which include the case of currents generated in a molecular pump. For time-independent increments, the results acquire a simpler form with a more direct physical interpretation. In section 4, we present proofs of more general results, which, inter alia, also hold for currents with time-dependent increments. Physical examples of currents with time-dependent increments include the heat and work currents in heat engines (see [49] for general definitions of these currents). The general features of our main results presented in this section are the same irrespective of whether the protocol is deterministic or stochastic, which is discussed in the appendix.

3.1. Global bound

The parabolic bound on the rate function is

Equation (19)

where

Equation (20)

and

Equation (21)

The inequality ${\sigma }^{* }\geqslant 0$ comes from the fact that, for fixed t, every term in the sum ${\sum }_{i\lt j}$ in equation (20) is non-negative. In general, the average rate ${\sigma }^{* }$ is different from the thermodynamic rate of entropy production σ in equation (11). Furthermore, there is no simple inequality relating both quantities, as illustrated in figure 2(b).

Figure 2. Illustration of the bound. (a) The function ${\tilde{\lambda }}_{\alpha }(\tilde{z})$ in equation (24) for the models from figure 1, as indicated in the legends, compared to the lower bound $\tilde{z}(1+\tilde{z})$. The parameters for the model represented in figure 1(a) are set to F0/Ω = 2 and k = τ = 1. The parameters for the model represented in figure 1(b) are set to E1 = E3 = B1 = B2 = 0, E2 = 2, B3 = 5, and γ = 1/10. The parameters for the model represented in figure 1(c) are set to F1 = 2, F2 = 1/2, k = 1, and γ = 1/10. (b) Comparison between the rate of entropy production σ, the rate ${\sigma }^{* }$, and the rate $\tilde{\sigma }$, for the model in figure 1(b) with parameters ${E}_{2}=2,{E}_{3}=-5,{B}_{1}=-5,{B}_{2}=2,{B}_{3}=0$, and $\gamma ={{\rm{e}}}^{2}$. The parameter E1 is the variable on the horizontal axis.

For the case of time-independent transition rates ${w}_{{ij}}(t)={w}_{{ij}}$, we have ${\sigma }^{* }=\sigma $ and the bound (19) becomes

Equation (22)

This bound is the known parabolic bound for time-independent transition rates proved in [15]. Hence, equation (19) constitutes a generalization of this parabolic bound to periodically driven systems.

In terms of the scaled cumulant generating function, the bound in equation (19) is written as

Equation (23)

where we used equation (14). The universality of our result is illustrated in figure 2(a). There we compare the function

Equation (24)

where $\tilde{z}\equiv z{{ \mathcal J }}_{\alpha }/{\sigma }^{* }$, for the models in figure 1, with the lower bound $\tilde{z}(1+\tilde{z})$. This bound, or the bound in equation (19), is a particular case of two bounds, one derived in section 4.1 and the other derived in section 4.5.

3.2. Trade-off between speed and precision

Taking the second derivative of Iα(x) at $x={{ \mathcal J }}_{\alpha }$, we obtain the diffusion coefficient Dα defined in equation (10) as

Equation (25)

The inequality in equation (19), and the fact that this inequality is saturated at $x={{ \mathcal J }}_{\alpha }$, lead to the following bound on Dα,

Equation (26)

In section 4.3, we derive a local quadratic bound on ${I}_{\alpha }(x)$, which is valid for x close to the average ${{ \mathcal J }}_{\alpha }$. This local bound, together with equation (25), gives a tighter bound on Dα that reads

Equation (27)

where

Equation (28)

The second inequality in equation (27) is a consequence of ${\sigma }^{* }\geqslant \tilde{\sigma }$, which follows from the inequality

Equation (29)

where a and b are positive. An inequality similar to ${\sigma }^{* }\geqslant \tilde{\sigma }$ has been considered in [52]. We point out that there is no general inequality between the entropy production σ and the rate $\tilde{\sigma }$, as illustrated in figure 2(b).

Rearranging the terms in equation (27), we write the following universal trade-off relation between speed and precision for periodically driven systems,

Equation (30)

where ${{ \mathcal F }}_{\alpha }\equiv 2{D}_{\alpha }/{{ \mathcal J }}_{\alpha }$ is the Fano factor. The Fano factor characterizes the precision associated with ${J}_{\alpha }^{(m)}$, whereas ${{ \mathcal J }}_{\alpha }$ quantifies the speed. In periodically driven systems, a current with small fluctuations, as characterized by a small Fano factor ${{ \mathcal F }}_{\alpha }$, can only be as fast as $\tilde{\sigma }{{ \mathcal F }}_{\alpha }/2$.

This trade-off relation is a generalization of the thermodynamic uncertainty relation to periodically driven systems. In particular, for the case of time-independent transition rates wij(t) = wij, inequality (30) implies the thermodynamic uncertainty relation ${{ \mathcal F }}_{\alpha }^{-1}{{ \mathcal J }}_{\alpha }\leqslant \sigma /2$, since ${\sigma }^{* }=\sigma $ for this case. Furthermore, the inequality ${{ \mathcal F }}_{\alpha }^{-1}{{ \mathcal J }}_{\alpha }\leqslant \tilde{\sigma }/2$, for time-independent transition rates, provides an even tighter bound than the thermodynamic uncertainty relation.

This result is relevant for the mapping, proposed in [42], between an artificial molecular pump and a system driven by a fixed thermodynamic force, such as a biological molecular motor, which is modelled with time-independent transition rates that lead to a nonequilibrium stationary state. With this mapping, one can construct a molecular pump that mimics a stationary state and vice versa, in the sense that both the average rate of entropy production and the average elementary currents between a pair of states are conserved. However, a mapping of a molecular pump onto a stationary state that also preserves fluctuations is not always possible, since a molecular pump may not fulfill the relation ${{ \mathcal J }}_{\alpha }^{2}/(2{D}_{\alpha }\sigma )\leqslant 1/2$, as shown in [40], whereas a system that reaches a nonequilibrium stationary state must fulfill this relation.

Our trade-off relations do not imply the generalization of the thermodynamic uncertainty relation from [29] for the case of periodic protocols that are symmetric, i.e., $w(\tau /2+{\rm{\Delta }}t)=w(\tau /2-{\rm{\Delta }}t)$, where $0\leqslant {\rm{\Delta }}t\leqslant \tau /2$. The trade-off relation from this reference involves the thermodynamic entropy production σ and for symmetric protocols the rate σ is, in general, different from the rates ${\sigma }^{* }$ and $\tilde{\sigma }$.

3.3. Discussion of the bounds

In figure 3(a), we show plots of ${R}_{\alpha }\equiv {{ \mathcal J }}_{\alpha }^{2}/(2{D}_{\alpha }{\sigma }^{* })\leqslant 1/2$ as a function of the rate γ, which quantifies the speed of the protocol, for the models illustrated in figures 1(b) and 1(c). For the first model, which is a molecular pump, we find that this bound is saturated if the transitions of the protocol are much slower than the internal transitions associated with changes of the state of the system. For this model, in this limit the bound is saturated independently of the values of the energies and energy barriers. However, for the second model the bound is not saturated in this limit.

Figure 3. Illustration of the trade-off relation. (a) The ratio ${R}_{\alpha }\equiv {{ \mathcal J }}_{\alpha }^{2}/(2{D}_{\alpha }{\sigma }^{* })\leqslant 1/2$ as a function of the rate γ for jumps of the protocol. We have analyzed the model illustrated in figure 1(b) with parameters E1 = 1, B1 = 5, and E2 = E3 = B2 = B3 = 0, and the model illustrated in figure 1(c) with parameters ${F}_{1}={F}_{2}=k=1$. (b) The ratio ${R}_{\alpha }\equiv {{ \mathcal J }}_{\alpha }^{2}/(2{D}_{\alpha }{\sigma }^{* })\leqslant 1/2$ as a function of F1 for the model illustrated in figure 1(c) with parameters k = γ = 1 and two values of F2.

In figure 3(b), we show plots of ${R}_{\alpha }\equiv {{ \mathcal J }}_{\alpha }^{2}/(2{D}_{\alpha }{\sigma }^{* })\leqslant 1/2$ for the model illustrated in figure 1(c). The quantity in the horizontal axis is the thermodynamic force F1. For this model, the bound is saturated for F1 small and the other thermodynamic force F2 = 0. This saturation of the bound is similar to the saturation of the bound for stationary states known as thermodynamic uncertainty relation, which happens in the linear response regime [9].

Let us comment on the rate ${\sigma }^{* }$ that we have introduced here. Its physical interpretation is that ${\sigma }^{* }$, and not the rate of entropy production σ, provides a bound on the whole spectrum of fluctuations for any current (with time-independent increments) in a generic periodically driven system arbitrarily far from equilibrium. In terms of the trade-off relation from equation (30), ${\sigma }^{* }$ (and also $\tilde{\sigma }$) provides a limit on how precise and fast a thermodynamic current can be. The rate of entropy production σ quantifies the energetic cost of sustaining the operation of the nonequilibrium system. Interestingly, for time-independent transition rates corresponding to a system driven by a fixed thermodynamic force, ${\sigma }^{* }=\sigma $ is a rate that has both physical properties, i.e., it bounds current fluctuations and quantifies energetic cost.

3.4.  ${\sigma }^{* }$ as the entropy production of a nonequilibrium stationary state

The rate ${\sigma }^{* }$ of the original periodically driven system can be interpreted as the rate of entropy production associated with the stationary state of an auxiliary Markov process with time-independent transition rates that are determined by time-averaged quantities associated with the original system. These time-averaged quantities are ${\bar{{ \mathcal J }}}_{{ij}}$, defined in equation (21), and

Equation (31)

Both quantities are anti-symmetric, i.e., ${\bar{{ \mathcal J }}}_{{ij}}=-{\bar{{ \mathcal J }}}_{{ji}}$ and ${\theta }_{{ij}}=-{\theta }_{{ji}}$. Moreover, from the definition in equation (31), ${\bar{{ \mathcal J }}}_{{ij}}$ and θij have the same sign. We assume without loss of generality that ${\bar{{ \mathcal J }}}_{{ij}}$ and θij are non-negative.

From equation (20), ${\sigma }^{* }$ can be written as ${\sigma }^{* }={\sum }_{i\lt j}{\bar{{ \mathcal J }}}_{{ij}}{\theta }_{{ij}}$. The transition rates associated with this auxiliary process are denoted by rij and the stationary distribution associated with this process is denoted by pi. The stationary probability currents of this auxiliary process are the time-averaged currents ${\bar{{ \mathcal J }}}_{{ij}}$, hence, we have the constraint

Equation (32)

Furthermore, if we impose

Equation (33)

then the rate of entropy production of the auxiliary process is ${\sigma }^{* }$, i.e., ${\sigma }^{* }={\sum }_{i\lt j}{\bar{{ \mathcal J }}}_{{ij}}\mathrm{ln}({r}_{{ij}}/{r}_{{ji}})$. From the conditions in equations (32) and (33), we obtain

Equation (34)

The reversed rate rji is then given by

Equation (35)

Equation (34) defines a class of stationary states that have entropy production ${\sigma }^{* }$. Since transition rates are non-negative, the stationary probability must satisfy the constraint ${{\rm{e}}}^{{\theta }_{{ij}}}{p}_{i}-{p}_{j}\geqslant 0$. One possible stationary probability that fulfills this constraint for any model is the uniform distribution ${p}_{i}=1/{\rm{\Omega }}$ for i = 1, 2, ..., Ω, since θij ≥ 0.
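Although equations (32)–(35) are not reproduced here, the algebra behind them can be reconstructed from the two stated conditions, namely that the stationary currents of the auxiliary process equal ${\bar{{ \mathcal J }}}_{{ij}}$ and that $\mathrm{ln}({r}_{{ij}}/{r}_{{ji}})={\theta }_{{ij}}$. Under these assumptions one finds ${r}_{{ji}}={r}_{{ij}}{{\rm{e}}}^{-{\theta }_{{ij}}}$ and therefore ${r}_{{ij}}({p}_{i}-{{\rm{e}}}^{-{\theta }_{{ij}}}{p}_{j})={\bar{{ \mathcal J }}}_{{ij}}$, i.e., ${r}_{{ij}}={\bar{{ \mathcal J }}}_{{ij}}{{\rm{e}}}^{{\theta }_{{ij}}}/({{\rm{e}}}^{{\theta }_{{ij}}}{p}_{i}-{p}_{j})$, which is non-negative precisely when ${{\rm{e}}}^{{\theta }_{{ij}}}{p}_{i}-{p}_{j}\geqslant 0$, the constraint stated above.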

We can now provide the following physical interpretation for ${\sigma }^{* }$. This rate quantifies the thermodynamic cost to maintain a nonequilibrium stationary state that is determined by the transition rates in equation (34). There are different stationary probabilities that fulfill equation (34), hence, this nonequilibrium stationary state is not unique but rather a class of nonequilibrium stationary states. The network topology of this class of nonequilibrium stationary states is the same as the network topology of the periodically driven system; furthermore, the stationary currents are the same as the time-averaged currents of the periodically driven system. As an example, consider a colloidal particle driven by an external periodic protocol, such as the model represented in figure 1(b). For such a molecular pump we can think of a colloidal particle driven by a fixed force that reaches a nonequilibrium stationary state. The force that drives this particle and the specific transition rates that determine its dynamics are obtained from time-averaged quantities associated with the original molecular pump. The rate ${\sigma }^{* }$ quantifies the energetic cost of driving the colloidal particle with such a fixed force.

4. General bounds

In this section we derive the bounds that imply the results discussed in section 3. We obtain two global bounds that imply the global bound in equation (19), the first one is given in equation (52) and the second one is given in equation (72). We also derive a local bound that leads to the inequality in equation (61), which generalizes the trade-off relation in equation (30).

4.1. First global bound

In our proof we use the theory of level 2.5 large deviations for periodically driven systems developed in [48]. At level 2.5, the joint distribution of all empirical densities defined in equation (3) and all empirical currents defined in equation (5) is considered. In our notation ρ(t) represents a vector with the empirical densities that has dimension Ω and J(t) is a vector with the empirical currents that has dimension M, where M is the number of unordered pairs of states with non-zero transition rates. The advantage of considering this level of large deviations is that the rate function can be calculated exactly as

Equation (36)

where

Equation (37)

Equation (38)

and

Equation (39)

Note that the quantities G and a depend on the empirical density ρ. The empirical density and current in equation (36) fulfill the constraint

Equation (40)

for all states i. To simplify the notation we write ${I}_{2.5}^{{\rm{cur}}}[J(t),\rho (t)]$ instead of the lhs of equation (36).

The name level 2.5 large deviations can also refer to the rate function associated with the joint probability of the empirical density and the empirical flow defined in equation (4). The rate function with the empirical current can be obtained from the rate function with the empirical flow [48].

An important technique in large deviation theory is the so-called contraction [16–19], with which the rate function associated with a coarse-graining of the number of variables can be obtained from the original rate function. Hence, the rate function for an arbitrary current Jα can be obtained from a contraction of ${I}_{2.5}^{{\rm{cur}}}[J(t),\rho (t)]$, which leads to the expression

Equation (41)

where J(t) and ρ(t) are such that they fulfill equation (40) and the relation

Equation (42)

In particular, this relation leads to the inequality

Equation (43)

where $\tilde{G}$ and $\tilde{a}$ are functions of $\tilde{\rho }$ as in (37) and (38). This inequality is valid for any pair of vectors that fulfill the constraints

Equation (44)

for all states i, and

Equation (45)

The inequality [15]

Equation (46)

together with equation (43), leads to

Equation (47)

We are now left with the problem of finding a judicious choice of $(\tilde{J}(t),\tilde{\rho }(t))$ that fulfills the constraints in equation (44) and in equation (45). One such choice is

Equation (48)

Equation (49)

where

Equation (50)

The time-independent parameters Kij are anti-symmetric, i.e., Kij = −Kji, and satisfy

Equation (51)

for all states i. Using this choice in equation (47), we obtain

Equation (52)

where

Equation (53)

and

Equation (54)

The global bound in equation (52), together with equation (25), leads to

Equation (55)

4.2. Role of the parameter K

4.2.1. Generic choice for K

Due to the constraint in equation (51), Kij can be seen as the current of some auxiliary Markov process with time-independent transition rates in the stationary state. A natural choice of Kij is the time-averaged probability current defined in equation (21), i.e.,

Equation (56)

For this choice

Equation (57)

and ${\sigma }_{K}^{* }={\sigma }^{* }$, where ${\sigma }^{* }$ is defined in equation (20). For currents with time-independent increments αij(t) = αij, we obtain ${\sum }_{i\lt j}{\bar{{ \mathcal J }}}_{{ij}}{\bar{\alpha }}_{{ij}}={{ \mathcal J }}_{\alpha }$, where ${{ \mathcal J }}_{\alpha }$ is given by equation (9), and the bound in equation (52) becomes the bound in equation (19). For currents with time-dependent increments, which include the rate of extracted work and the rate of heat flow in a heat engine driven by periodic temperature variation, the rate ${{ \mathcal J }}_{K}$ in equation (57) is, in general, different from the average current ${{ \mathcal J }}_{\alpha }$.

4.2.2. Other possible choices for K

The freedom of choice for the parameter K depends on the network of states of the Markov process, with equation (51) limiting the number of independent currents Kij [53]. For instance, for the unicyclic model in figure 1(a), there is just one independent current and Kij is the same for all pairs of states. In this case, the ratio ${\sigma }_{K}^{* }/{{ \mathcal J }}_{K}^{2}$ becomes independent of K and, therefore, there is only one bound in equation (52) regardless of the value of Kij. We note that the same argument about the freedom of choice for the parameter K applies to stochastic protocols, as is the case of the model in figure 1(b).

If we consider a model with the network of states shown in figure 1(c), then there are two independent Kij and different choices for these parameters can lead to different bounds in equation (52). Two particularly appealing choices for the parameter K are the choices that conserve the rate of entropy production or the average current in equation (52). The first choice corresponds to a K that fulfills the relation ${\sigma }_{K}^{* }=\sigma $ and the second choice corresponds to a K that fulfills the relation ${{ \mathcal J }}_{K}={{ \mathcal J }}_{\alpha }$. Whether it is possible to set K in such a way that one of these relations is fulfilled is a question that depends on the model (or class of models) at hand.
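A quick way to see how many independent parameters Kij a given network allows is to note that the constraint in equation (51) is Kirchhoff's current law, so the independent Kij span the cycle space of the network, whose dimension for a connected graph is the number of edges minus the number of states plus one [53]. The sketch below (our own, with edge lists read off from the rates of the models) reproduces the counting quoted in the text: one independent current for the ring of figure 1(a) and two for the network of figure 1(c).

```python
def independent_currents(num_states, edges):
    # cycle-space dimension of a connected graph: |E| - |V| + 1
    return len(edges) - num_states + 1

ring = [(1, 2), (2, 3), (3, 1)]                       # figure 1(a): unicyclic
enzyme = [(1, 2), (1, 3), (1, 4), (2, 4), (3, 4)]     # figure 1(c)
print(independent_currents(3, ring), independent_currents(4, enzyme))   # 1 2
```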

4.3. Local bound

We now derive a local quadratic bound on Iα(x) that leads to the first inequality in equation (27). For fixed a and G, a Taylor expansion of the function ψ(J, G, a) in J around the value G leads to

Equation (58)

Applying this Taylor expansion to equation (43) with $\tilde{\rho }$ and $\tilde{J}$ given by (48) and (49), respectively, we obtain the local bound

Equation (59)

where ${{ \mathcal J }}_{K}$ is defined in equation (53) and

Equation (60)

The local bound in equation (59) together with equation (25) leads to

Equation (61)

A generic model-independent choice for K is the one given in equation (56), i.e., ${K}_{{ij}}={\bar{{ \mathcal J }}}_{{ij}}$. If, in addition, the increments are time-independent, the bound in equation (61) becomes the trade-off relation between speed and precision in equation (30). We recall that from equation (29), ${\sigma }_{K}^{* }\geqslant {\tilde{\sigma }}_{K}$, thus, the bound in equation (61) is stronger than the bound in equation (55).

4.4. Bounds for time-independent transition rates

Here, we stress that the bounds for time-periodic transition rates derived above imply new bounds for the case of time-independent transition rates that lead to a nonequilibrium stationary state. For time-independent transition rates, and for currents with time-independent increments, the terms in equation (52) become

Equation (62)

and

Equation (63)

Hence, from equation (52) we have the bound

Equation (64)

For ${K}_{{ij}}={{ \mathcal J }}_{{ij}}$, equation (64) becomes the known parabolic bound for stationary states from [14, 15]. Furthermore, for time-independent transition rates equation (61) becomes

Equation (65)

where

Equation (66)

This bound is tighter than the bound on the diffusion coefficient that follows from equation (64). For the case ${K}_{{ij}}={{ \mathcal J }}_{{ij}}$, equation (65) becomes an even stronger bound than the thermodynamic uncertainty relation, as discussed in section 3.

4.5. Second global bound

We can obtain a bound different from the global bound in equation (52) by considering a choice for ${\tilde{J}}_{{ij}}(t)$ that is different from the one in equation (49). We write the stationary distribution of a master equation with frozen transition rates wij(t) as μi(t). This quantity is known as accompanying density [54]. Due to the periodicity of wij(t) we have μi(t) = μi(t + τ). We consider the bound in equation (47) with ${\tilde{\rho }}_{i}(t)={\pi }_{i}(t)$ and

Equation (67)

where c1(t) and c2(t) are time-periodic functions, Kij is anti-symmetric and fulfills the relation in equation (51), and

Equation (68)

Since ${\sum }_{j\ne i}{M}_{{ij}}(t)=0$, which comes from the definition of the accompanying density μi(t), this choice fulfills the constraint in equation (44). Setting ${K}_{{ij}}={\bar{{ \mathcal J }}}_{{ij}}$, c1(t) = c1, and c2(t) = c2, the constraint in equation (45) applied to the choice in equation (67), leads to

Equation (69)

Equation (70)

where q is an arbitrary real number and

Equation (71)

The bound in equation (47) then becomes

Equation (72)

Minimization over the single parameter q gives the tightest bound on the large deviation function. For q = 0 we obtain the bound in equation (52) with ${K}_{{ij}}={\bar{{ \mathcal J }}}_{{ij}}$. However, for q = 1 we obtain a bound that cannot be obtained from equation (52), which reads

Equation (73)

where

Equation (74)

5. Conclusion

The thermodynamic uncertainty relation and the parabolic bound on current fluctuations that generalizes it constitute major recent developments in stochastic thermodynamics. They are valid for Markov processes with time-independent transition rates that reach a stationary state, which describe systems driven by fixed thermodynamic forces. We have generalized these bounds to periodically driven systems. Similar to the bound for stationary states, we obtained a bound that depends on the single average rate ${\sigma }^{* }$ and on the average current. However, for periodically driven systems this average rate is, in general, different from the thermodynamic entropy production σ. These rates have two essential physical properties: while σ quantifies the energetic cost of maintaining the system out of equilibrium, ${\sigma }^{* }$ provides a generic limit on current fluctuations.

The quite high degree of universality of our results is encouraging with respect to possible applications. For instance, we have found a trade-off relation between speed and precision in periodically driven systems for currents that have time-independent increments. Physically, such a relation tells us that if one wants to generate net motion in an artificial molecular pump driven by an external periodic protocol, there is a universal limit on how fast and precise this net motion can be.

For the case of the thermodynamic uncertainty relation for stationary states, several applications have been proposed [10–13]. Figuring out how to extend these applications to periodically driven systems is an interesting direction for future work. One particular instance would be to extend the universal relation between power, efficiency and fluctuations from [12] to periodically driven heat engines. The more general bounds derived in section 4, which apply to time-dependent increments, might be important for these applications. Finally, good candidates for an experimental observation of the bounds we have derived here are periodically driven colloidal particles and artificial molecular pumps.

Appendix. Stochastic protocol

A.1. Mathematical definitions

The master equation for the model with a stochastic protocol reads

Equation (A.1)

where n − 1 = N − 1 for n = 0 and ${P}_{i}^{n}$ is the time-dependent distribution. The stationary distribution of state (i, n) is denoted by ${\pi }_{i}^{n}$. The stationary distribution of the state n of the protocol is given by ${\pi }^{n}\equiv {\sum }_{i}{\pi }_{i}^{n}=1/N$, which comes from the solution of the master equation (A.1) for the stationary distribution. The conditional probability for the system to be in state i given that the protocol is in state n is written as $\pi (i| n)={\pi }_{i}^{n}/{\pi }^{n}\,=\,N{\pi }_{i}^{n}$. Consider a time-periodic Markov process with rates wij(t) and period τ. If the transition rates fulfill the relation ${w}_{{ij}}^{n}={w}_{{ij}}(t=n\tau /N)$ and γ = N/τ, then, in the limit $N\to \infty $, $\pi (i| n)\to {\pi }_{i}(t)$ [40], where $n=[{tN}/\tau ]$ and $[\cdot ]$ denotes the integer part. Therefore, if we consider the average elementary current ${{ \mathcal J }}_{{ij}}^{n}\equiv {\pi }_{i}^{n}{w}_{{ij}}^{n}-{\pi }_{j}^{n}{w}_{{ji}}^{n}$ in the limit of $N\to \infty $, we obtain

Equation (A.2)

where $n=[{tN}/\tau ]$. This relation is important for the connection between the cases of deterministic and stochastic protocols.

A stochastic trajectory is denoted by ${({b}_{t})}_{0\leqslant t\leqslant {t}_{f}}$, where tf is the final time. Note that a state of the Markov process here is specified by the variable that determines the state of the system i and the variable that determines the state of the protocol n. The stochastic trajectory has a fluctuating number of jumps Nf, the time interval between two jumps is denoted Δ tk, with k = 0, 1, ..., Nf, and the state of the Markov process during the time interval Δ tk is denoted bk.

The empirical density of state (i, n), which is the fraction of time spent in this state, is defined as

Equation (A.3)

where ${\delta }_{{b}_{k},(i,n)}$ is the Kronecker delta between the state of the trajectory bk and the state (i, n). The notation here in the appendix is different from the notation in the main text for the case of a deterministic protocol. If we compare equation (A.3) with equation (3), we see that here the upper index in ${\rho }_{i}^{n}$ refers to the state of the stochastic protocol and plays the role of t in ${\rho }_{i}^{(m)}(t)$, for which the upper index m refers to the time interval of the stochastic trajectory. For a more compact notation we do not keep the dependence of the fluctuating quantities on the time interval tf.

The empirical current from state (i, n) to state (j, n) reads

Equation (A.4)

For the case of a stochastic protocol, we also consider the empirical flow (or unidirectional current) from state (i, n) to state (i, n + 1), where n + 1 = 0 for n = N − 1, which is defined as

Equation (A.5)

The average of this empirical flow in the stationary state is ${{ \mathcal C }}_{i}^{n}\equiv \langle {C}_{i}^{n}\rangle =\gamma {\pi }_{i}^{n}$.
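Because the enlarged process is a time-homogeneous Markov chain, its stationary distribution ${\pi }_{i}^{n}$ is the normalized null vector of the (untilted) generator, and the property ${\pi }^{n}={\sum }_{i}{\pi }_{i}^{n}=1/N$ quoted above can be checked directly. The sketch below is our own construction and reuses enzyme_rates from section 2.4.3.

```python
import numpy as np

def stationary_distribution(w, gamma):
    """Stationary distribution pi[n, i] of the enlarged process (i, n)."""
    N, Omega = w.shape[0], w.shape[1]
    idx = lambda i, n: n * Omega + i
    L = np.zeros((N * Omega, N * Omega))
    for n in range(N):
        for i in range(Omega):
            for j in range(Omega):
                if i != j and w[n, i, j] > 0.0:
                    L[idx(j, n), idx(i, n)] += w[n, i, j]
                    L[idx(i, n), idx(i, n)] -= w[n, i, j]
            L[idx(i, (n + 1) % N), idx(i, n)] += gamma
            L[idx(i, n), idx(i, n)] -= gamma
    vals, vecs = np.linalg.eig(L)
    p = np.real(vecs[:, np.argmin(np.abs(vals))])
    p = p / p.sum()                                   # Perron vector has one sign
    return p.reshape(N, Omega)

pi = stationary_distribution(enzyme_rates(), gamma=0.1)
print(pi.sum(axis=1))                                 # each entry close to 1/N = 0.5
```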

A generic fluctuating current is written as

Equation (A.6)

where ${\alpha }_{{ij}}^{n}=-{\alpha }_{{ji}}^{n}$ are the increments. If we compare this expression with equation (7), which is the expression for a deterministic protocol, we see that an integral over a period divided by the period τ for a deterministic protocol becomes a sum over n divided by the total number of states of the protocol N for a stochastic protocol. Note that the factor 1/N does not appear in front of the sum in the rhs of equation (A.6) due to equation (A.2). The average current in the stationary state reads

Equation (A.7)

The rate function associated with Jα is defined as

Equation (A.8)

where ∼ means asymptotic equality in the limit ${t}_{f}\to \infty $. The scaled cumulant generating function for a stochastic protocol is defined as

Equation (A.9)

These two quantities are related by a Legendre–Fenchel transform, as in equation (14).

Similar to equation (21) and equation (50) for a deterministic protocol, we define

Equation (A.10)

and

Equation (A.11)

respectively. Furthermore, we define

Equation (A.12)

which is equivalent to (53),

Equation (A.13)

which is equivalent to equation (54), and

Equation (A.14)

which is equivalent to equation (60). The parameter Kij in these equations is anti-symmetric, i.e., Kij = −Kji, and fulfills ${\sum }_{j\ne i}\,{K}_{{ij}}=0$ for all i.

A.2. Proofs of the bounds

We now consider the joint distribution of the vector of empirical densities ρ, the vector of empirical currents J, and the vector of the empirical flow C. The level 2.5 rate function [46] for this Markov process reads

Equation (A.15)

where

Equation (A.16)

and

Equation (A.17)

The quantities in this rate function fulfill the constraint

Equation (A.18)

for all i and n.

Applying a contraction to obtain Iα(x) from I2.5[J, C, ρ], as in equation (41) for a deterministic protocol, and setting ${\rho }_{i}^{n}={\pi }_{i}^{n}$ and ${C}_{i}^{n}=\gamma {\pi }_{i}^{n}$, we obtain

Equation (A.19)

where ${\tilde{J}}_{{ij}}^{n}$ fulfill the constraints

Equation (A.20)

and

Equation (A.21)

for all i and n.

The global bound on large deviations is obtained by setting

Equation (A.22)

and by using the inequality in equation (46). With these operations, equation (A.19) becomes

Equation (A.23)

which is the global bound for a stochastic protocol.

The choice in equation (A.22) and the Taylor expansion in equation (58), together with equation (A.19) lead to the local bound

Equation (A.24)

Using the relation (25) for the diffusion coefficient we obtain the bound

Equation (A.25)

The choice ${K}_{{ij}}={\bar{{ \mathcal J }}}_{{ij}}$ for a stochastic protocol leads to bounds similar to the bounds discussed in section 4.2.1 for a deterministic protocol.

A bound similar to the bound in equation (72) for a stochastic protocol can be obtained by setting ${\tilde{\rho }}_{i}^{n}={\pi }_{i}^{n}$ and

Equation (A.26)

where

Equation (A.27)

and ${\mu }_{i}^{n}$ is the solution of the stationary master equation ${\sum }_{j\ne i}({\mu }_{i}^{n}{w}_{{ij}}^{n}-{\mu }_{j}^{n}{w}_{{ji}}^{n})=0$. Defining

Equation (A.28)

and setting

Equation (A.29)

Equation (A.30)

leads to the fulfillment of the constraint in equation (A.20). With this choice for ${\tilde{\rho }}_{i}^{n}$ and ${\tilde{J}}_{{ij}}^{n}$, the bound in equation (A.19) becomes

Equation (A.31)

In particular, for q = 1 we obtain

Equation (A.32)

where

Equation (A.33)
