Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets

We introduce a family of Maxwellian Demons for which correlations among information-bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine Demon functional thermodynamic operating regimes, where previous methods either misclassify or simply fail due to the approximations they invoke. This reveals that these Demons are more functional than previous candidates. Like those, they behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of Thermodynamics that relies on the Kolmogorov-Sinai entropy for dynamical processes and not on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions and so it applies quite broadly---for Demons with and without memory and for input sequences that are correlated or not. We note that general Maxwellian Demons readily violate previously proposed alternative bounds, while the current bound still holds.


I. INTRODUCTION
The Second Law of Thermodynamics is only statistically true: while the entropy production in any process is nonnegative on average, ∆S ≥ 0, if we wait long enough, we shall see individual events for which the entropy production is negative. This is nicely summarized in the recent fluctuation theorem for the probability of entropy production ∆S [1][2][3][4][5][6][7]: Pr(∆S = −s)/Pr(∆S = +s) = e^{−s/k_B}, implying that negative entropy production events are exponentially rare but not impossible. Negative entropy fluctuations were known well before this modern formulation. In fact, in 1867 J. C. Maxwell used negative entropy fluctuations in a clever thought experiment, involving an imaginary intelligent being, later called Maxwell's Demon, that exploits fluctuations to violate the Second Law [8,9]. The Demon controls a small frictionless trapdoor on a partition inside a box of gas molecules to sort, without any expenditure of work, faster molecules to one side and slower ones to the other. This gives rise to a temperature gradient from an initially uniform system, a violation of the Second Law. Note that the "very observant and neat fingered" Demon's "intelligence" is necessary; a frictionless trapdoor connected to a spring acting as a valve, for example, cannot achieve the same feat [10]. Maxwell's Demon posed a fundamental challenge. Either such a Demon could not exist, even in principle, or the Second Law itself needed modification. A glimmer of a resolution came with L. Szilard's reformulation of Maxwell's Demon in terms of measurement and feedback control of a single-molecule engine. Critically, Szilard emphasized hitherto-neglected information-theoretic aspects of the Demon's operations [11]. Later, through the works of R. Landauer, O. Penrose, and C.
Bennett, it was recognized that the Demon's operation necessarily accumulates information and, for a repeating thermodynamic cycle, erasing this information has an entropic cost that ultimately compensates for the total amount of negative entropy production leveraged by the Demon to extract work [12][13][14]. In other words, with intelligence and information-processing capabilities, the Demon merely shifts the entropy burden temporarily to an information reservoir, such as its memory. The cost is repaid whenever the information reservoir becomes full and needs to be reset. This resolution is concisely summarized in Landauer's Principle [15]: the Demon's erasure of one bit of information at temperature T requires at least k_B T ln 2 of heat dissipation, where k_B is Boltzmann's constant. (While it does not affect the following directly, it has been known for some time that this principle is only a special case [16].) Building on this, a modified Second Law was recently proposed that explicitly addresses information processing in a thermodynamic system [17,18]: ⟨W⟩ ≤ k_B T ln 2 ∆H, where ∆H is the change in the information reservoir's configurational entropy over a thermodynamic cycle. This is the change in the reservoir's "information-bearing degrees of freedom" as measured using Shannon information H [19]. These degrees of freedom are coarse-grained states of the reservoir's microstates, the mesoscopic states that store information needed for the Demon's thermodynamic control. Importantly for the following, this Second Law assumes explicitly observed Markov system dynamics [17] and quantifies this relevant information only in terms of the distribution of instantaneous system microstates; not, to emphasize, microstate path entropies. In short, while the system's instantaneous distributions relax and change over time, the information reservoir itself is not allowed to build up and store memory or correlations.
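As a quick numerical illustration of Landauer's bound, the minimal heat cost k_B T ln 2 per erased bit can be evaluated directly; the temperature used below is our own illustrative choice of room temperature, not a value from the text:

```python
import math

kB = 1.380649e-23   # Boltzmann's constant, J/K (exact SI value)
T = 300.0           # illustrative room temperature, K

# Landauer's Principle: erasing one bit dissipates at least k_B * T * ln(2).
landauer_cost = kB * T * math.log(2)
print(landauer_cost)   # ≈ 2.87e-21 J per erased bit
```

At this scale the cost is tiny per bit, which is why it matters only for repeated, cyclic operation of the Demon's memory.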
Note that this framework differs from alternative approaches to the thermodynamics of information processing, including: (i) active feedback control by external means, where the thermodynamic account of the Demon's activities tracks the mutual information between measurement outcomes and system state [20][21][22][23][24][25][26][27][28][29][30][31][32][33]; (ii) the multipartite framework where, for a set of interacting, stochastic subsystems, the Second Law is expressed via their intrinsic entropy production, correlations among them, and transfer entropy [34][35][36][37]; and (iii) steady-state models that invoke time-scale separation to identify a portion of the overall entropy production as an information current [38,39]. A unified approach to these perspectives was attempted in Refs. [40][41][42].
Recently, Maxwellian Demons have been proposed to explore plausible automated mechanisms that appeal to Eq. (2)'s modified Second Law to do useful work, by decreasing the physical entropy, at the expense of a positive change in reservoir Shannon information [39,[43][44][45][46][47][48]. Paralleling the modified Second Law's development and the analyses of the alternatives above, they too neglect correlations in the information-bearing components and, in particular, the mechanisms by which those correlations develop over time. In effect, they account for Demon information-processing by replacing the Shannon information of the components as a whole by the sum of the components' individual Shannon informations. Since the latter is never smaller than the former [19], using it can lead to either stricter or looser bounds than the true bound, which is derived from differences in total configurational entropies. More troubling, though, bounds that ignore correlations can simply be violated. Finally, and just as critically, they refer to configurational entropies, not the intrinsic dynamical entropy over system trajectories.
This Letter proposes a new Demon for which, for the first time, all correlations among system components can be explicitly accounted for. This gives an exact, analytical treatment of the thermodynamically relevant Shannon information change, one that, in addition, accounts for system trajectories, not just the information in instantaneous state distributions. The result is that, under minimal assumptions, we derive a Second Law that refines Eq. (2) by properly accounting for intrinsic information processing reflected in temporal correlations via the overall dynamic's Kolmogorov-Sinai entropy [49].
Notably, our Demon is highly functional: depending on model parameters, it acts both as an engine, extracting energy from a single reservoir and converting it into work, and as an information eraser, erasing Shannon information at the cost of an external input of work. Moreover, it supports a new and counterintuitive thermodynamic functionality. In contrast with previously reported erasure operations that only decreased single-bit uncertainty, we find a new kind of erasure functionality during which multiple-bit uncertainties are removed by adding correlation (i.e., by adding temporal order), while single-bit uncertainties are actually increased. This new thermodynamic function provocatively suggests why real-world ratchets support memory: the very functioning of memoryful Demons relies on leveraging temporally correlated fluctuations in their environment.

II. INFORMATION RATCHETS
Our model consists of four components, see Fig. 1: (1) an ensemble of bits that acts as an information reservoir; (2) a weight that acts as a reservoir for storing work; (3) a thermal reservoir at temperature T ; and (4) a finite-state ratchet that mediates interactions between the three reservoirs. The bits interact with the ratchet sequentially and, depending on the incoming bit statistics and Demon parameters, the weight is either raised or lowered against gravity.
As a device that reads and processes a tape of bits, this class of ratchet model has a number of parallels that we mention now, partly to indicate possible future applications. First, one imagines a sophisticated, stateful biomolecule that scans a segment of DNA, say as a DNA polymerase does, leaving behind a modified sequence of nucleotide base-pairs [50], or that acts as an enzyme sequentially catalyzing otherwise unfavorable reactions [51]. Second, there is a rough similarity to a Turing machine sequentially recognizing tape symbols, updating its internal state, and taking an action by modifying the tape cell and moving its read-write head [52]. When the control logic is stochastic, this is sometimes referred to as "Brownian computing" [53, and references therein]. Finally, we are reminded of the deterministic finite-state tape processor of Ref. [54] that, despite its simplicity, indicates how undecidability can be inherent in dynamical processes. Surely there are other intriguing parallels, but these give a sense of the range of applications in which sequential information processing embedded in a thermodynamic system has relevance.

FIG. 1. Information ratchet sequentially processing a bit string: At time step N, X_N is the random variable for the ratchet state and Z_N that for the thermal reservoir. Y_{N:∞} is the block random variable for the input bit string and Y′_{0:N} that for the output bit string. The last bit Y_N of the input string, highlighted in yellow, interacts with the ratchet. The arrow on the right of the ratchet indicates the direction the ratchet moves along the tape as it sequentially interacts with each input bit in turn.
The bit ensemble is a semi-infinite sequence, broken into incoming and outgoing pieces. The ratchet runs along the sequence, interacting with each bit of the input string step by step. During each interaction at step N, the ratchet state X_N and interacting bit Y_N fluctuate between different internal joint states within X ⊗ Y, exchanging energy with the thermal reservoir and work reservoir, and potentially changing Y_N's state. At the end of step N, after input bit Y_N interacts with the ratchet, it becomes the last bit Y′_N of the output string. By interacting with the ensemble of bits, transducing the input string into the output string, the ratchet can convert thermal energy from the heat reservoir into work energy stored in the weight's height.
The ratchet interacts with each incoming bit for a time interval τ, starting at the 0th bit Y_0 of the input string. After N time intervals, input bit Y_{N−1} finishes interacting with the ratchet and, with the coupling removed, it is effectively "written" to the output string, becoming Y′_{N−1}. The ratchet then begins interacting with input bit Y_N. As Fig. 1 illustrates, the state of the overall system is described by the realizations of four random variables: X_N for the ratchet state, Y_{N:∞} for the input string, Y′_{0:N} for the output string, and Z_N for the thermal reservoir. A random variable like X_N realizes elements x_N of its physical state space, denoted by alphabet X, with probability Pr(X_N = x_N). Random variable blocks are denoted Y_{a:b} = Y_a Y_{a+1} . . . Y_{b−1}, with the last index being exclusive. In the following, we take binary alphabets for Y and Y′: y_N, y′_N ∈ {0, 1}. The bit ensemble is considered as two joint variables, Y_{N:∞} and Y′_{0:N}, rather than one Y_{0:∞}, since the probability of realizing a word w ∈ {0, 1}^{b−a} in the output string need not be the same as in the input string. That is, during ratchet operation typically Pr(Y′_{a:b} = w) ≠ Pr(Y_{a:b} = w). The ratchet steadily transduces the input bit sequence, described by the input word distribution Pr(Y_{0:∞}) ≡ {Pr(Y_{0:∞} = w)}_{w ∈ {0,1}^∞}, the probability for every semi-infinite input word, into the output string, described by the word distribution Pr(Y′_{0:∞}). We assume that the word distributions we work with are stationary, meaning that Pr(Y_{a:a+b}) = Pr(Y_{0:b}) for all nonnegative integers a and b.
A key question in working with a sequence such as Y_{0:∞} is how random it is. One commonly turns to information theory for quantitative measures: the more informative a sequence is, the more random it is. For words at a given length ℓ, the average amount of information in the Y_{0:∞} sequence is given by the Shannon block entropy [55]: H[Y_{0:ℓ}] = −∑_{w ∈ {0,1}^ℓ} Pr(Y_{0:ℓ} = w) log_2 Pr(Y_{0:ℓ} = w). Due to correlations in typical process sequences, the irreducible randomness per symbol is not the single-symbol entropy H[Y_0]. Rather, it is given by the Shannon entropy rate [55]: h_µ = lim_{ℓ→∞} H[Y_{0:ℓ}]/ℓ. When applied to a physical system described by a suitable symbolic dynamics, as done here, this quantity is the Kolmogorov-Sinai dynamical entropy of the underlying physical behavior. Note that these ways of monitoring information are quantitatively quite different. For large ℓ, h_µ ≈ H[Y_{0:ℓ}]/ℓ and, in particular, anticipating later use, h_µ ≤ H[Y_0], typically much less. Equality between the single-symbol entropy and entropy rate is achieved only when the generating process is memoryless. Calculating the single-symbol entropy is typically quite easy, while calculating h_µ for general processes has long been known to be difficult [56] and it remains a technical challenge [57]. The entropy rates of the output sequence and input sequence are h′_µ = lim_{ℓ→∞} H[Y′_{0:ℓ}]/ℓ and h_µ = lim_{ℓ→∞} H[Y_{0:ℓ}]/ℓ, respectively.
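The block entropy and its per-symbol convergence toward h_µ can be checked exactly for a simple correlated binary process. The two-state Markov chain below is our own illustrative choice (not the ratchet's dynamic): after a 1, a 0 must follow, so the process has memory and h_µ = 2/3 bit while H[Y_0] ≈ 0.918 bit:

```python
import numpy as np
from itertools import product

# Markov process over symbols {0,1}: after a 1, a 0 must follow.
P = np.array([[0.5, 0.5],
              [1.0, 0.0]])
pi = np.array([2/3, 1/3])   # stationary distribution of P

def block_entropy(L):
    """H[Y_{0:L}] = -sum_w Pr(w) log2 Pr(w), enumerating all length-L words."""
    H = 0.0
    for w in product([0, 1], repeat=L):
        p = pi[w[0]]
        for a, b in zip(w, w[1:]):
            p *= P[a][b]
        if p > 0:
            H -= p * np.log2(p)
    return H

for L in (1, 4, 8, 12):
    print(L, block_entropy(L) / L)   # decreases toward h_mu = 2/3 bit
```

The per-symbol block entropy H[Y_{0:ℓ}]/ℓ starts at the single-symbol entropy and decays toward the true entropy rate, which is why using H[Y_0] in place of h_µ overestimates a correlated process's randomness.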
The informational properties of the input and output word distributions set bounds on energy flows in the system. Appendix A establishes one of our main results: the average work done by the ratchet is bounded above by the difference in Kolmogorov-Sinai entropy of the input and output processes [58]: ⟨W⟩ ≤ k_B T ln 2 (h′_µ − h_µ). In light of the preceding remarks on the basic difference between H[Y_0] and h_µ, we can now consider more directly the differences between Eqs. (2) and (5). Most importantly, the ∆H in the former refers to the instantaneous configurational entropy H before and after a thermodynamic transformation. In the ratchet's steady-state operation, ∆H vanishes since the configuration distribution is time invariant, even when the overall system's information production is positive. The entropies h_µ and h′_µ in Eq. (5), in contrast, are dynamical: rates of active information generation in the input and output giving, in addition, the correct minimum rates, since they take all temporal correlations into account. Together they bound the overall system's information production in steady state away from zero. In short, though often conflated, configurational entropy and dynamical entropy capture two very different kinds of information and they, perforce, are associated with different physical properties supporting different kinds of information processing. They are comparable only in special cases. For example, if one puts aside this basic difference to facilitate comparison and considers the Shannon entropy change ∆H in the joint state space of all bits, the two equations are analogous in the current setup. However, often enough, a weaker version of Eq. (2) is considered in discussions of Maxwell's Demon [41,[43][44][45]59] and information reservoirs [18], wherein the statistical correlations between the bits are neglected, and one simply interprets ∆H to be the change in the marginal Shannon entropies H[Y_0] of the individual bits.
This implies the following relation in the current context: ⟨W⟩ ≤ k_B T ln 2 ∆H[Y_0], with ∆H[Y_0] = H[Y′_0] − H[Y_0]. While Eq. (6) is valid for the studies in Refs. [18,41,[43][44][45]59], it cannot be taken as a fundamental law, because it can be violated [60]. In comparison, Eq. (5) is always valid and can even provide a stronger bound.
As an example, consider the case where the ratchet has memory and, for simplicity of exposition, is driven by an uncorrelated input process, meaning the input process entropy rate is the same as the single-symbol entropy: h_µ = H[Y_0]. However, the ratchet's memory can create correlations in the output bit string, so: h′_µ ≤ H[Y′_0]. In this case, Eq. (5) is a tighter bound on the work done by the ratchet, a bound that explicitly accounts for correlations within the output bit string the ratchet generates during its operation. For example, for the combination {p = 0.5, q = 0.1, b = 0.9}, two bits in the outgoing string are correlated even when they are separated by 13 steps. Previously, the effect of these correlations had not been calculated, but they have important consequences. Due to correlations, it is possible to have an increase in the single-symbol entropy difference ∆H[Y_0] but a decrease in the Kolmogorov-Sinai entropy rate change ∆h_µ. In this situation, it is erroneous to conclude that there is an increase in the information content of the bits. There is, in fact, a decrease in information because of the correlations; cf. Sec. V. Note that a somewhat different situation was considered in Ref. [59]: a memoryless channel (ratchet) driven by a correlated process. In this special case, with ratchets unable to leverage or create temporal correlations, either Eq. (6) or Eq. (5) can be the tighter quantitative bound on work. When a memoryless ratchet is driven by uncorrelated input, though, the bounds are equivalent. Critically, for memoryful ratchets driven by correlated input, Eq. (6) can be violated. In all settings, Eq. (5) holds.
While we defer its development to a sequel, Eq. (5) also has implications for ratchet functioning when the input bits themselves are correlated. Specifically, correlations in the input bits can be leveraged by the ratchet to do additional work, work that cannot be accounted for if one considers only the single-symbol configurational entropy of the input bits [61].
FIG. 2. Energy levels of the Demon states, interacting bits, their joint system, and their joint system with a weight, in units of k_B T.

III. ENERGETICS AND DYNAMICS
To predict how the ratchet interacts with the bit string and weight, we need to specify the string and ratchet energies. When not interacting with the ratchet, the energies E_0 and E_1 of the two bit states, Y = 0 and Y = 1, are taken to be zero for symmetry and simplicity: E_0 = E_1 = 0. For simplicity, too, we say the ratchet mechanism has just two internal states A and B. When the ratchet is not interacting with bits, the two states can have different energies. We take E_A = 0 and E_B = −α k_B T, without loss of generality. Since the bits interact with the ratchet one at a time, we only need to specify the interaction energy of the ratchet and an individual bit. The interaction energy is zero if the bit is in state Y = 0, regardless of the ratchet state, and it is −β k_B T (or +β k_B T) if the bit is in state Y = 1 and the ratchet is in state A (or B). See Fig. 2 for a graphical depiction of the energy scheme under "Ratchet ⊗ Bit".
The scheme is further modified by the interaction of the weight with the ratchet and bit string. We attach the weight to the ratchet-bit system such that when the latter transitions from the B ⊗ 0 state to the A ⊗ 1 state it lifts the weight, doing a constant amount w k_B T of work. As a result, the energy of the composite system (Demon, interacting bit, and weight) increases by w k_B T whenever the transition B ⊗ 0 → A ⊗ 1 takes place, the required energy being extracted from the heat reservoir Z_N. The rightmost part of Fig. 2 indicates this by raising the energy level of A ⊗ 1 by w k_B T compared to its previous value. Since the transitions between A ⊗ 1 and B ⊗ 1 do not involve the weight, their relative energy difference remains unaffected. An increase in the energy of A ⊗ 1 by w k_B T therefore implies the same increase in the energy of B ⊗ 1. Again, see Fig. 2 for the energy scheme under "Ratchet ⊗ Bit ⊗ Weight".
The time evolution over the joint state space of the ratchet, last bit of the input string, and weight is governed by a Markov dynamic, specified by state-transition matrix M. If, at the beginning of the Nth interaction interval at time t = τ(N − 1) + 0^+, the ratchet is in state X_N = x_N and the input bit is in state Y_N = y_N, then let M_{x_N ⊗ y_N → x_{N+1} ⊗ y′_N} be the probability Pr(x_{N+1}, y′_N | x_N, y_N) that the ratchet is in state x_{N+1} and the bit is in state y′_N at the end of the interaction interval t = τ(N − 1) + τ^−. X_N and Y_N at the end of the Nth interaction interval become X_{N+1} and Y′_N, respectively, at the beginning of the (N + 1)th interaction interval. Since we assume the system is thermalized with a bath at temperature T, the ratchet dynamics obey detailed balance. And so, transition probabilities are governed by the energy differences between joint states: M_{i→j}/M_{j→i} = e^{−(E_j − E_i)/k_B T}. There is substantial flexibility in constructing a detailed-balanced Markov dynamic for the ratchet, interacting bit, and weight. Consistent with our theme of simplicity, we choose one that has only six allowed transitions. Such a model is convenient to consider, since it can be described by just two transition probabilities 0 ≤ p ≤ 1 and 0 ≤ q ≤ 1, as shown in Fig. 3.
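A minimal sketch of such a construction can be checked numerically. The energy values and allowed transition pairs below are illustrative choices of ours, not the paper's parameters; the Metropolis rule is one standard way to assign transition probabilities that satisfy the detailed balance condition above:

```python
import numpy as np

# Illustrative energies of the four joint states (units of k_B * T).
E = {"A0": 0.0, "B0": -0.5, "A1": -0.3, "B1": 0.2}
allowed = [("B0", "A1"), ("A0", "B1")]   # hypothetical allowed transition pairs

# Metropolis choice: M[i->j] = min(1, exp(-(E_j - E_i))).
M = {}
for i, j in allowed:
    M[(i, j)] = min(1.0, np.exp(-(E[j] - E[i])))
    M[(j, i)] = min(1.0, np.exp(-(E[i] - E[j])))

# Detailed balance check: M[i->j] / M[j->i] == exp(-(E_j - E_i)).
for (i, j), m in M.items():
    assert np.isclose(m / M[(j, i)], np.exp(-(E[j] - E[i])))
print("detailed balance holds")
```

Any assignment obeying the same ratio condition works equally well; the text's model instead fixes the sparse structure of Fig. 3 and then solves for the energies α, β, and w consistent with p and q.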
FIG. 3. The Markovian, detailed-balance dynamic over the joint states of the ratchet and interacting bit.
The Markov transition matrix M for this system is read off from the transition probabilities labeled in Fig. 3. It allows us to calculate the state distribution p((N − 1)τ + τ^−) at the end of the Nth interaction interval from the state distribution p((N − 1)τ + 0^+) at the interval's beginning via: p((N − 1)τ + τ^−) = M p((N − 1)τ + 0^+), where the probability vector is indexed p = (Pr(A ⊗ 0), Pr(B ⊗ 0), Pr(A ⊗ 1), Pr(B ⊗ 1)). To satisfy detailed balance, the energy parameters α, β, and w are fixed in terms of p and q. (Appendix B details the relationships between the transition probabilities and energy levels.) This simple model is particularly useful since, as we show shortly, it captures the full range of thermodynamic functionality familiar from previous models and, more importantly, it makes it possible to calculate informational properties of the output string exactly and analytically. Now that we know how the ratchet interacts with the bit string and weight, we need to characterize the input string to predict the energy flow through the ratchet. As in the ratchet models of Refs. [43,47], we consider an input generated by a biased coin, Pr(Y_N = 0) = b at each N, which has no correlations between successive bits. For this input, the steady-state distributions p^s(0^+) and p^s(τ^−) at the beginning and end of the interaction interval τ follow from requiring that the ratchet's marginal state distribution repeat from one interval to the next. These distributions are needed to calculate the work done by the ratchet.
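The interval-to-interval iteration just described can be sketched generically. The column-stochastic matrix below is a hypothetical stand-in (illustrative numbers only, not the paper's M); what the sketch shows is the fixed-point procedure: each interval starts with the ratchet's marginal coupled to a fresh bit of bias b, evolves under M, and the outgoing bit is marginalized away:

```python
import numpy as np

# Hypothetical column-stochastic matrix over joint states
# (A⊗0, B⊗0, A⊗1, B⊗1) -- illustrative numbers, not the paper's M.
M = np.array([
    [0.9, 0.0, 0.1, 0.0],
    [0.0, 0.7, 0.0, 0.2],
    [0.1, 0.3, 0.8, 0.0],
    [0.0, 0.0, 0.1, 0.8],
])
b = 0.9  # input bias Pr(Y = 0)

ratchet = np.array([0.5, 0.5])   # Pr(A), Pr(B)
for _ in range(1000):
    # Fresh input bit, uncorrelated with the ratchet state.
    start = np.array([ratchet[0] * b, ratchet[1] * b,
                      ratchet[0] * (1 - b), ratchet[1] * (1 - b)])
    end = M @ start                        # distribution at t = (N-1)τ + τ⁻
    ratchet = np.array([end[0] + end[2],   # marginalize out the outgoing bit
                        end[1] + end[3]])
print(start, end)   # steady-state start/end-of-interval distributions
```

Upon convergence, `start` and `end` play the roles of p^s(0^+) and p^s(τ^−) for this toy matrix.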
To calculate net extracted work by the ratchet we need to consider three work-exchange steps for each interaction interval: (1) when the ratchet gets attached to a new bit, to account for their interaction energy; (2) when the joint transitions B ⊗ 0 ↔ A ⊗ 1 take place, to account for the raising or lowering of the weight; and (3) when the ratchet detaches itself from the old bit, again, to account for their nonzero interaction energy. We refer to these incremental works as W 1 , W 2 , and W 3 , respectively.
Consider the work W_1. If the new bit is in state 0, from Fig. 2 we see that there is no change in the energy of the joint system of the ratchet and the bit. However, if the new bit is 1 and the initial state of the ratchet is A, the energy of the ratchet-bit joint system decreases from 0 to −β k_B T. The corresponding energy is gained as work by the mechanism that moves the ratchet past the tape of bits. Similarly, if the new bit is 1 and the initial state of the ratchet is B, there is an increase in the joint state energy by β k_B T; this amount of energy is now taken away from the driving mechanism of the ratchet. In the steady state, the average work gain ⟨W_1⟩ is then obtained as the average decrease in energy of the joint (ratchet-bit) system upon attachment, computed from the probabilities in Eq. (14) and Fig. 2's energies. By a similar argument, the average work ⟨W_3⟩ is equal to the average decrease in the energy of the joint system on the departure of the ratchet. Note that the cost of moving the Demon along the bit string (or moving the string past a stationary Demon) is accounted for in works W_1 and W_3. Work W_2 is associated with the raising and lowering of the weight depicted in Fig. 1. Since transitions B ⊗ 0 → A ⊗ 1 raise the weight to give work k_B T w and reverse transitions B ⊗ 0 ← A ⊗ 1 lower the weight, consuming an equal amount of work, the average work gain ⟨W_2⟩ must be k_B T w times the net transition probability along the former direction, which is M_{B⊗0→A⊗1} p^s_{B⊗0}(0^+) − M_{A⊗1→B⊗0} p^s_{A⊗1}(0^+), where we again used the probabilities in Eq. (14). The total work supplied by the ratchet and a bit is their sum: ⟨W⟩ = ⟨W_1⟩ + ⟨W_2⟩ + ⟨W_3⟩. Note that we considered the total amount of work that can be gained by the system, not just that obtained by raising the weight. Why? As we shall see in Sec. V, the former is the thermodynamically more relevant quantity. A similar energetic scheme that incorporates the effects of interaction has also been discussed in Ref. [48].
In this way, we exactly calculated the work term in Eq. (5). We still need to calculate the entropy rate of the output and input strings to validate the proposed Second Law. For this, we introduce an information-theoretic formalism to monitor processing of the bit strings by the ratchet.

IV. INFORMATION
To analytically calculate the input and output entropy rates, we consider how the strings are generated. A natural way to incorporate temporal correlations in the input string is to model its generator by a finite-state hidden Markov model (HMM), since HMMs are strictly more powerful than Markov chains: finite-state HMMs can generate all processes produced by finite-state Markov chains, but the reverse is not true. There are processes generated by finite HMMs that cannot be generated by any finite-state Markov chain. In short, HMMs give compact representations for a wider range of memoryful processes.
Consider possible input strings to the ratchet. With or without correlations between bits, they can be described by an HMM generator with a finite set of, say, K states and a set of two symbol-labeled transition matrices T^{(0)} and T^{(1)}, where: T^{(y_N)}_{s_N → s_{N+1}} = Pr(Y_N = y_N, S_{N+1} = s_{N+1} | S_N = s_N) is the probability of outputting y_N for the Nth bit of the input string and transitioning to internal state s_{N+1}, given that the HMM was in state s_N. When it comes to the output string, in contrast, we have no choice. We are forced to use HMMs. Since the current input bit state Y_N and ratchet state X_N are not explicitly captured in the current output bit state Y′_N, Y_N and X_N are hidden variables. As we noted before, calculating HMM entropy rates is a known challenging problem [56,57]. Much of the difficulty stems from the fact that in HMM-generated processes the effects of internal states are only indirectly observed and, even then, appear only over long output sequences.
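Word probabilities follow directly from the symbol-labeled matrices: Pr(y_0 y_1 ... ) is the stationary state distribution propagated through the corresponding matrix product and summed over final states. A small sketch, using the biased coin (the one-state HMM discussed shortly) as the example generator:

```python
import numpy as np

def word_probability(T, pi, word):
    """Pr(word) = pi · T^(y0) · T^(y1) · ... summed over final states,
    for an HMM with symbol-labeled matrices T[y] and state distribution pi."""
    v = pi.copy()
    for y in word:
        v = v @ T[y]
    return v.sum()

# Biased coin as a one-state HMM: T^(0) = [b], T^(1) = [1-b].
b = 0.9
T = {0: np.array([[b]]), 1: np.array([[1 - b]])}
pi = np.array([1.0])
print(word_probability(T, pi, [0, 0, 1]))   # b * b * (1-b) = 0.081
```

The same function works unchanged for multi-state HMMs, where the matrix products no longer commute and correlations between symbols appear.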
We can circumvent this difficulty by using unifilar HMMs, in which the current state and generated symbol uniquely determine the next state. This is a key technical contribution here since, for unifilar HMMs, the entropy rate is exactly calculable, as we now explain. A unifilar HMM's internal states are a causal partitioning of the past, meaning that every past w maps to a particular state through some function f and so: Pr(S_N | Y_{0:N} = w) = δ_{S_N, f(w)}. As a consequence, the entropy rate h_µ in its block-entropy form (Eq. (4)) can be re-expressed in terms of the transition matrices. First, recall the alternative, equivalent form for the entropy rate: h_µ = lim_{ℓ→∞} H[Y_ℓ | Y_{0:ℓ}]. This finally leads to a closed form for the entropy rate [55]: h_µ = −∑_s π_s ∑_{y, s′} T^{(y)}_{s→s′} log_2 T^{(y)}_{s→s′}, where π is the stationary distribution over the unifilar HMM's states. Let's now put these observations to work. Here, we assume the ratchet's input string was generated by a memoryless biased coin. Figure 4 shows its (minimal-size) unifilar HMM. The single internal state C implies that the process is memoryless and the bits are uncorrelated. The HMM's symbol-labeled (1 × 1) transition matrices are T^{(0)} = [b] and T^{(1)} = [1 − b]. The transition from state C to itself labeled 0 : b means that if the system is in state C, then it transitions to state C and outputs Y = 0 with probability b. Since this model is unifilar, we can calculate the input-string entropy rate from Eq. (21) and see that it is the single-symbol entropy of bias b: h_µ = H(b), where H(b) is the (base 2) binary entropy function [19]. The more challenging part of our overall analysis is to determine the entropy rate of the output string. Even if the input is uncorrelated, it is possible that the ratchet creates temporal correlations in the output string. (Indeed, these correlations reflect the ratchet's operation and so its thermodynamic behavior, as we shall see below.) To calculate the effect of these correlations, we need a generating unifilar HMM for the output process, the process produced by the ratchet being driven by the input.
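The closed-form entropy rate for unifilar HMMs is short enough to implement directly: find the stationary distribution over internal states, then average the per-state transition uncertainty. As a sanity check, applying it to the biased coin recovers the binary entropy H(b):

```python
import numpy as np

def unifilar_entropy_rate(T):
    """Closed-form h_mu for a unifilar HMM with symbol-labeled matrices T[y]:
    h_mu = -sum_s pi_s sum_{y,s'} T[y][s,s'] log2 T[y][s,s']."""
    S = sum(T.values())                    # state-to-state transition matrix
    vals, vecs = np.linalg.eig(S.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi /= pi.sum()                         # stationary distribution over states
    h = 0.0
    for Ty in T.values():
        terms = np.where(Ty > 0, Ty * np.log2(np.where(Ty > 0, Ty, 1.0)), 0.0)
        h -= np.dot(pi, terms.sum(axis=1))
    return h

b = 0.9
coin = {0: np.array([[b]]), 1: np.array([[1 - b]])}
Hb = -b * np.log2(b) - (1 - b) * np.log2(1 - b)   # binary entropy H(b)
print(unifilar_entropy_rate(coin), Hb)             # both ≈ 0.469 bits
```

The same routine applied to a multi-state unifilar HMM, such as the output machine constructed below, yields h′_µ without any block-entropy limit being taken numerically.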
When discussing the ratchet energetics, there was a Markov dynamic M over the ratchet-bit joint state space. Here, it is controlled by bits from the input string and writes the result of the thermal interaction with the ratchet to the output string. In this way, M becomes an input-output machine or transducer [62]. In fact, this transducer is a communication channel in the sense of Shannon [63] that communicates the input bit sequence to the output bit sequence. However, it is a channel with memory. Its internal states correspond to the ratchet's states. To work with M, we rewrite it componentwise as: M^{(y′_N | y_N)}_{x_N → x_{N+1}} ≡ M_{x_N ⊗ y_N → x_{N+1} ⊗ y′_N}, to evoke its re-tooled operation. The probability of generating bit y′_N and transitioning to ratchet state x_{N+1}, given that the input bit is y_N and the ratchet is in state x_N, is: M^{(y′_N | y_N)}_{x_N → x_{N+1}} = Pr(Y′_N = y′_N, X_{N+1} = x_{N+1} | Y_N = y_N, X_N = x_N). This allows us to exactly calculate the symbol-labeled transition matrices T′^{(0)} and T′^{(1)} of the HMM that generates the output string: T′^{(y′_N)}_{(x_N, s_N) → (x_{N+1}, s_{N+1})} = ∑_{y_N} M^{(y′_N | y_N)}_{x_N → x_{N+1}} T^{(y_N)}_{s_N → s_{N+1}}. The joint states of the ratchet and the internal states of the input process are the internal states of the output HMM, with x_N, x_{N+1} ∈ {A, B} and s_N, s_{N+1} ∈ {C} in the present case. This approach is a powerful tool for directly analyzing informational properties of the output process. By adopting the transducer perspective, it is possible to find HMMs for the output processes of previous ratchet models, such as those in Refs. [43,47]. However, their generating HMMs are highly nonunifilar, meaning that knowing the current internal state and output allows for many alternative internal-state paths. And this precludes writing down closed-form expressions for informational quantities, as we do here. Said simply, the essential problem is that those models build in too many transitions. Ameliorating this constraint led to the Markov dynamic shown in Fig. 3, with two ratchet states and sparse transitions. Although this ratchet's behavior cannot be produced by a rate equation, due to the limited transitions, it respects detailed balance.
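The transducer composition has a direct matrix realization: the output HMM's symbol-labeled matrices are sums over input symbols of Kronecker products of transducer and input-HMM matrices. The sketch below uses a hypothetical two-state transducer with made-up probabilities (not the ratchet of Fig. 5), composed with the biased-coin input:

```python
import numpy as np

def compose(Mt, Tin):
    """Output-HMM matrices: T'[y'] over joint states (x, s), built as
    sum_y kron(Mt[y'][y], Tin[y])."""
    return {yp: sum(np.kron(Mt[yp][y], Tin[y]) for y in Tin) for yp in Mt}

# Hypothetical transducer M^{(y'|y)}_{x->x'} -- illustrative numbers only.
Mt = {
    0: {0: np.array([[0.8, 0.0], [0.0, 0.6]]),    # emit y'=0 given y=0
        1: np.array([[0.0, 0.3], [0.2, 0.0]])},   # emit y'=0 given y=1
    1: {0: np.array([[0.0, 0.2], [0.4, 0.0]]),    # emit y'=1 given y=0
        1: np.array([[0.7, 0.0], [0.0, 0.8]])},   # emit y'=1 given y=1
}
b = 0.9
Tin = {0: np.array([[b]]), 1: np.array([[1 - b]])}  # biased-coin input HMM

Tout = compose(Mt, Tin)
print((Tout[0] + Tout[1]).sum(axis=1))   # rows sum to 1: valid generator
```

With a one-state input, the Kronecker products are trivial and the output HMM's states are just the transducer's states, exactly as in the two-state ratchet driven by the coin.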
Figure 5 shows our two-state ratchet's transducer. As noted above, its internal states are the ratchet states. Each transition is labeled y′|y : p, where y′ is the output, conditioned on an input y, occurring with probability p.
We can drive this ratchet (transducer) with any input but, for comparison with previous work, we drive it with the memoryless biased coin process just introduced and shown in Fig. 4. The resulting unifilar HMM for the output string is shown in Fig. 6, with its corresponding symbol-labeled transition matrices T^{(0)} and T^{(1)}.

FIG. 6. Unifilar HMM for the output string generated by the ratchet driven by a coin with bias b.
Using these, we can complete our validation of the proposed Second Law by exactly calculating the entropy rate h'_µ of the output string. We find that it is less than or equal to the (unconditioned) single-symbol entropy H[Y'_0] of the output process. Any difference between h'_µ and the single-symbol entropy H[Y'_0] indicates correlations that the ratchet created in the output from the uncorrelated input string. In short, the entropy rate gives a more accurate picture of how information flows between the bit strings and the heat bath. And, as we now demonstrate, the entropy rate leads to correctly identifying important classes of ratchet thermodynamic functioning: functionality that the single-symbol entropy misses.
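The comparison between entropy rate and single-symbol entropy can be checked numerically for any unifilar HMM, since unifilarity makes the closed-form entropy-rate expression exact. The sketch below uses placeholder symbol-labeled matrices, not the actual Fig. 6 matrices.

```python
import numpy as np

def stationary(T):
    # Stationary state distribution: left eigenvector of the
    # row-stochastic matrix T at its largest eigenvalue (= 1).
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    return pi / pi.sum()

def entropy_rate(Ts):
    # Exact entropy rate of a unifilar HMM with symbol-labeled matrices Ts[y]:
    #   h_mu = -sum_s pi_s sum_{y,s'} T(y)[s,s'] log2 T(y)[s,s']
    pi = stationary(sum(Ts.values()))
    h = 0.0
    for Ty in Ts.values():
        safe = np.where(Ty > 0, Ty, 1.0)            # avoid log2(0)
        h -= pi @ (Ty * np.log2(safe)).sum(axis=1)
    return h

def single_symbol_entropy(Ts):
    # Unconditioned entropy H[Y_0] of one stationary output symbol.
    pi = stationary(sum(Ts.values()))
    h = 0.0
    for Ty in Ts.values():
        py = float(pi @ Ty.sum(axis=1))
        if py > 0:
            h -= py * np.log2(py)
    return h

# Placeholder two-state unifilar HMM (illustrative values only).
Ts = {0: np.array([[0.5, 0.0], [0.4, 0.0]]),
      1: np.array([[0.0, 0.5], [0.0, 0.6]])}
h_mu, H_Y0 = entropy_rate(Ts), single_symbol_entropy(Ts)
```

Here h_mu ≤ H_Y0 holds, with the gap measuring the temporal correlations present in the generated string.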

V. THERMODYNAMIC FUNCTIONALITY
Let's step back to review and set context for exploring the ratchet's thermodynamic functionality as we vary its parameters. Our main results are analytical, provided in closed form. First, we derived a modified version of the Second Law of Thermodynamics for information ratchets in terms of the difference between the Kolmogorov-Sinai entropies of the input and output strings, W ≤ k_B T ln 2 ∆h_µ, where ∆h_µ = h'_µ − h_µ. The improvement here takes into account correlations within the input string and those in the output string actively generated by the ratchet during its operation. From basic information-theoretic identities we know this bound is stricter for memoryless inputs than previous relations [64] that ignored correlations. However, by how much? This brings us to our second main result: we gave analytic expressions for both the input and output entropy rates and for the work done by the Demon. Now, we are ready to test that the bound is satisfied and to see how much stricter it is than earlier approximations.

We find diverse thermodynamic behaviors, as shown in Fig. 7, which maps ratchet thermodynamic function at input bias b = 0.9. Analogous behaviors occur for all values of input bias. We identified three possible behaviors for the ratchet: Engine, Dud, and Eraser. Nowhere does the ratchet violate the rule W ≤ k_B T ln 2 ∆h_µ. The engine regime is defined by (p, q) for which k_B T ln 2 ∆h_µ ≥ W > 0, since work is positive. This is the only condition under which the ratchet extracts work. The eraser regime is defined by 0 > k_B T ln 2 ∆h_µ ≥ W, meaning that work is consumed from the work reservoir while the uncertainty in the bit string decreases.

FIG. 7. Information ratchet thermodynamic functionality at input bias b = 0.9. Engine: (p, q) such that 0 < W ≤ k_B T ln 2 ∆h_µ. Eraser: (p, q) such that W ≤ k_B T ln 2 ∆h_µ < 0. Dud: (p, q) such that W ≤ 0 ≤ k_B T ln 2 ∆h_µ.
In the dud regime, those (p, q) for which k_B T ln 2 ∆h_µ ≥ 0 ≥ W, the ratchet is neither able to erase information nor able to do useful work.
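The three regime definitions translate directly into a decision rule on the pair (W, ∆h_µ). A minimal sketch, in units where k_B T ln 2 = 1; the function name and tolerance are our own, not from the text:

```python
KB_T_LN2 = 1.0  # work measured in units of k_B T ln 2

def classify(work, delta_h_mu, tol=1e-12):
    """Classify ratchet function from average work <W> and the
    entropy-rate difference delta_h_mu = h'_mu - h_mu."""
    bound = KB_T_LN2 * delta_h_mu
    if work > tol and work <= bound:        # 0 < W <= kT ln2 * dh: engine
        return "engine"
    if bound < -tol and work <= bound:      # W <= kT ln2 * dh < 0: eraser
        return "eraser"
    if work <= tol and bound >= -tol:       # W <= 0 <= kT ln2 * dh: dud
        return "dud"
    return "forbidden"                      # would violate W <= kT ln2 * dh
```

Any (W, ∆h_µ) pair falling in the final branch would violate the refined Second Law, so no physical ratchet parameters should reach it.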
At first blush, these are the same behavior types reported by Ref. [43], except that we now have stronger bounds on the work, with k_B T ln 2 ∆h_µ replacing the single-symbol entropy approximation. The stricter bound gives deeper insight into ratchet functionality. To give a concrete comparison, Fig. 8 plots the single-symbol entropy difference ∆H[Y_0] and the entropy-rate difference ∆h_µ, with a flat surface identifying zero entropy change, for all p and q at b = 0.9. In the present setting, where input symbols are uncorrelated, the blue ∆H[Y_0] surface lies above the red ∆h_µ surface for all parameters, confirming that the single-symbol entropy difference is always greater than the entropy-rate difference. Note also that, for this choice of input bias b, ∆H[Y_0] and ∆h_µ are close at larger p but diverge for smaller p. They diverge so much that looking only at the single-symbol entropy approximation misses an entire low-p region, highlighted in orange in Figs. 7 and 8, where ∆h_µ dips below zero and the ratchet functions as an eraser.
The orange-outlined low-p erasure region is particularly interesting, as it hosts a functionality not previously identified: the ratchet removes multiple-bit uncertainty, effectively erasing incoming bits by adding temporal order, all the while increasing the uncertainty in individual bits. The existence of this mode of erasure is highly counterintuitive in light of the fact that the Demon interacts with only one bit at a time. In contrast, operation in the erasure region at high p, like that in previous Demons, simply reduces single-bit uncertainty. Moreover, the low-p erasure region lies very close to the region where the ratchet functions as an engine, as shown in Fig. 7. As one approaches (p, q) = (0, 0), the eraser and engine regions become arbitrarily close in parameter space. This is a functionally meaningful region, since the device can be easily and efficiently switched between two distinct modalities: eraser and engine.
In contrast, without knowing the exact entropy rate, the engine region of the ratchet's parameter space appears to be isolated from the eraser region by a large dud region, suggesting that the ratchet is not tunable. Thus, knowing the correlations between bits in the output string allows one to predict additional functionality that is otherwise obscured when one considers only the single-symbol entropy of the output string.
As alluded to above, we can also consider structured input strings generated by memoryful processes, unlike the memoryless biased coin. While correlations in the output string are relevant to the ratchet's energetic behavior, it turns out that input-string correlations are not: the work done by the ratchet depends only on the input's single-symbol bias b. That said, elsewhere we will explore more intelligent ratchets that take advantage of input-string correlations to do additional work.

CONCLUSION
Thermodynamic systems that include information reservoirs, as well as thermal and work reservoirs, are an area of growing interest, driven in many cases by biomolecular chemistry and by nanoscale physics and engineering. With the ability to manipulate thermal systems at energy scales ever closer to that of thermal fluctuations k_B T, information becomes critical to the flow of energy. Our model of a ratchet and a bit string as the information reservoir is very flexible, and our methods showed how to analyze a broad class of such controlled thermodynamic systems. Central to identifying thermodynamic functionality was deriving Eq. (5), based on the control system's Kolmogorov-Sinai entropy, which holds for memoryful or memoryless ratchets and correlated or uncorrelated input processes, and which typically provides the tightest quantitative bound on work. This improvement comes directly from tracking Demon information production over system trajectories, not from time-local configurational entropies.
Though its perspective and methods were not explicitly highlighted, computational mechanics [65] played a critical role in the foregoing analyses, from its focus on structure and on calculating all system-component correlations to the technical emphasis on unifilarity in Demon models. Its full impact was not explicated here and is left to sequels and sister works. Two complementary computational-mechanics analyses of information engines come to mind in this light. The first is Ref. [16]'s demonstration that the chaotic instability in Szilard's Engine, reconceived as a deterministic dynamical system, is key to its ability to extract heat from a reservoir. This, too, highlights the role of the Kolmogorov-Sinai dynamical entropy. Another is the thoroughgoing extension of fluctuation relations showing how intelligent agents can harvest energy when synchronizing to the fluctuations of a structured environment [61]. In effect, the foregoing showed that computational mechanics is a natural framework for analyzing a ratchet that interacts with an information reservoir to extract work from a thermal bath. The input and output strings that compose the information reservoir are best described by unifilar HMM generators, since these allow exact calculation of any informational property of the strings, most importantly the entropy rate. In fact, the control-system components are the ε-machines and ε-transducers of computational mechanics [62, 65].
By allowing one to exactly calculate the asymptotic entropy rate, we identified more functionality in the effective thermodynamic ε-transducers than previous methods can reveal. Two immediate consequences were that we identified a new kind of thermodynamic eraser and found that our ratchet is easily tunable between an eraser and an engine. These functionalities suggest that real-world ratchets exhibit memory in order to take advantage of correlated environmental fluctuations, and they hint at useful future engineering applications.

ACKNOWLEDGMENTS
We thank M. DeWeese and S. Marzen for useful conversations. As an External Faculty member, JPC thanks the Santa Fe Institute for its hospitality during visits. This work was supported in part by the U. S. Army Research Laboratory and the U. S. Army Research Office under contracts W911NF-13-1-0390 and W911NF-12-1-0234.
Appendix A: Derivation of Eq. (5)

Here, we reframe the Second Law of Thermodynamics, deriving an expression of it that makes only one assumption about the information ratchet operating on the bit string: the ratchet accesses only a finite number of internal states. This constraint is rather mild and, thus, the bounds on thermodynamic functioning derived from the new Second Law apply quite broadly.
The original Second Law of Thermodynamics states that the total change in entropy of an isolated system must be nonnegative over any time interval. By considering a system composed of a thermal reservoir, information reservoir, and ratchet, in the following we derive an analog in terms of rates, rather than total configurational entropy changes.
Due to the Second Law, we insist that the change in thermodynamic entropy of the closed system is nonnegative over any number N of time steps. Let X denote the ratchet, Y the bit string, and Z the heat bath, and denote the random variable for the ratchet's state at time step N by X_N. The information reservoir Y is a semi-infinite string. At time zero, the string is composed entirely of the bits of the input process, denoted by the random variable Y_{0:∞}. The ratchet transduces these inputs, starting with Y_0 and generating the output bit string, the entirety of which is expressed by the random variable Y'_{0:∞}. At the Nth time step, the first N bits of the input Y have been converted into the first N bits of the output Y', so the random variable for the joint input-output bit string is Y_{N:∞} ⊗ Y'_{0:N}. The change in entropy from the initial time to the Nth time step is then ∆H = H[X_N, Y_{N:∞} ⊗ Y'_{0:N}, Z_N] − H[X_0, Y_{0:∞}, Z_0] ≥ 0. Note that the internal states of an infinite heat bath do not correlate with the environment, since they have no memory of it. This means the mutual informations I[X_N, Y_{N:∞}, Y'_{0:N}; Z_N] and I[X_0, Y_{0:∞}; Z_0] of the thermal reservoir Z with the bit string Y and ratchet X vanish. Also, note that the change in thermal-bath entropy can be expressed in terms of the heat ⟨Q_N⟩ dissipated over the N time steps. Thus, the Second Law naturally separates into energetic terms describing the change in the heat bath and informational terms describing the ratchet and bit strings. Since ∆H ≥ 0, we can rewrite this as an entirely general lower bound on the heat dissipated over a length-Nτ time interval, recalling that τ is the ratchet-bit interaction time. This bound is superficially similar to Eq. (6), but it is true in all cases, as we have not yet made any assumptions about the ratchet.
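The chain of steps just described can be sketched in equations; this is our reconstruction from the surrounding definitions, not a verbatim restoration of the original display equations:

```latex
% Second Law over N steps for ratchet X, bits Y / Y', and bath Z:
\Delta H = H[X_N,\, Y_{N:\infty}\otimes Y'_{0:N},\, Z_N]
         - H[X_0,\, Y_{0:\infty},\, Z_0] \;\ge\; 0 .
% The bath decouples (its mutual informations with X and Y vanish) and
% contributes \Delta H[Z] = \langle Q_N\rangle / (k_B T \ln 2), giving
\langle Q_N \rangle \;\ge\; -\,k_B T \ln 2\,
  \bigl( H[X_N,\, Y_{N:\infty}\otimes Y'_{0:N}] - H[X_0,\, Y_{0:\infty}] \bigr) .
```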
However, its informational quantities are difficult to calculate for large N and, in their current form, do not give much insight. Thus, we look at the infinite-time limit in order to tease out hidden properties. Over a time interval Nτ, the average heat dissipated per ratchet cycle is ⟨Q_N⟩/N. When we classify an engine's operation, we usually quantify energy flows in a way that neglects transient dynamics; these are captured by the heat dissipated per cycle over infinite time, ⟨Q⟩ = lim_{N→∞} ⟨Q_N⟩/N, which inherits the corresponding lower bound. Assuming the ratchet has a finite number of internal states, each with finite energy, the bound can be simplified and written in terms of work. In this case, the average work done is the opposite of the average dissipated heat, ⟨W⟩ = −⟨Q⟩, and so it has a corresponding upper bound in which the joint entropies are expanded in terms of their single-variable entropies and mutual informations. The entropies over the initial X_0 and final X_N ratchet-state distributions monitor the change in ratchet memory via time-dependent versions of its statistical complexity, C_µ(N) = H[X_N] [65]. This time dependence can be used to monitor how and when the ratchet synchronizes to the incoming sequence, recognizing a sequence's temporal correlations. However, since we assumed that the ratchet has finitely many states, the ratchet state-entropy and the mutual-information terms involving it are bounded above by the logarithm of the number of states. Divided by N, they vanish as N → ∞. With this, we have a very general upper bound for the work done by the ratchet in terms of just the input and output string variables.
Once again, we split the joint entropy term into its components. In this we identify the output process's entropy rate h'_µ = lim_{N→∞} H[Y'_{0:N}]/N. And, while the term lim_{N→∞} (H[Y_{N:∞}] − H[Y_{0:∞}])/N looks unfamiliar, it is simply the negative entropy rate −h_µ of the input process. To understand the mutual-information term, note that Y'_{0:N} is generated from Y_{0:N} and so is independent of Y_{N:∞}. Thus, we are left with the inequality of Eq. (5), W ≤ k_B T ln 2 ∆h_µ, derived with minimal assumptions. Also, the appearance of the statistical complexity and excess entropy, whose contributions this particular derivation shows are asymptotically small, does indicate a potential role for correlations in the input at finite times, during which the ratchet synchronizes to the incoming information [67].
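The limit just taken can be summarized as follows, again as a hedged reconstruction consistent with the definitions above:

```latex
\langle W \rangle \;\le\; k_B T \ln 2 \lim_{N\to\infty}\frac{1}{N}
  \Bigl( H[Y'_{0:N}] + H[Y_{N:\infty}] - H[Y_{0:\infty}]
        - I[\,Y'_{0:N};\, Y_{N:\infty}\,] \Bigr)
  \;=\; k_B T \ln 2\,\bigl( h'_\mu - h_\mu \bigr)
  \;=\; k_B T \ln 2\, \Delta h_\mu ,
% using I[Y'_{0:N}; Y_{N:\infty}] = 0 and
% \lim_{N\to\infty} \bigl( H[Y_{N:\infty}] - H[Y_{0:\infty}] \bigr)/N = -h_\mu .
```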
One key difference between Eq. (A16) (equivalently, Eq. (5)) and the more commonly used bound in Eq. (6), with the change in single-variable configurational entropy H[Y'_0] − H[Y_0], is that the former is true for all finite ratchets and takes into account the production of information over time via the Kolmogorov-Sinai entropies h_µ and h'_µ. More generally, we do not look at single-step changes in configurational entropies, H[X_{N−1}, Y_{N−1}, Z_{N−1}] → H[X_N, Y_N, Z_N], but rather at the rate of information production H[W_N | ..., W_{N−2}, W_{N−1}], where W_N = (X_N, Y_N, Z_N). This global dynamical entropy rate has contributions from the output rate h'_µ and the input rate h_µ. This again indicates how Eq. (6) approximates Eq. (A16).
There are several special cases in which the single-variable bound of Eq. (6) applies. When the input is uncorrelated, it holds, but it is weaker than the entropy-rate bound of Eq. (5). When the ratchet has no internal states and so is memoryless, Eq. (6) is also satisfied; interestingly, either it or Eq. (A16) can be quantitatively stricter in this special case. However, in the most general case, where the inputs are correlated and the ratchet has memory, the bound using single-variable entropy is incorrect: there are cases in which it is violated [68]. Finally, as the input-bit-ratchet interaction time τ grows, the ratchet spends much of its time thermalizing. The result is that the output string becomes uncorrelated with the input, and the ratchet is effectively memoryless. Whether by assumption or as effective behavior, whenever the ratchet is memoryless it is ignorant of temporal correlations, and so both it and the single-symbol entropy bound are of limited physical import. These issues will be discussed in detail in future works; for a preview see Ref. [68].

The model of Fig. 3 is one of the simplest information transducers whose output is unifilar for uncorrelated inputs, so that the correlations in the outgoing bits can be calculated explicitly. As this calculation was a primary motivation of our work, we introduced the model in Fig. 3 first and only then introduced the associated energetic and thermodynamic quantities, as in Fig. 2. The introduction of energetic and thermodynamic quantities for an abstract transducer (as in Fig. 3), however, is not trivial. Given a transducer topology (such as the reverse "Z" shape of the current model), there are multiple possible energy schemes, of which only a fraction are consistent with all possible values of the associated transition probabilities. However, more than one scheme is generally possible.

Appendix B: Designing Ratchet Energetics
To show that only a fraction of all possible energetic schemes are consistent with all possible parameter values, consider the case where the interaction energy between the ratchet and a bit is zero, as in Ref. [43]. In our model, this implies β = 0, or equivalently, p = q = 0 (from Eq. (12)). In other words, we cannot describe our model, valid for all values 0 < p, q < 1, by the energy scheme in Fig. 2 with β = 0. This is despite the fact that we have two other independent parameters α and w.
To show that, nonetheless, more than one scheme is possible, imagine the case with α = β = 0. Instead of just one mass, consider three masses such that, whenever the transitions A⊗0 → B⊗0, B⊗0 → A⊗1, and A⊗1 → B⊗1 take place, we extract works k_B T W_1, k_B T W_2, and k_B T W_3, respectively, and lose the corresponding amounts of work in the reverse transitions. This picture is consistent with the abstract model of Fig. 3 provided the corresponding requirements of detailed balance are satisfied. The existence of such an alternative scheme illustrates that, given the abstract model of Fig. 3, there is more than one possible consistent energy scheme. We suggest that this will allow for future engineering flexibility.
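As a numerical sketch of the detailed-balance requirement: with α = β = 0, each work-coupled transition must satisfy the Boltzmann ratio p_fwd/p_rev = e^{−W_i} in k_B T units, so each transition-probability pair fixes its work parameter. The probabilities below are placeholders, not the model's actual values.

```python
import math

def work_from_detailed_balance(p_fwd, p_rev):
    # With alpha = beta = 0, detailed balance requires
    #   p_fwd / p_rev = exp(-W_i)  =>  W_i = ln(p_rev / p_fwd)  (k_B T units)
    return math.log(p_rev / p_fwd)

# Placeholder forward/reverse probabilities for the three work-coupled
# transitions A0 -> B0, B0 -> A1, A1 -> B1 (illustrative only).
pairs = {"W1": (0.8, 0.4), "W2": (0.6, 0.3), "W3": (0.7, 0.7)}
works = {k: work_from_detailed_balance(*pq) for k, pq in pairs.items()}
```

A symmetric pair (W3 here) yields zero work; asymmetric pairs fix nonzero W_i, and any assignment of the three masses consistent with these ratios reproduces the same abstract dynamics.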