Abstract

Physical systems that power motion and create structure in a fixed amount of time dissipate energy and produce entropy. Whether living, synthetic or engineered, systems performing these dynamic functions must balance dissipation against speed. Here, we show that rates of energy and entropy exchange are subject to a speed limit (a time-information uncertainty relation) imposed by the rates of change in the information content of the system. This uncertainty relation bounds the time that elapses before the change in a thermodynamic quantity has the same magnitude as its standard deviation. From this general bound, we establish a family of speed limits for heat, dissipated/chemical work and entropy, depending on the experimental constraints on the system and its environment. In all of these inequalities, the timescale of transient dynamical fluctuations is universally bounded by the Fisher information. Moreover, they all have a mathematical form that mirrors the Mandelstam-Tamm version of the time-energy uncertainty relation in quantum mechanics. These bounds on the speed of arbitrary observables apply to transient systems away from thermodynamic equilibrium, independent of the physical constraints on the stochastic dynamics or their function.
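The relation described above can be rendered schematically. A minimal sketch, assuming the standard Mandelstam-Tamm form and writing $I_F$ for the Fisher information of the evolving probability distribution; the notation $\tau_A$ and $I_F$ is introduced here for illustration, and the precise definitions are in the paper:

```latex
% Mandelstam-Tamm bound in quantum mechanics: the time for the mean of an
% observable to shift by one standard deviation obeys
%   \tau \, \Delta E \ge \hbar / 2 .
% Schematic thermodynamic analogue, as described in the abstract: for an
% observable A with mean \langle A \rangle and standard deviation \Delta A,
\tau_A \equiv \frac{\Delta A}{\left|\,\mathrm{d}\langle A\rangle/\mathrm{d}t\,\right|},
\qquad
\tau_A \ge \frac{1}{\sqrt{I_F}} ,
% so the Fisher information I_F of the time-dependent probability
% distribution sets the minimal timescale of transient fluctuations.
```

Here $\tau_A$ is the time for the mean of the observable to change by one standard deviation, mirroring the role of $\tau$ in the quantum bound.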

Publication Details
Publication Type
Journal Article
Year of Publication
2020
Volume
16
DOI
10.1038/s41567-020-0981-y
Journal
Nature Physics