Biological cells have an elaborate machinery for probing their external environment. They gather information about the many small molecules diffusing around them, whether nutrients, hormones, or toxic agents. This information is conveyed by protein enzymes that add or remove chemical tags on other proteins. But biological circuits are extremely error-prone: the signal is distorted by the stochastic nature of protein interactions under constant thermal agitation. So the cell expends enormous amounts of energy on error-correction mechanisms to maintain the fidelity of the signal. My talk shows how we can understand this energy consumption in the language of thermodynamics, building on recent advances in nonequilibrium statistical physics and information theory. Cellular signaling networks essentially act as "information engines", converting energy from chemical potential reservoirs into useful "work" in the form of accurate information transfer. Just as the second law of thermodynamics constrains the efficiency of mechanical engines, analogous relationships hold for biochemical signaling, putting bounds on the minimum error that can be achieved. We look at the tradeoffs between efficiency and power, and why these biological circuits must operate in a highly dissipative, nonequilibrium regime. We also discuss how cells optimize error correction, implementing noise filtering schemes known from engineered communications systems.

Host: Sebastian Deffner
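As a concrete point of reference for the "noise filtering schemes known from engineered communications systems" mentioned in the abstract, here is a minimal sketch of the textbook Wiener filter, written in standard signal-processing notation; it is an illustrative example, not necessarily the formulation used in the talk. For a stationary signal $s(t)$ observed through additive, uncorrelated noise, $c(t) = s(t) + n(t)$, the mean-squared estimation error is minimized by filtering $c$ with
\[
  H(\omega) \;=\; \frac{P_s(\omega)}{P_s(\omega) + P_n(\omega)},
\]
where $P_s$ and $P_n$ are the power spectral densities of signal and noise. The resulting minimum error is
\[
  E_{\min} \;=\; \bigl\langle (\hat{s}-s)^2 \bigr\rangle_{\min}
  \;=\; \int \frac{d\omega}{2\pi}\,
        \frac{P_s(\omega)\,P_n(\omega)}{P_s(\omega) + P_n(\omega)},
\]
so no linear readout of $c(t)$ can do better than this error floor, and the causal (real-time) version of the filter, the kind a signaling network would actually have to implement, can only do worse.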