
Energy and Information

Physics is originally about things that happen without any human intervention. Though we often trigger physical events to some practical end, the physical side of the process still does not depend on our intentions, and we need to adjust our expectations to the objective laws rather than force nature to behave as we expect.

Information exchange, by contrast, is all about human communication. Though it may seem to proceed in electronic devices and computer networks regardless of human will, the purpose and the primary source of any data exchange are always related to some human need, albeit in an indirect manner. Without this cultural background, one would merely have random physical processes in the interconnected circuits, but no information flow, unless the circuits developed a kind of society of their own, a robotic culture. As one of my colleagues put it, "Beats me. Why do we calculate heat production in the server room using the total power supply value? With all those heaps of information processed every moment and stored on disk, shouldn't we spend at least some energy on computing proper?" Still, from the physical viewpoint, computing is nothing but a complicated way of producing heat. Storing either a "0" or a "1" in a memory unit takes exactly the same energy, and any meaningful interpretation of the resulting physical structures is beyond the computer. It is like a mechanical balance, entirely indifferent to what exactly we are going to weigh, a nugget of gold or a morsel of food.

Information lies in the deliberate order of things rather than in the things themselves. The same physical state can carry a valuable message or become meaningless once the last possible interpreter is gone. Similarly, the letters in this text may convey my ideas to another sentient being, but they will remain a void occurrence, a random sequence, if there is nobody to share or oppose this particular viewpoint.

However, as long as we keep aloof from mystical speculation, we must admit that arranging things to support informative communication requires some physical effort, and hence it will eventually dissipate some energy. No information exchange is possible in a conservative system with no heat production. That is, to put something in order, we need to bring disorder elsewhere. In thermodynamics, this idea is reflected in the concept of entropy, and physicists agree that total entropy can never decrease in an isolated physical system. Considering this circumstance, one is tempted to interpret any change in physical entropy as communication, thus introducing information as a physical property. This is a trivial logical fallacy: the fact that information transfer requires entropy flows does not imply that every entropy variation is due to information exchange. On the same footing, one might directly relate muscular strength to the quantity of food consumed.

Under certain conditions, entropy can serve as a measure of information; in other cases, some alternative (for instance, financial) measure might be more appropriate. Reversing the logic, one could introduce a kind of entropy on the basis of the chosen information measure and develop a thermodynamic picture of communication and information processing. Penrose's well-known fiction of quantum consciousness is an attempt of this sort, trying to reduce psychology to physics. A sober-minded approach would take such scheme transfers for what they are, mere metaphors, sometimes useful, sometimes not. Within physics, one encounters numerous metaphors of the same kind, like negative temperatures or backward motion in time; everybody knows that such figurative expressions refer to regular physical phenomena that can as well be described in a more rigorous (though possibly more complicated) way, with temperatures always remaining positive and time preserving its intuitive monotonicity.

It is important that quantitative measures of information are not necessarily what we really need; there are situations where a qualitative description would be much more accurate. Thus, we can discuss the moral aspects of communication, characterizing a message as misinformation or a deliberate lie. We can mention the timeliness or completeness of information, or speak about its integrity and acceptability. There is no need to numerically assess anything unless there is a socially established scale accounting for subtle gradations. In many practical cases, numerical estimates remain mere labels that could be equivalently replaced by common words or pictograms expressing the same in a more obvious and easily comprehensible way. It is a standard principle in experimental science that, to avoid spurious regularities, the precision of calculations should not exceed the precision of the initial data. The same principle must certainly be respected in theory as well, meaning, in particular, that we shouldn't give numerical answers to qualitative questions, as the quantitative part would be utterly excessive in that case, provoking an illusion of knowledge more precise than it really is.
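
As a side note, the precision principle admits a trivial sketch (the measured values below are invented for the example): reporting a computed ratio to full machine precision suggests far more knowledge than two-significant-figure inputs actually contain.

    # Two quantities assumed to be measured to two significant figures only.
    length_a = 1.2
    length_b = 3.4

    ratio = length_a / length_b
    print(ratio)             # prints a long decimal expansion (0.3529...), spurious precision
    print(round(ratio, 2))   # 0.35 -- matches the precision of the inputs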

Returning to the necessity of physical motion for human communication, one could wonder whether there is a lower limit to the quantity of energy required to enable information transfer. The existence of such a threshold seems rather plausible, especially in view of the quantum world, with all its discrete-spectrum and resonance phenomena. However, recalling that any discrete features in quantum physics are always embedded in a continuum (as there are no absolutely stable states in a compound system, and the most we can get is sufficiently long lifetimes, possibly comparable with the age of the Universe), we could admit as well that any energy threshold for informative communication is due to the character of the constraints imposed on the communication channel rather than being an inherent property of communication as such.

For instance, in the physics of computing, the commonly accepted Landauer principle states that the erasure of one bit of information implies an energy dissipation of at least kT ln 2, with k being the Boltzmann constant and T denoting the absolute temperature. That is, with the huge data flows in modern computers, this minimum energy consumption, however small, becomes quite measurable. Today, we are already approaching the level of efficiency where this "physical" limitation would start to play a role.
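
To get a feel for the numbers, here is a minimal back-of-the-envelope sketch (assuming room temperature and a purely illustrative erasure rate, neither of which is taken from any particular machine):

    import math

    k_B = 1.380649e-23                       # Boltzmann constant, J/K
    T = 300.0                                # assumed room temperature, K
    bound_per_bit = k_B * T * math.log(2)    # Landauer bound per erased bit, J

    erasures_per_second = 1e15               # hypothetical erasure rate, for scale only
    min_power = bound_per_bit * erasures_per_second

    print(f"Landauer bound per bit: {bound_per_bit:.2e} J")        # ~2.87e-21 J
    print(f"Minimum power at 1e15 erasures/s: {min_power:.2e} W")  # ~2.87e-06 W

Even at such a rate the bound amounts to microwatts, which is why it only begins to matter for the enormous data flows of modern hardware.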

This reasoning, again, hides a few logical pitfalls. If we treat a computer as a physical system, all we can expect is heat production, without any relation to information transfer. The Landauer formula refers to an elementary quantum transition, a kind of electron spin projection sign flip. One could call such a system a binary switch and discuss its physical properties at length. Even in this physical picture, the presence of the lower limit is entirely due to the statistical nature of the process, assuming an ensemble of "free" binary switches that do not interact with each other or with their physical environment. In a strongly correlated system, the energy cost of an elementary flip may be lower (this is usually referred to as "coherence", or "entanglement"). The appearance of the Boltzmann constant in the Landauer formula is in no way accidental; it implies physical constraints "selecting" a specific physical model.

On the other hand, computer operation does not need to have anything in common with human communication and information exchange. It is (basically) a physical process synchronizing the states of many binary switches (usually far from elementary). Of course, on the physical level, it does not matter what exactly the computer is doing; the same profile of heat (and entropy) production could be obtained in many ways. To come to computing proper, we need to leave physics in the background and consider the "logical" level of the same system, which belongs to quite a different science. A physical system does not care whether this or that particular state of a binary switch stands for "0" or "1"; the ideas of "bits", "bytes", "words" etc. are entirely alien to physics. That is, when Landauer says that the erasure of one bit needs a certain energy, this is an eclectic mix of two different statements, the first referring to the states of a physical system (a binary switch), and the second admitting that these states can be interpreted in terms of computer operations. This interpretation is a special activity, based on a different physics, on a different energy scale. That is, somebody (or something) must read the current state of a binary switch and trigger a number of child processes depending on the result. For a classical computer, the energies involved in interpretation and commutation are negligible compared to those needed for switch operation. In quantum computing, interpretation may significantly interfere with switching, thus introducing quantum logic in addition to quantum physics.

Of course, there is no need for a human interpreter. A whole hierarchy of abstract active media is provided, for example, by the OSI model. The physical implementation of such agents can vary widely; in this sense, a crystal, an organic molecule, or a living cell could be metaphorically pictured as a kind of computer. Traditional communication theory is applicable at this level, including any quantitative information measures. However, to be precise, there is no information proper, since nobody informs anybody in this computer world. Computers do not process information; they process data, the patterns of physical states. The transition from the physical to the logical level is therefore analogous to the transition from the dynamic to the statistical description in physics, drastically reducing the number of degrees of freedom; this resemblance provoked the conceptual confusion identifying the quantity of information with negative entropy, but these are two different kinds of statistics that cannot be reduced to each other.
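
A toy sketch may make the data/information distinction more tangible (the byte values below are made up for the purpose): one and the same bit pattern can be read as a number, as text, or as pixel intensities, and nothing in the pattern itself selects the "right" reading; that choice belongs entirely to the interpreting context.

    # One and the same byte pattern, three external interpretations.
    raw = bytes([72, 105, 33, 0])        # an arbitrary, invented bit pattern

    as_integer = int.from_bytes(raw, byteorder="little")   # read as a number
    as_text = raw[:3].decode("ascii")                      # read as the string "Hi!"
    as_pixels = list(raw)                                  # read as grey levels 0..255

    print(as_integer)   # 2189640
    print(as_text)      # Hi!
    print(as_pixels)    # [72, 105, 33, 0]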

Now, let us turn to information in its original sense, as a meaningful message passed in human communication from one person to another. Mere data transfer is not enough; both the data and the very act of their transfer must be motivated within some common activity establishing the context of communication. The same physical process, with the same logical interpretation in terms of data processing, can be either informative or not, depending on the cultural conditions. Is there any lower limit for the energies involved? Both yes and no. Yes, since the material basis of any particular culture is limited by the current, historically established way of production, including a definite level of physical reality. Within this culture, we just cannot detect overly subtle changes in the state of the world, and hence there is a physical threshold in human communication. However, with technological development, the range of observable physical phenomena will necessarily expand, and the economic role of crude force will tend to zero (without ever actually reaching it). The energy levels in a complex correlated system are much more closely spaced than in any of its elementary components, so that less energy dissipation is required for informative communication. As the upper limit for a system's complexity is that of the whole world, there is no absolute threshold for communication energy, and any cultural limitations are to be lifted in the future, in due course. Metaphorically using the Landauer formula, one could observe that, with technological development, information exchange tends to occur at lower temperatures, gradually approaching absolute zero. The physical meaning of this metaphor is the dominance of controlled behavior over stochastic processes in a well-cultivated nature. Indeed, any dynamics obeying deterministic equations of motion with deterministic boundary conditions could be said to occur at zero temperature; in this sense, human intentionality serves to introduce order into chaos, and that is exactly what informative communication does.

In real life, the situation is somewhat more complicated. Any hierarchy can produce different hierarchical structures (hierarchical conversion), and there is no rigid distinction between adjacent levels. That is, some processes on the logical level can become mere physical events, and a portion of conscious behavior can eventually be delegated to various "computers". Conversely, physical systems can develop a certain "logical" functionality, while computers may someday form a new kind of society complementing human culture. Still, the overall picture remains the same in any particular case, and the principal difference of physical motion from computing is to be clearly understood, as well as the distinction of both from information exchange.


[Physics] [Science] [Unism]