Saturday, October 16, 2010

Zurek's improvement of the definition of entropy

"A potentially important application of algorithmic complexity to physics was proposed by Wojtek Żurek of the Los Alamos National Laboratory in New Mexico. In order to rid Boltzmann's definition of entropy of its troublesome element of subjectivity, Żurek suggested an almost imperceptible modification of it. Recall that entropy is a measure of missing information about a system. It therefore depends on what an observer happens to know: a smarter being has more information, is missing less, and thus assigns a lower entropy to a system than a more limited creature. To render entropy more objective, Żurek recommended adding a measure of recorded information to that of missing information. The sum of the two remains constant --if you remove data from one column, it reappears in the other. The observant creature thus becomes redundant; only the entries in its notebook or computer memory matter.

But how to assess the amount of recorded information? Żurek chose algorithmic complexity as the most natural measure. Accordingly, his new, improved entropy consists of two portions: the conventional entropy as measured by the formula on Boltzmann's tomb, plus a piece that is normally inconceivably tiny, and accounts for the algorithmic complexity of the listing of recorded knowledge about the system. A mathematical description of the size and shape of a vessel containing a gas might be a typical item in the list, while missing information includes the coordinates of a vast number of atoms. Notice that in the hypothetical case that every position and every velocity of every atom is known, the Boltzmann entropy of the system is zero, but the added term --the length of the description of what's known, in binary code-- will be huge, bringing the total entropy back to its previous value. After a hundred years the reek of subjectivity has finally been lifted from the Second Law of Thermodynamics.

In spite of its cogency, Żurek's improved entropy has not gained much support.
Hans Christian von Baeyer, Information: The New Language of Science, Chapter 12

Simple. It reminds me of the kinetic and potential energy in an adiabatic mechanical system, where the sum of the two stays constant at all times. I also vaguely remember that the history of total energy followed the same trajectory as that of entropy: at first only one portion of the energy was defined, and only once the second portion was also defined did the energy become a constant of the system and start to be accepted, together with its components, as a valid quantity.
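The bookkeeping described in the quote can be sketched in a few lines of code. This is only a toy model of my own, not Żurek's actual formulation: the system is a string of N random bits, the "Boltzmann" term is simply the number of bits the observer has not yet measured, and the length of the observer's record stands in as a crude proxy for algorithmic complexity (which is uncomputable in general). By construction the two columns trade off exactly, so the total stays fixed no matter how much the observer learns:

```python
import random

random.seed(0)
N = 64  # toy system: a microstate of N random bits

microstate = [random.randint(0, 1) for _ in range(N)]

def zurek_entropy(known_count):
    """Toy version of Zurek's two-term 'physical entropy' (in bits).

    missing: the Boltzmann-like term, bits not yet measured
    record:  the length of the observer's notebook entry, used here
             as a crude stand-in for algorithmic complexity
    """
    missing = N - known_count
    record = known_count
    return missing + record  # always N in this toy model

for known in (0, 16, 32, 64):
    print(f"known bits: {known:2d}  total entropy: {zurek_entropy(known)}")
```

A smarter observer (larger `known_count`) assigns a lower conventional entropy, but pays for it in notebook length, which is exactly the point of the analogy with kinetic and potential energy above.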