Friday, November 30, 2007

An unlikely place for a confrontation between universalism and complexity

I found a very telling illustration of the confrontation between universalism/reductionism and complexity/relativism in a quite unlikely place. André Gorz writes in Lettre à D., Histoire d'un amour about the different ways in which his wife, an Englishwoman, and he himself, influenced by French universalism, used to think:

"J'avais besoin de théorie pour structurer ma pensée et t'objectais qu'une pensée non structurée menace toujours de sombrer dans l'empirisme et l'insignifiance. Tu répondais que la théorie menace toujours de devenir un carcan qui interdit de percevoir la complexité mouvante du réel."
"I needed theory to structure my thinking, and I objected to you that unstructured thought always threatens to sink into empiricism and insignificance. You replied that theory always threatens to become a straitjacket that prevents one from perceiving the shifting complexity of the real."
How lucid this statement is!

See also this post about the same confrontation.

Why write, and about what?

André Gorz writes in Lettre à D., Histoire d'un amour that what matters is writing, not what is being written. The topic, the subject, is only raw material for the writing process to work on; it is secondary to the act of writing:

"Ce n'est pas ce qu'il écrit qui est le but premier de l'écrivain. Son besoin premier est d'écrire. Ecrire, c'est-à-dire se faire absent du monde et de lui-même pour, éventuellement, en faire la matière d'élaborations littéraires. Ce n'est que secondairement que se pose la question du «sujet» traité. Le sujet est la condition nécessairement contingente de la production d'écrits. N'importe quel sujet est le bon pourvu qu'il permette d'écrire."
"It is not what he writes that is the writer's primary goal. His primary need is to write. To write, that is, to absent oneself from the world and from oneself in order, eventually, to make them the material of literary elaborations. Only secondarily does the question of the 'subject' treated arise. The subject is the necessarily contingent condition for the production of writings. Any subject is a good one, provided it allows one to write."
The writer becomes a novelist once what is being written starts to organize itself into a well-defined project:
"L'écriveur deviendra écrivain quand son besoin d'écrire sera soutenu par un sujet qui permet et exige que ce besoin s'organise en projet."
"The scribbler will become a writer when his need to write is sustained by a subject that allows and demands that this need organize itself into a project."
Sounds easy!

Gödel's theorem

Here is a short and naive description of Gödel's theorem that I draw from R. Penrose, The emperor's new mind. From what I understand, if you lay down precisely all the axioms necessary to define a logical system, you will run into a self-referential statement very similar to the following one:

this sentence is false.
This infinite indecision, like an image reflected between two facing mirrors, is one of the driving arguments of The moment of complexity by M. C. Taylor.
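As a toy illustration (my own sketch, not from Penrose or Taylor), one can check mechanically that the liar sentence admits no consistent truth value:

```python
# The liar sentence L asserts its own falsity: truth(L) == not truth(L).
# Try both candidate truth values and check each for consistency.
def consistent(value: bool) -> bool:
    # The sentence claims it is false, so assigning it truth value `value`
    # is consistent only if value == (not value), which never holds.
    return value == (not value)

results = {value: consistent(value) for value in (True, False)}
print(results)  # {True: False, False: False}: neither assignment works
```

Neither True nor False is a consistent assignment, which is the "infinite indecision" in miniature.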

One important consequence of Gödel's theorem is that you find, within the very axioms and rules you have yourself constructed, propositions that are true even though you cannot find any proof of them within the system. Stated more formally:
"What Gödel showed was that any such precise ('formal') mathematical system of axioms and rules of procedure whatever, provided that it is broad enough to contain descriptions of simple arithmetical propositions [...] and provided that it is free from contradiction, must contain some statements which are neither provable nor disprovable by the means allowed within the system."
Thus even when playing with a mathematical system we find that 'truth' does not have an absolute meaning, and even R. Penrose concedes that
"[m]athematical truth is something that goes beyond mere formalism."

Monday, November 19, 2007

Some comments on entropy

Definition

Since I reviewed the basic teachings concerning entropy, I see entropy everywhere. The most helpful definition of entropy is the one concerning the state of order: the higher the order, the lower the entropy. Thus I declare myself an enemy of entropy, in the sense that I, we, always try to create some order, to put things in order: putting the plates away in the kitchen, untangling an electric wire, etc. Why is it actually easier to put things in disorder, and so to create entropy, than the reverse? One useful way to comprehend this difference is to see that an ordered state is an improbable state, while a disordered state is a probable one: an electric wire is more likely to be tangled after so many years than to stay untangled, and there are many ways to make a mess in a room but only one to put everything in its place.
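A back-of-the-envelope count (my own toy model, not from the sources cited here) makes the probability argument concrete. Take 100 particles, each sitting in the left or right half of a box, and use Boltzmann's S = k ln W (in units of k):

```python
from math import comb, log

N = 100  # particles, each in the left or right half of a box

# "Ordered" state: all particles in the left half -> exactly 1 microstate.
ordered = comb(N, 0)
# "Disordered" state: an even 50/50 split -> C(100, 50) microstates.
disordered = comb(N, N // 2)

# Boltzmann entropy S = k ln W, here in units of k.
S_ordered = log(ordered)        # 0.0
S_disordered = log(disordered)  # about 66.8

print(disordered)  # ~1.0e29 ways to be "messy", only 1 way to be "tidy"
```

The disordered macrostate is realized by about 10^29 more microstates than the ordered one, which is why the mess, not the order, is what we observe.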

Entropy and time

There is some controversy concerning the relationship between entropy and time. The problem is that the fundamental laws of classical physics (from Newton) are reversible in time; that is, whatever happens in one direction (toward the future) can equally well happen in the other direction (toward the past). Thus, just as we see a drop of milk spreading and diffusing throughout a cup of tea, we should be able to see all the milk particles come back and re-form the initial drop. This is indeed possible according to Poincaré's recurrence theorem, although, because the state of the drop is very unlikely compared with all the states where the milk is spread out, the probability that this happens is tiny (but in theory, it could happen!).

On the other hand, the second law of thermodynamics says that for a closed system the entropy has to increase: the spreading of the drop of milk within the cup is a perfect example of entropy increase. Some, such as Prigogine, argue that entropy carries with it the so-called arrow of time: because entropy increases, we can tell the difference between past and future. But the question then remains: is the second law compatible with the reversible laws of classical physics? Roger Penrose, in his book The emperor's new mind, argues that the second law is not only compatible, but also, contrary to Prigogine's view, that entropy does not carry the arrow of time with it. Whichever the direction, toward the past or the future, entropy has to increase within a closed system, more precisely within a system where there is no constraint on the entropy. In the case of the drop of milk, toward the future there is no constraint and the entropy does increase; toward the past, however, there is the constraint of the initial conditions, which say that the entropy is low at the beginning: if you run the experiment backwards, the constraint is that at the end the entropy must be lower than at the beginning. Because of this constraint the second law does not apply as such, and in consequence entropy does not itself carry the arrow of time. The arrow of time exists because our system started in a state of low entropy. The remaining question is thus: why and how did we start with a state of low entropy?
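A minimal simulation (my own sketch, not from Penrose or Prigogine) shows how coarse-grained entropy rises from a low-entropy initial condition even though every microscopic step is perfectly reversible. A thousand particles start in one cell of a ring (the "drop of milk") and take unbiased random-walk steps:

```python
import random
from math import log

random.seed(0)
N, SIZE, BINS = 1000, 100, 10  # particles, ring length, coarse-graining cells

def coarse_entropy(positions):
    # Shannon entropy of the coarse-grained occupation fractions.
    counts = [0] * BINS
    for x in positions:
        counts[x * BINS // SIZE] += 1
    return -sum(c / N * log(c / N) for c in counts if c)

positions = [0] * N                   # the "drop": everything in one cell
s_start = coarse_entropy(positions)   # 0.0, maximally ordered
for _ in range(2000):                 # each ±1 step is time-reversible
    positions = [(x + random.choice((-1, 1))) % SIZE for x in positions]
s_end = coarse_entropy(positions)     # close to log(10) ~ 2.3, well mixed
```

Running the film backwards is an equally legal micro-history; what makes "forward" special here is only the low-entropy initial condition, which is exactly the point of the paragraph above.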

Source of low entropy

Both L. Boltzmann and R. Penrose describe the struggle for life as a struggle for low entropy, the ultimate source of low entropy being the sun. L. Boltzmann writes

"The general struggle for existence of animate beings is therefore not a struggle for raw materials [...] nor energy [...], but a struggle for entropy, which becomes available through the transition of energy from the hot sun to the cold earth."

and R. Penrose says
"We do not need to gain energy from our environment because energy is conserved. But we are continually fighting against the second law of thermodynamics. Entropy is not conserved; it is increasing all the time. To keep ourselves alive, we need to keep lowering the entropy that is within ourselves."

Thus, how do we get this low entropy? The ultimate source of low entropy is the sun, and plants are the organisms that use this source directly, transforming it into molecular structures that are themselves ready to be eaten. We humans, via the food web, eat plants, or animals that themselves eat plants, to get low entropy for our bodies.

In this respect, I am then wondering if we can rank the food web in terms of entropy content. The plants would receive a source of low entropy Si; some of it would be used up, so that the entropy content of the eaten plant would actually be larger: Si < Splant. This process would repeat, so that the higher in the food web, the higher the entropy content. In that respect, I would conclude that 1) humans would be among the organisms with the highest entropy (the most disorder) and 2) we should all be vegetarians in order to get low entropy efficiently in our diet. Do you agree with these conclusions?

Why the sun?

R. Penrose also explains why the sun is a source of low entropy, and I was very surprised to learn that the reason is nearly a geometrical one. The sun is a hot spot, a small disk of light compared with the entire sky. Because of this geometrical configuration, the energy we receive from the sun has a much lower entropy than the energy sent back to space by the earth, because the latter is sent out in all directions. Thus, if I understand correctly, if the earth were surrounded by so many suns that they covered the entire sky, there would not be any source of low entropy and life would be unable to exist? Of course, one still needs to explain why the sun, itself a compact star and thus a source of low entropy, exists, but the explanation goes on with cosmological arguments that I understand much less.
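A rough bookkeeping exercise (my own toy numbers, not Penrose's) gives a feel for the argument. The entropy carried by thermal radiation of energy E at temperature T scales roughly as S ~ E/T, so the same amount of energy carries far less entropy when it arrives "hot" (few energetic visible photons) than when it leaves "cold" (many low-energy infrared photons):

```python
# Assumed round numbers: effective temperature of sunlight and of the
# earth's infrared emission. S ~ E / T for thermal radiation.
T_sun = 5800.0    # K, temperature of the solar radiation field
T_earth = 255.0   # K, effective temperature of earth's thermal emission
E = 1.0           # energy in ~= energy out (the earth's budget is balanced)

entropy_in = E / T_sun      # low entropy: few high-energy photons
entropy_out = E / T_earth   # high entropy: many low-energy photons
ratio = entropy_out / entropy_in
print(round(ratio))  # ~23: earth exports far more entropy than it imports
```

That factor of roughly twenty is the "room" the biosphere has to dump its entropy into; with the whole sky at the sun's temperature, entropy_in would equal entropy_out and the ratio would collapse to 1.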

References
Ludwig Boltzmann, The second law of thermodynamics, in Theoretical physics and philosophical problems
Jean Bricmont, Science of chaos or chaos in science?
Roger Penrose, The emperor's new mind

Monday, November 5, 2007

Boltzmann on reason and passion

Reading what great scientists, historical figures, or artists have to say on contemporary problems, popular culture, or the little things in life can be either worrisome, as with the recent comment by the co-discoverer of the structure of DNA that black people are less intelligent, or charming, as with L. Boltzmann, the great scientist of the late 19th century, commenting on our everyday struggle between reason and passion.

Far from defending purely rational behavior, L. Boltzmann not only accepts the human weakness of reacting instinctively but also recognizes that at times instinct can very much brighten people's lives. L. Boltzmann writes:

"How far removed we are from pure rational grounds being the motives of all our actions! The innermost impulses to action mostly still arise from innate drives and passions, that is from instincts germinating within us without our concurrence, which do indeed become harmful and reprehensible if dominating the intellect, but nevertheless are necessary to lend our actions liveliness and our character its peculiar colouring. The machinery of the world maintains itself, as Schiller says, «today, as ever, by hunger and love, and the time is as yet far off when philosophy will hold the universal circuit together»."
Ludwig Boltzmann, On the principles of mechanics, in Theoretical physics and philosophical problems

Such an observation has been made over and over throughout history. For instance, during the Thirty Years' War, the Swedish chancellor expressed it as follows:
"Nescis, mi fili, quantilla ratione mundus regatur"
"you don't know, my dear boy, with what little reason the world is governed"
cited by Ludwig von Bertalanffy in General system theory.