Monday, October 8, 2007

A definition of complexity

In his book The Quark and the Jaguar, Murray Gell-Mann gives a definition of complexity based on information theory. In this view, a measure of complexity can be associated with the length (in bits, for instance) of the description of the regularities in a message.

If the message is perfectly regular, that is, composed only of 1's, then the description of its regularities is short, "only 1's", and the complexity is small. On the other hand, if the message is perfectly random, with no regularity whatsoever, then the complexity is also low, since there are no regularities to describe. Only when a message combines many different regularities is the complexity large.
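A toy sketch can make the distinction concrete. Note that effective complexity itself is not directly computable; the snippet below (my own illustration, not from the book) uses zlib-compressed length only as a crude proxy for total information content, to show why that alone is not a measure of complexity in Gell-Mann's sense:

```python
import random
import zlib

def compressed_len(data: bytes) -> int:
    """Crude proxy for total algorithmic information content."""
    return len(zlib.compress(data, 9))

regular = b"1" * 1000  # perfectly regular message
rng = random.Random(42)  # seeded for reproducibility
noise = bytes(rng.getrandbits(8) for _ in range(1000))  # pseudo-random message

# The regular string compresses to a few bytes: its regularities
# ("only 1's") have a very short description, so its complexity is low.
# The noise barely compresses at all: it carries a lot of information,
# yet in Gell-Mann's sense its complexity is ALSO low, because it
# contains no regularities to describe. Compressed length alone cannot
# separate the two cases; only the regularity part of the description counts.
print(compressed_len(regular), compressed_len(noise))
```

Both extremes thus end up with low complexity, even though they sit at opposite ends of the compressibility scale; a message with rich but describable structure would fall in between.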

Although this definition might not satisfy everyone, it is at least the first attempt to define complexity quantitatively that I have read.
