Complexity, Randomness and Impossible Tasks

By ABC News
November 1, 2004, 9:56 AM

Nov. 7, 2004 -- Some things are simple, some are complicated. What makes them so? In fields ranging from biology to physics to computer science, we're often faced with the question of how to measure complexity.

There are many possible measures, but one of the most important is algorithmic complexity. First described by two mathematicians, the Russian Andrei Kolmogorov and the American Gregory Chaitin, it has been extensively developed by Chaitin in recent years.

The flavor of the subject can perhaps be sampled by considering this question: Why is it that the first sequence of 0's and 1's below is termed orderly or patterned and the second sequence random or patternless? (Note that since almost everything from DNA to symphonies to this very column can be encoded into 0's and 1's, this is not as specialized a question as it may at first appear.)

(A) 0010010010010010010010010010010010010010010

(B) 1000101101101100010101100101111010010111010
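One quick way to feel the difference (a small illustration that goes beyond the column itself) is to notice that sequence A can be rebuilt from a one-line rule, while no comparably short rule is apparent for sequence B:

```python
# A hypothetical sketch: sequence A is just the pattern "001" repeated,
# so a tiny rule reproduces it exactly; B seems to admit no such shortcut.
A = "0010010010010010010010010010010010010010010"
B = "1000101101101100010101100101111010010111010"

# Rebuild A by repeating "001" and truncating to A's length.
print(A == ("001" * (len(A) // 3 + 1))[:len(A)])   # True

# For B, the shortest description we can offer here is B itself, bit by bit.
print(B == ("001" * (len(B) // 3 + 1))[:len(B)])   # False
```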

Answering this question leads not only to the definition of algorithmic complexity, but also to a better understanding of (a type of) randomness as well as a proof of the famous incompleteness theorem first proved by the Austrian mathematician Kurt Gödel.

Hang on. The ride's going to be bumpy, but the view will be bracing.

With sequences like those above in mind, Chaitin defined the complexity of a sequence of 0's and 1's to be the length of the shortest computer program that will generate the sequence.
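In the standard notation of the field (a formalization the column itself leaves implicit), this complexity is defined relative to a fixed universal computer U:

```latex
% Kolmogorov-Chaitin complexity of a string s: the length of the
% shortest program p that makes the universal machine U print s.
K_U(s) \;=\; \min \{\, |p| \;:\; U(p) = s \,\}
```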

Let's assume that both sequences continue on and have lengths of 1 billion bits (0's and 1's). A program that generates sequence A will be essentially the following succinct recipe: print two 0's, then a 1, and repeat this roughly 333 million times. If we write the program itself in the language of 0's and 1's, it will be quite short compared to the length of the sequence it generates. Thus sequence A, despite its billion-bit length, has a complexity of, let's say, only 10,000 bits.
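To make that recipe concrete, here is a sketch in Python (mine, not the column's) of what such a program might look like; its text stays a few hundred characters long no matter how many bits it prints:

```python
# A hypothetical version of the "succinct recipe" for sequence A:
# repeat the pattern "001" until a billion bits have been produced.
def sequence_a(n_bits=1_000_000_000):
    reps = n_bits // 3 + 1                 # enough repetitions of "001"
    return ("001" * reps)[:n_bits]         # truncate to exactly n_bits

# By contrast, the shortest program for a patternless billion-bit sequence
# is essentially  print("1000101101...")  with all billion bits spelled out.
```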