Signal vs. Noise

The computer's hidden history

16 Jun 2004 by Ryan Singer

It’s common knowledge that the origin of computers was in mathematics, but what’s less commonly known is that the computer was invented to shed light on a philosophical crisis in the foundations of pure mathematics. Here’s a summary of what happened, with Wikipedia links aplenty for your perusal.

In the late 19th century, Georg Cantor freaked everyone out with his (theologically inspired) work on infinite sets. Set theory kicked mathematics further up the ladder of abstraction and put its very foundations into question.

In his 2000 talk at Carnegie Mellon on this subject, G. J. Chaitin said:

[Some people said] Cantor wrecked and ruined mathematics by taking it from being concrete and making it wishy-washy, for example, from hard analysis to abstract analysis. Other people loved this. It was very controversial.

Set theory turned things upside-down, but it also had problems. Bertrand Russell found a paradox, which put the already shaking foundation into a crisis. What if everything’s wrong?!

So David Hilbert came along and suggested that all of mathematics should be expressed in a code with rules — a formalism — to avoid the messy pronouns and ambiguities of normal language. Proofs should be expressed in terms of a formal system, and then one can just go down the line applying rules to see if the proof comes out true or not, like a formula. This was the zenith of the idea that mathematics is black-or-white, and that its truths are absolute.

For thirty years everything seemed to be just fine, but then Kurt Gödel had to rain on the formalism parade by doing one of the most incredible things in the history of ideas. He took Hilbert’s formal system and showed that if the system were to make statements about itself, it would break down. It’s like saying:

This statement is false.

Is it true or false? Hmph. Gödel, with his vastly clever Gödel numbering, was able to trick Hilbert’s formal system into saying:

This statement is unprovable.

Chaitin said:

Well, if it’s provable, and it says it’s unprovable, we’re proving something that’s false. So that’s not very nice. And if it’s unprovable and it says it’s unprovable, well then, what it states is true, it’s unprovable, and we have a hole. Instead of proving something false we have incompleteness, we have a true statement that our formalization has not succeeded in capturing.
So the idea is that either we’re proving false statements, which is terrifying, or we get something which is not as bad, but is still awful, which is that our formal axiomatic system is incomplete — there’s something that’s true but we can’t prove it within our system. And therefore the goal of formalizing once and for all all of mathematics ends up on the floor!
… You read essays by Hermann Weyl or John von Neumann saying things like this: I became a mathematician because this was my religion, I believed in absolute truth, here was beauty, the real world was awful, but I took refuge in number theory. And all of a sudden Gödel comes and ruins everything, and I want to kill myself!

Ack! This stuff is known as Gödel’s Incompleteness Theorem, and the point is that mathematics isn’t black-or-white. There’s some stuff about which math just can’t make up its mind.
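Gödel’s trick turns on a statement that manages to quote its own encoding. In programming terms, the same self-reference shows up in a quine — a program that prints its own source. A minimal Python sketch (illustrative, not something from the post):

```python
# A quine: a program whose output is exactly its own source code.
# The string s is a template of the whole program; s % s fills the
# template's %r slot with a quoted copy of s itself, so printing
# the result reproduces the source, quotes and all.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Gödel numbering plays the role of `%r` here: it gives a formula a way to embed a complete, quoted description of itself.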

This was devastating in its way, but there was no need to throw the baby out with the bathwater. Hilbert’s idea that proofs could be mechanized into a formal language was still waiting to be tested, and Alan Turing was the man to do it.

Chaitin again:

Turing has to invent the computer, because Hilbert says that there should be a mechanical procedure to decide if a proof is correct or not. Turing says what Hilbert really means is that there should be a computer program for checking proofs. But first Turing has to say what a computer is, it’s a Turing machine, and all of this is in a paper of Turing’s in 1936, when there were no computers.

So Turing actually invented the computer, on paper, to show what was left of Hilbert’s formalization. The computer showed that Hilbert’s system, though incomplete, was still mechanical: checking a proof is a rote procedure, and a machine that enumerates proofs one by one will eventually find any proof that exists. But Turing also answered Hilbert’s Entscheidungsproblem in the negative: no procedure can decide in advance whether a proof exists at all.
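Turing’s negative answer rests on a diagonal trick much like Gödel’s: assume a predictor could tell whether any program halts, then build a program that asks the predictor about itself and does the opposite. A sketch in Python (the names `halts`, `make_trouble`, and `trouble` are illustrative, not from the post):

```python
# Suppose a perfect halting predictor "halts" existed. Build a
# program that consults the predictor about itself and defies it.
def make_trouble(halts):
    def trouble():
        if halts(trouble):      # predictor says "trouble halts"...
            while True:         # ...so loop forever instead
                pass
        # predictor says "trouble loops forever" -- so halt immediately
    return trouble

# Any concrete predictor is wrong about the program built against it.
always_no = lambda program: False   # claims nothing ever halts
t = make_trouble(always_no)
t()  # returns immediately (halts), contradicting the predictor
```

Whatever `halts` predicts about `trouble`, `trouble` does the opposite, so no such predictor can exist.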

For perspective, when Turing wrote a program to play chess 16 years later in 1952, he still had to simulate the computer. It took him half an hour to calculate each move while his colleague played (and beat) his “program”.

A cool story in itself, the computer’s history in pure mathematics is also a lesson to all of us to remember that the most abstract and seemingly impractical ideas can change everyone’s lives.

Last week, the 50th anniversary of Turing’s death was recognized. We owe him one.

15 comments so far

16 Jun 2004 | JF said...

Yes, Ryan Singer also designs web sites.

16 Jun 2004 | George Vuckovic said...

Computers think in 0,1,2. No, Yes, Maybe. Really.

It's just when they print out binary stuff, they make the font color match the background so that we humans don't know it. Ask any professional printing house. Really.

16 Jun 2004 | waylman said...

So what do you think? Is the statement: "This statement is false." true or false? Go ahead, think about that for a few minutes. It's enough to make anyone's head hurt.

Personally, I have always had trouble seeing how this true/false logic stuff is mathematics. Sure, I can see the relation between it and math, but how is it math? Of course, without logic, computer code would be nearly impossible. The two are certainly intertwined, but shouldn't mathematics be mathematics and logic be logic? I always figured they taught it in math class because they couldn't figure out where else to include it. Then again, maybe I'm missing something.

16 Jun 2004 | Hagbard Celine said...

I always found this myth somewhat comforting - the old Apple logo with rainbow stripes and the missing bite fit Turing's story so well.

Those interested might want to read up on "Maybe Logic" and E-Prime.

Happy Bloomsday!

16 Jun 2004 | mr. mom said...

And I thought "where do babies come from?" was a tough question to answer.

16 Jun 2004 | pb said...

This is the fourth link to Wikipedia I've seen today. Looks like the Wikipedia is moving towards an inflection point. If you haven't checked it out, it's a terrific idea well-implemented. Another cool wiki is world66.com for travel.

17 Jun 2004 | Don Schenck said...

Interesting. My son's college work (*ahem* a Cognitive Science "Fellow" at Penn State, thankyewverymush) involves computer science and philosophy.

Until six months ago, I never thought about computers and philosophy in the same headspace. Now, they're linked.

Cool. And when computers think, they "are".

17 Jun 2004 | Dave said...

A great book to read on the topic of Gödel (not exclusively) is Gödel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter. It's a little much sometimes to wrap your head around, but it is eminently inspirational. It prompted me to make my personal e-mail signature say, "This sentence is false." I get many questions about it and I just tell people, "just think about what it means for a second." Most people then get uncomfortable, grimace and then move on to something else :)

17 Jun 2004 | One of several Steves said...

And when computers think, they "are".

Only if you believe Descartes, Don. If you're post-rationalist and existentialist, all bets are off. :)

17 Jun 2004 | RS said...

A great book to read on the topic of Gödel (not exclusively) is Gödel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter.

Yes yes. If you liked GEB, I highly recommend Metamagical Themas.

For a technical look at some projects that the ideas from GEB have grown into, check out Fluid Concepts and Creative Analogies, by Hofstadter and his research group. Daniel Dennett wrote a nice review of it.

17 Jun 2004 | KillAllDash9 said...

So what do you think? Is the statement: "This statement is false." true or false? Go ahead, think about that for a few minutes. It's enough to make anyone's head hurt.

I suppose it's not so difficult if you think about it like a programmer. Of course, it really depends on what 'is' is, as they say. If you think of 'is' as '=', then the statement can be thought of as

'This statement' = false

If 'This statement' refers to the statement 'This statement is false.', and 'This statement' = false, then the statement is true, because (false == false) evaluates to true, logically.

17 Jun 2004 | KillAllDash9 said...

A bit more on my thoughts above:

Think of it like a recursive acronym. If false is 0 and true is 1, then you might have a progression as such:

'This statement is false'

'This statement' = 0

'This statement is false' = 0

'This statement' = 0 = 0

{ad infinitum}

You basically end up with a never-ending string of 0 = 0 = 0 ...

The beauty of it is that -- no matter how many '= 0's you add to the line -- the entire statement will always evaluate to true.

18 Jun 2004 | RS said...

KillAllDash9: Not so. You need to step out a level.

If "This statement is false" is false, then "This statement is false" is true, but if "This statement is false" is true, then "This statement is false" is false, and so on. There is no solution.
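RS's back-and-forth can be checked mechanically: try each possible truth value and see whether it is consistent with what the statement asserts about itself. Neither is. A tiny Python check (illustrative):

```python
# "This statement is false" asserts its own negation, so a consistent
# truth value v would have to satisfy v == (not v).
consistent = [v for v in (True, False) if v == (not v)]
print(consistent)  # -> []  (no assignment works: the paradox has no solution)
```

Gödel's sentence escapes this dead end because it asserts unprovability rather than falsity — "true but unprovable" is a consistent way out, and that gap is exactly the incompleteness.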

