forwarded message from

From: John Conover <>
Subject: forwarded message from
Date: Thu, 14 Nov 1996 01:29:23 -0800

Interesting. Re: my discussion of things Godelian earlier. Today is
the anniversary of the death of Leibnitz. As you know, along with
Newton, Leibnitz is credited with developing the calculus. Leibnitz
was a bit more philosophical than Newton (who, for his part, practiced
science during the day and alchemy in the evenings, retiring to his
laboratory and rubbing stones together in the pursuit of making gold.)

One of Leibnitz's prevailing convictions was that the methods of
deductive logic could be brought to bear on human affairs, intervening
in irrational disputes, so that truth would inevitably be decidable in
a rational, logical manner.

The concept did prevail well into this century, and was represented by
none other than the most famous mathematician of the 20th century,
David Hilbert. Hilbert had decided, at the turn of the century, that
there were 23 essential "problems" that had to be resolved in logic
and mathematics to make Leibnitz's concept a reality. He presented ten
of them at the 1900 International Congress of Mathematicians in
Paris. For the next three decades, the likes of Bertrand Russell, John
von Neumann, etc., worked diligently to solve the problems and make
logic complete and consistent.

In 1931, Kurt Godel proved them wrong.


BTW, it is now generally accepted that whether a given statement
follows from a set of axioms (ie., a theory) cannot be decided in all
cases. In point of fact, it can be shown that in most cases it
can't. Undecidability is in the nature of logic itself. (Whether there
are questions that exist that are undecidably undecidable is, at
present, a conjecture.)

Kurt Godel was a pretty sharp cookie, BTW. His proof is very clever,
and, in outline, very simple. He represented the symbols of a formal
system by numbers, and concatenated them into strings, so that
statements about the system could be encoded as statements within the
system. Any logic rich enough to express arithmetic is therefore
recursive, ie., self-referential, and Godel constructed a statement
that, in effect, asserts its own unprovability; so such a system must
be either incomplete or inconsistent, or both, forever. Very tricky
thinking, that is still the cornerstone of the application of
information theory to such things. So, as Leibnitz, et al., believed,
if it were possible to eliminate self-reference (actually, the issue
was known to the ancient Greeks) then all inconsistency could be
removed from a logic system, like mathematics. The trouble is that if
you do that, mathematics, ie., the language of logic, degenerates into
something that, although it is complete and consistent, is not very
useful. For example, you would have to throw out multiplication from
the arithmetic (addition alone, as in Presburger arithmetic, remains
decidable), which would be kind of a hindrance in such a simple
process as reckoning things.
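Godel's encoding trick, turning strings of symbols into single
numbers so that logic can talk about itself, can be sketched in a few
lines. The mini-alphabet and the prime-power scheme below are toy
stand-ins of my own, not Godel's actual construction, but the idea is
the same: each symbol gets a code, and the i-th symbol of a formula
becomes the exponent of the i-th prime.

```python
SYMBOLS = ["0", "S", "=", "+", "(", ")", "x"]  # hypothetical mini-alphabet

def primes(n):
    """First n prime numbers, by trial division."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_number(formula):
    """Encode a list of symbols as a product of prime powers:
    the i-th symbol contributes p_i ** (its code)."""
    codes = {s: i + 1 for i, s in enumerate(SYMBOLS)}
    n = 1
    for p, s in zip(primes(len(formula)), formula):
        n *= p ** codes[s]
    return n

def decode(n):
    """Recover the symbol list by factoring out each prime in turn."""
    codes = {i + 1: s for i, s in enumerate(SYMBOLS)}
    out = []
    for p in primes(64):  # 64 symbols is plenty for a toy
        if n == 1:
            break
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        out.append(codes[e])
    return out

formula = ["S", "0", "=", "S", "0"]   # "S0 = S0", ie., 1 = 1
g = godel_number(formula)             # 2**2 * 3**1 * 5**3 * 7**2 * 11**1
assert decode(g) == formula
```

Because factoring is unique, every formula maps to exactly one number
and back, which is what lets arithmetic statements mention other
arithmetic statements, including themselves.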

Then, in the agenda of mathematics in the 20th century, Hilbert, et
al., backpedaled on their position, circa 1930, and assumed that,
although some things in logic were simply not decidable, many were,
and that techniques like Godel's could be used to prove which ones
were. In 1936, Alan Turing proved them wrong again, with a very
clever concept that attacked the logic of arithmetic itself. His
concept was to devise a theoretical machine that could carry out any
mechanical computation, if any machine could, and in the process he
invented the computer, on which you are probably reading this. He then
showed that no such machine could compute which things were decidable,
and which were not. He also defined the difference between calculating
and computing. (Technically, a spreadsheet is not computing; it is
calculating. And an automated calculator is not a computer. As an
interesting sidebar, the modern computer is the only machine known
that was designed as a theoretical abstraction, and then
implemented. All others were implemented first, and the theoretical
abstractions worked out afterwards.)
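Turing's theoretical machine is just a finite table of rules driving a
read/write head back and forth over an unbounded tape. The little
simulator below, and its add-one-to-a-binary-number machine, are my
own toy example rather than anything from Turing's paper, but they
show the whole mechanism:

```python
def run(rules, tape, state="start", steps=1000):
    """Run until the machine enters 'halt' or the step budget runs out.
    `tape` is a dict position -> symbol; unwritten cells read as '_'."""
    pos = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return tape, state

# Toy machine: increment a binary number stored least-significant
# bit first, head starting on the low bit at position 0.
rules = {
    ("start", "1"): ("0", "R", "start"),  # carry ripples upward
    ("start", "0"): ("1", "R", "halt"),   # absorb the carry
    ("start", "_"): ("1", "R", "halt"),   # grow a new high bit
}

def value(tape):
    """Read the tape as a binary number, low bit at position 0."""
    return sum(1 << pos for pos, bit in tape.items() if bit == "1")

tape, state = run(rules, {0: "1", 1: "1"})  # binary 3
assert state == "halt" and value(tape) == 4
```

Turing's point was that this humble scheme already captures everything
any mechanical procedure can do, and that no rule table exists which,
fed a description of another rule table, always decides whether that
machine halts.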

The ages have not been kind to the folks involved in the drama of the
logical consistency of mathematics. Newton never made gold. Leibnitz,
after falling on hard financial times, died a pauper in Hanover; the
only person attending the funeral was his personal secretary. Godel
went mad, and spent his last days convinced that people were trying to
poison his food, and Turing committed suicide after being prosecuted
for homosexuality. Hilbert went to his grave never believing that
Godel was correct, but was never able to prove otherwise. His career
was not productive after the publication of Godel's proof. Russell and
Whitehead's "Principia Mathematica," the anticipated foundation of a
complete and consistent mathematics and logic, was never finished; its
planned final volume was abandoned. Von Neumann went into economics,
then into the National Laboratory system doing research in
computation, expanding on Turing's concepts. He died relatively young,
in his early 50's.

The 20th century has been kind of an anti-science era, as the drama of
the limitations of deductive logic played itself out.

------- start of forwarded message (RFC 934 encapsulation) -------
Received: (from root@localhost) by (8.6.12/8.6.12) id AAA01462 for john; Thu, 14 Nov 1996 00:05:17 -0800
Message-Id: <>
From: root <>
Subject: Reminders for Thursday, November 14, 1996
Date: Thu, 14 Nov 1996 00:05:17 -0800

Reminders for Thursday, 14th November, 1996 (today):

              Sunrise 06:46, Sunset 16:58, Moon 0.22 (Increasing)

________________________ On This Day, Nov 14 ... ________________________

Gottfried Wilhelm Leibnitz, German mathematician and philosopher, died (1716)
------- end -------

John Conover,,

Copyright © 1996 John Conover, All Rights Reserved.
Last modified: Fri Mar 26 18:55:33 PST 1999