
From: John Conover <john@email.johncon.com>
Subject: forwarded message from Melanie Mitchell
Date: Mon, 15 Jul 1996 12:11:09 -0700


BTW, many "industrial grade thinkers" feel that information-theoretic
concepts will provide the next large advances in science-among them,
Rudy Rucker and John Casti.

At the beginning of this century, it was thought that axiomatic
systems, (what most folks call science-another way of saying that we
can figure out the set of rules by which the universe, or any other
system, operates,) had no limitations other than getting enough
folks, with enough time and budgets, to figure out the rules-at least
in principle.  Then, in 1931, the young logician Kurt Godel proved
that, in principle, you can't-i.e., that the nature of logic itself is
such that, when applied to a sufficiently complicated system, the set
of rules must forever be incomplete, or inconsistent, or both. The
problem of inconsistencies and incompleteness had been a "skeleton in
the closet" for mathematicians since the time of Euclid, (and, yes,
despite what your high school teacher told you, the axioms of geometry
are not beyond question-the 5th, the parallel postulate, which is
equivalent to the statement about the sum of the angles of a triangle,
can not be derived from the others-and Euclid knew it, but didn't know
what to do about it.)

How "complicated" does a system have to be before we have to be
concerned with these incomplete/inconsistent issues. Well, that is a
good question, and as an example, (falsity by proof of exception, to
be exact,) Godel's paper addressed the arithmetic. It turns out that
addition can be proven to be complete and consistent, but subtraction
can not. (It can be proven, but not within the context of the
arithmetic-we have to drag in the concept of infinitesimal, to prove
subtraction always works, and that means dragging in the concept of
the limit, which has more incompleteness than the arithmetic-so in
some sense, we are loosing an information-theoretic battle. Ie., to
prove the concept of the limit to be consistent, we have to drag in
the theory of topology-and we have more incompleteness in the theory
of topology, than both the arithmetic and limit, combined.)
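
The regress has a standard symbolic form, (Godel's second
incompleteness theorem-my notation, not part of the original note): if
T is a consistent, recursively axiomatized theory containing basic
arithmetic, then

    \[ T \nvdash \mathrm{Con}(T), \]

and the consistency of T can only be proved in some strictly stronger
theory T',

    \[ T' \supsetneq T, \qquad T' \vdash \mathrm{Con}(T), \qquad
       T' \nvdash \mathrm{Con}(T'), \]

which itself needs a still stronger theory, without end.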

What had happened was that Godel had kicked over a "can of worms" that
placed limitations on what science itself can do-blasphemy at the turn
of this century. (An interesting interpretation of this was made by
Casti when he said that mathematics is just another religion-it is
unique among religions in that it can prove itself a religion.) As
it turns out, that was only the "tip of the iceberg." In the 1960's,
Gregory Chaitin showed that there were not merely isolated instances
of incompleteness, but that almost all theories about anything must be
incomplete-and, in founding algorithmic information theory, proved
that we can never get enough information about anything to make a
complete theory. (This is why physicists, for example in quantum
mechanics-the "queen" of physical theories-take a very general,
abstract, and deliberately imprecise view of the rule set that defines
how nature operates, and permit not-so-well-defined models of the
universe.) In point of fact, the concepts of entropy and uncertainty
have been shown to be information-theoretic concepts. (Entropy is a
concept, from Boltzmann, that no one liked very much for the simple
reason that it measures how much the universe has run down and is not
a physical entity-yet it is useful enough that mechanical engineers
use it to design automobile engines. The uncertainty principle places
a limit on what we can find out from the universe-i.e., we can never
know both an electron's momentum and position precisely-allowing
chance, or randomness, in the universe. The electron may, or may not,
work in a deterministic fashion, but we will never be able to find out
which.)
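
As a concrete illustration, (mine, not from the original note,)
Shannon's entropy measures the average information, in bits, needed to
specify the outcome of a random source-the same functional form, up to
a constant, as Boltzmann's thermodynamic entropy:

    import math

    def shannon_entropy(probs):
        """Entropy, in bits, of a discrete distribution: -sum(p*log2(p))."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin carries a full bit of information per toss; a biased
    # coin is more predictable, so each toss tells us less.
    print(shannon_entropy([0.5, 0.5]))  # 1.0
    print(shannon_entropy([0.9, 0.1]))  # about 0.47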

As Rudy Rucker (logician at San Jose State, and author of some really
fine science fiction,) has stated, "the laws of logic are pitifully
few." (Most logicians, BTW, feel, epistemologically, that the rule
that an axiom can not be both true and false at the same time is too
"strong" a requirement in logic-but know one knows how to relax it-and
still maintain the usefulness of logic.)

Since the discovery of the limitations of science,
information-theoretic concepts have provided a means by which to
analyze and optimize systems that contain a complex random
variability, like stock markets, the weather, etc., and they are now
firmly entrenched in the field of economics. What the attached
lecture is about is the application of "entropic principles," (which
is what information theory is all about,) to the organization of
social structures due to the perception of the individuals in the
society, (note the theme-a macroscopic model of the social realm can
be formulated from the microscopic behavior of its constituent
parts-the "byline" of complexity theory.) Note what was just stated-a
*_causality_* can be shown to exist between the macroscopic and
microscopic behaviors of a complex social system, i.e., it uses
statistical mechanics to provide a context for scientific induction.
(And, although a lot of statistics gets done, very little of it is
used to provide a context for induction.)
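
For a taste of what analyzing a system with "complex random
variability" looks like, here is a minimal sketch, (my own, assuming a
simple coin-flip model of daily price moves-not the lecture's actual
method,) that estimates the entropy of a market's up/down moves; an
estimate near 1 bit per day means the moves are essentially
unpredictable:

    import math, random

    def entropy_of_moves(moves):
        """Estimated entropy, in bits/step, of a binary up/down series."""
        p_up = sum(moves) / len(moves)
        return -sum(p * math.log2(p) for p in (p_up, 1.0 - p_up) if p > 0)

    # A simulated "efficient" market: each day's move is a fair coin flip.
    random.seed(1996)
    moves = [random.random() < 0.5 for _ in range(10000)]
    print(entropy_of_moves(moves))  # very close to 1.0 bit/day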

Let me leave you with an example of a simple set of axioms with a
Godelian contradiction, (actually from Roger Penrose, the
mathematician who worked with Stephen Hawking at Cambridge.)
Consider a library of books.  The librarian notices that some books
contain their title, and some don't. It would be of benefit that two
books be added to the library; one that contains a list of the books
that contain their titles; and another book that contains a list of
the books that do not contain their titles.  Now, clearly, a book
either contains its title, or it doesn't, in which case its title
would be added to the list in the first book, or the second book,
respectively.  To make a long story short, in which book do we list
the title of the second book that was added to the library? (If you
add it to the first, then it doesn't contain its title, in which case
it can not be added to the first-if you add it to the second, then it
does contain its title, and can not be added to the second ...)
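
The oscillation can be made mechanical, (a toy sketch in Python-my
construction, not Penrose's,) by letting the second catalog recompute
its own membership and watching the answer never settle:

    # The second catalog is supposed to list exactly the books that do
    # not contain their own titles. Does it list itself? Re-applying
    # the rule flips the answer on every pass, forever.
    def contains_own_title(book, catalog2_lists_itself):
        if book == "catalog2":
            # catalog2 contains its title iff it appears on its own list.
            return catalog2_lists_itself
        return book == "catalog1"  # catalog1 lists itself; plain books don't

    lists_itself = False
    for step in range(6):
        # Rule: catalog2 lists a book iff the book does NOT contain its title.
        lists_itself = not contains_own_title("catalog2", lists_itself)
        print(step, lists_itself)  # True, False, True, False, ...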

Note that the issue is the way that information is carried between the
microscopic and macroscopic "concepts" of the library. All of the
axioms listed above are consistent and complete when taken singly,
but, when taken together, are inconsistent, or incomplete, (or both.)

Note the impact of information-theoretic concepts and inconsistency on
such things as social systems, legal systems, accounting
methodologies, etc., i.e., if we can not even catalog the books in a
library, how can we set up a legal system, or accounting system,
etc. ...

(BTW, the original founding documents did so-none other than Godel
himself scrutinized the US Constitution for contradictions when
preparing for his citizenship hearing-and it appears that the
architects of the documents were well aware of Godelian issues when
they were written. It seems that one B. Franklin was the "logician":
in an early draft of the Declaration of Independence, he had
T. Jefferson's wording changed to "we hold these truths to be self
evident," which made the document logically complete and consistent.
Since then, beginning with Article V, which grants the authority to
amend the Constitution, inconsistency has been the rule, and not the
exception-i.e., does the authority to change the Constitution include
itself? If it does, it doesn't. And if it doesn't, it does. And what
does information theory say about such things? Well, it can not be
done in a bottom-up construct, i.e., there can never be enough
"microscopic" laws made such that the "macroscopic" law of the land
will be complete and consistent. It must be a top-down construct-like
the one the original authors used. However, think of it as a "works
project" for Congress, as one cynical information theorist said.)

--

John Conover, john@email.johncon.com, http://www.johncon.com/

