forwarded message from William R. Goodin

From: John Conover <john@email.johncon.com>
Subject: forwarded message from William R. Goodin
Date: Fri, 8 Nov 1996 22:30:14 -0800


If you are interested in adaptive computational systems, then the
attached may be of significance to you. (FYI, programmed trading,
etc., are adaptive control systems.) BTW, Melanie Mitchell is a bit
more than a Research Professor; she directs the Adaptive
Computation program at SFI. The attached came from one of the economics
conferences.

        John

FWIW, after the success of the programmed traders, there is a lot of
research going on in adaptive systems. The reason is quite
subtle. Consider, if you will, a decision on whether to
invest in a stock. Your decision will be based on whether you expect
the stock's price to increase, (duh!) But to evaluate that
expectation, you will be basing your decision on what others in the
market are going to do. The trouble is that they, too, are basing
their decisions on what the others in the market are going to do, and
that means that you are basing your decision on what they are going to
do, including you. Bottom line, it is a self referential system. The
logician Kurt Gödel showed, in 1931, that in self
referential systems there can be no "theory," or hypothesis, of how
the system works that is not incomplete, or inconsistent, or both.
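The self referential loop above can be sketched with a toy "minority game" simulation, (this is illustrative only, not anything from the original post; the agent count, memory length, and scoring rule are arbitrary assumptions): every agent bets on what the crowd will do, the minority side wins, so any strategy that becomes popular stops working.

```python
import random

def minority_game(n_agents=101, rounds=200, memory=3, seed=0):
    """Toy minority game: each agent picks 0 or 1; agents on the
    minority side win, so any strategy that spreads through the
    population defeats itself -- a crude picture of self-reference."""
    rng = random.Random(seed)
    # each agent owns one fixed lookup-table strategy over recent outcomes
    strategies = [{h: rng.randrange(2) for h in range(2 ** memory)}
                  for _ in range(n_agents)]
    history = 0  # last `memory` outcomes packed into an int
    wins = [0] * n_agents
    for _ in range(rounds):
        choices = [s[history] for s in strategies]
        minority = 0 if sum(choices) > n_agents / 2 else 1
        for i, c in enumerate(choices):
            if c == minority:
                wins[i] += 1
        history = ((history << 1) | minority) % (2 ** memory)
    return wins

wins = minority_game()
print(max(wins), min(wins))
```

No agent can be on the winning side much more than half the time in the long run, because its own success changes what the "market" does.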

In the 1980's, the field of economics went through trials and
tribulations, (it was about time; the fields of physics and engineering
went through it at the turn of the century, when relativistic
principles were folded into quantum mechanics, a la Einstein and
Bohr, for the same reasons of the self referential stuff,) and most of
the deterministic, reductionist theories of economics have been either
revamped or scrapped. One of the significant contributions of Brian
Arthur was to show, (with two Soviet scientists-I can't remember their
names,) that in such scenarios, inductive thinking, (as opposed to
deductive theoretical modeling a la classical physics,) is the "logic"
system of choice. That being the case, a lot of the methodologies of
quantum mechanics can be dragged into economics. In the late 80's,
several economists, (ring leader B. Arthur,) showed that although
there can never be a complete or consistent theory of economic
systems, these types of systems do behave in a "prescribed"
manner. And the prescription is fractal, (references are numerous-see
any modern economics text-if you think about it, it is
elementary. Subtle, but elementary. A system that does not operate by
any deducible theory must be fractal, or a degenerate case thereof, by
definition.) One of the problems is that fractal dynamics are hard to
handle, since any time you figure out what the system is doing, it changes,
(if that were not the case, then it would not be self referential.)
So, what do you do? You make a system that adapts. That's what.
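One way to "make a system that adapts" is the kind of evolutionary computation described in the forwarded announcement below. Here is a minimal genetic algorithm sketch on the toy OneMax problem, (the problem, population size, and operator settings are my own illustrative assumptions, not from the post or the course):

```python
import random

def onemax_ga(n_bits=20, pop_size=30, generations=40, seed=1):
    """Minimal genetic algorithm on OneMax (fitness = number of 1 bits):
    tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(2) for _ in range(n_bits)] for _ in range(pop_size)]
    fitness = lambda ind: sum(ind)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # pick each parent as the best of a random tournament of 3
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(n_bits):             # bit-flip mutation
                if rng.random() < 1.0 / n_bits:
                    child[i] ^= 1
            new_pop.append(child)
        pop = new_pop
    return max(fitness(ind) for ind in pop)

best = onemax_ga()
print(best)  # approaches n_bits after a few dozen generations
```

Nothing in the loop "knows" the answer; the population adapts toward it, which is the appeal of these methods for problems with no deducible theory.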

And, in case you are curious, these systems include such market
phenomena as cognition, perception, inconsistency, nervousness,
etc., in economic situations. Anytime inconsistency is
involved as a function of time, and the market is the sum total,
(ie., the integral, or cumulative sum, of all these cognitions,
perceptions, inconsistencies, etc.,) the result is a fractal, since by
definition, a fractal is an integral of a random process. Showing that
economic things lock into a fractal process, (ie., a fractal is one,
of possibly many, stable scenarios for the process,) is difficult, but
can be done-as it was in the fundamental theory of programmed
trading. BTW, if you think about it, it has to be that way. In any
speculative investment, as long as the general belief is that one can
do better by altering an investment strategy, (like dumping one stock,
to try your luck on another,) the situation will be a stable
fractal. Kind of a self fulfilling prophecy. If you believe it, it
will be that way. And so, it would be a stable solution. (By contrast,
a market that oscillated would not be, since the peaks and valleys would
be arbitraged away.) The key is that the market must be self
referential, and so never fully understandable. In addition, the market
must be subject to price arbitrage over time by inductive
speculation, which changes over time. Additionally, there must be many
agents operating in the market. And that makes the market dynamics
fractal, (ie., a function that is everywhere continuous, and
nowhere differentiable-look at the graph of any stock's price
for a really good example.)
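The "integral of a random process" idea takes only a few lines to see, (a sketch with arbitrary parameters, not the programmed-trading machinery itself): the cumulative sum of coin-flip increments traces exactly the kind of jagged, stock-chart-like path described above.

```python
import random

def random_walk(n=1000, seed=42):
    """Cumulative sum (a discrete 'integral') of i.i.d. +/-1 steps.
    The running total is jagged at every scale, like a stock chart:
    in the continuum limit it is continuous but nowhere smooth."""
    rng = random.Random(seed)
    walk, total = [], 0.0
    for _ in range(n):
        total += rng.choice([-1.0, 1.0])  # the random process being summed
        walk.append(total)                # the 'integral' to date
    return walk

w = random_walk()
print(len(w), w[-1])
```

Zooming in on any stretch of the path looks statistically like the whole, which is the self-similarity the fractal language is pointing at.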

Now you know why adaptive computation is of interest.

------- start of forwarded message (RFC 934 encapsulation) -------
Path: netcom.com!www.nntp.primenet.com!nntp.primenet.com!news.mathworks.com!newsgate.duke.edu!walras.econ.duke.edu!dlj
Newsgroups: sci.econ.research
Organization: UCLA Extension
Lines: 54
Approved: -dlj.
Message-ID: <560h0d$4ad@newsgate.duke.edu>
NNTP-Posting-Host: walras.econ.duke.edu
Originator: dlj@walras.econ.duke.edu
From: BGOODIN@UNEX.UCLA.EDU (William R. Goodin)
To: John Conover <john@email.johncon.com>
Subject: Commercial Announcement: UCLA course on "Evolutionary Computation"
Date: Fri, 8 Nov 1996 10:08:34

On February 19-21, 1997, UCLA Extension will present the short course,
"Evolutionary Computation: Principles and Applications", on the UCLA
campus in Los Angeles.

The instructors are Melanie Mitchell, PhD, Research Professor, Santa Fe
Institute; Richard Belew, PhD, Associate Professor, Computer Science,
UC San Diego; Lawrence Davis, PhD, President, Tica Associates; and
Una-May O'Reilly, PhD, Research Fellow, AI Laboratory, MIT.

Each participant receives a copy of the book, "An Introduction to Genetic
Algorithms", M. Mitchell (MIT Press 1996), and extensive course notes.

This course introduces engineers, scientists, and other interested
participants to the burgeoning field of evolutionary computation.
Evolutionary computation--genetic algorithms, evolution strategies,
evolutionary programming, and genetic programming--is a collection of
computational techniques, inspired by biological evolution, to enhance
optimization, design, and machine learning.  Such techniques are
increasingly used to great advantage in applications as diverse as
aeronautical design, factory scheduling, bioengineering, electronic circuit
design, telecommunications network configuration, and robotic control.

Four of the leading experts in this field present the fundamentals of
evolutionary computation which should enable participants to write their
own evolutionary computation applications.  The course includes detailed
descriptions of many applications, as well as how to design genetic
algorithms and other methods for problems of interest to the participants.
Comparisons of genetic algorithms with other search and learning methods
are discussed in the context of the example applications.

The last day focuses on identifying promising areas for genetic algorithm
optimization, and creating a genetic algorithm that performs well on your
optimization problems.

Course participants who wish to present a problem on the last day are
encouraged to contact Dr. Davis (davis@tica.com; phone [617] 864-2292)
prior to the course to determine its usefulness as an example.  The
instructors hope to use two examples to illustrate the points made on the
final day.

The course fee is $1395, which includes extensive course materials.
These materials are for participants only, and are not for sale.

For more information and a complete course description, please contact
Marcus Hennessy at:

(310) 825-1047
(310) 206-2815  fax
mhenness@unex.ucla.edu
http://www.unex.ucla.edu/shortcourses

This course may also be presented on-site at company locations.

------- end -------
--

John Conover, john@email.johncon.com, http://www.johncon.com/


Copyright © 1996 John Conover, john@email.johncon.com. All Rights Reserved.
Last modified: Fri Mar 26 18:55:37 PST 1999