Re: $674 billion sounds like a lot of money-or is it?

From: John Conover <>
Subject: Re: $674 billion sounds like a lot of money-or is it?
Date: 9 Jan 2003 09:11:58 -0000

There are some interesting intuitive things that happened in the
market crashes of 2000.

Why did the NASDAQ fall harder than the S&P or DJIA?

The NASDAQ carries about twice the risk of the S&P and DJIA, (a daily
deviation of about 2% vs. 1% for the traditional indices.)

If you look at how hard the indices fell, the NASDAQ fell about a
factor of two harder than the traditional indices in 2000-2001, (an
empirical/intuitive illustration of the concept that risk is measured
by deviation.)

The other side of the coin is that the NASDAQ would increase a factor
of two faster than the S&P or DJIA, too.

It would seem that, if an equity's fair market value were the
average/aggregate estimate of rational shareholders, the value would
be quite precise-any errors in the estimates would average out-but
that is quite wrong; in point of fact, the average/aggregate is
frequently in error by a factor of two, (or even ten.)

The formula for the value of an equity, as a function of time, is:

    e^((k1 * t) + (k2 * sqrt (t)))

where a typical value for k1 is 0.0002, and k2 = 0.02, (for the daily
close of equity markets the world over, also similar for currency
exchange rates, the GDP, industrial market growth, and a whole bunch
of other economic things.)
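As a quick numerical sketch of the formula, (in Python rather than a
spreadsheet; the function name and $20 IPO price are my own choices
for illustration,) it can be evaluated directly:

```python
import math

def equity_value(t, k1=0.0002, k2=0.02, ipo=20.0, sigmas=0.0):
    """Value of a hypothetical $20 IPO equity at trading day t,
    shifted by `sigmas` deviations of the speculative sqrt(t) term."""
    return ipo * math.exp(k1 * t + sigmas * k2 * math.sqrt(t))

# Median value after 2500 trading days, (about ten years):
print(round(equity_value(2500), 2))  # prints 32.97
```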

The (k1 * t) term is the long term growth of an equity-it is probably
related, (but that has never been proven,) to macro/microeconomic growth.

The (k2 * sqrt (t)) term is due to speculation; k2 is the daily
deviation of an equity's value-it is probabilistic in nature. The
deviation of an equity's value at some time t in the future will be
this term.

The sum of these two terms is exponentiated to get the value,
(actually, the probability distribution of the value,) of an
equity-giving the values a log-normal distribution that increases
exponentially.
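A small Monte Carlo sketch makes the log-normal claim concrete,
(assumptions mine: each daily log-return drawn independently as a
Gaussian with mean k1 and deviation k2.) The log of the final value
comes out normal, with mean k1 * t and deviation k2 * sqrt (t):

```python
import math
import random

random.seed(1)
k1, k2 = 0.0002, 0.02
t, paths = 2500, 1000

logs = []
for _ in range(paths):
    # each path is the sum of t independent daily log-returns
    logs.append(sum(random.gauss(k1, k2) for _ in range(t)))

mean = sum(logs) / paths
std = (sum((x - mean) ** 2 for x in logs) / paths) ** 0.5
print(round(mean, 2), round(std, 2))  # close to k1*t = 0.5 and k2*sqrt(t) = 1.0
```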

Although it would seem that an equity that seemingly increases in
value exponentially, forever, could never crash, the opposite is
true-all equities will fail, (i.e., become nearly a zero value.)

The reason is the probability term, (k2 * sqrt (t)). It's a question of
how long an equity can have exponential growth, (e^(k1 * t)), before
the laws of probability, e^(k2 * sqrt (t)), catch up with it, and the
negative deviation asserts itself-driving its value to near zero.
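The race between the two terms can be tabulated in log terms, (a
sketch, using the typical constants above):

```python
import math

k1, k2 = 0.0002, 0.02
for t in (100, 1000, 2500, 10000):
    growth = k1 * t          # log of the median, relative to the IPO price
    dev = k2 * math.sqrt(t)  # log-deviation due to speculation
    print(f"t={t:5d}  k1*t={growth:4.2f}  k2*sqrt(t)={dev:4.2f}")
```

With these constants the deviation term dominates until the two are
equal at t = (k2 / k1)^2 = 10000 trading days, about forty years.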

That's why, of the 100,000 companies that were listed on the US equity
markets in the Twentieth Century, they were listed for an average of
only 22 years-the theoretical value, using the typical values of k1
and k2 above, is about 20 years.

So, one can kind of see that an equity's value follows its median
value, e^(k1 * t), over time, but has excursions wandering back and
forth across the median, (it is almost never equal to the median.)
The wanderings have a deviation from the median of e^(k2 * sqrt (t)),
which is forever increasing, (until it bumps into values of less than
a buck, and gets booted off the exchange.)


BTW, as a pictographic exercise, using a spread sheet, plot:

    20 * exp ((0.0002 * t) - (0.02 * sqrt (t)))
    20 * exp (0.0002 * t)
    20 * exp ((0.0002 * t) + (0.02 * sqrt (t)))

for 0 < t < 10000
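The same exercise in Python instead of a spreadsheet, (a sketch;
printing a few points of the three curves rather than plotting them):

```python
import math

k1, k2, ipo = 0.0002, 0.02, 20.0

def band(t, sigmas):
    # value of the equity `sigmas` deviations from the median at day t
    return ipo * math.exp(k1 * t + sigmas * k2 * math.sqrt(t))

for t in (0, 2500, 5000, 10000):
    print(f"t={t:5d}  -1 sigma: {band(t, -1):8.2f}  "
          f"median: {band(t, 0):8.2f}  +1 sigma: {band(t, +1):8.2f}")
```

At t = 10000 the lower curve is back at $20, the median is about $148,
and the upper curve is about $1,092.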

The first is the negative side deviation, (the value of the equity
will be above this curve about 84% of the time over the 40 years.) An
equity that was at this value at the end of the 40 years would still
be worth only about its $20 IPO price.

The second is the median value of the equity.  Starting at the
equity's IPO value of $20, it would have increased its value by a
little less than a factor of 10 over the 40 years.

The third is the positive side deviation, (the value of the equity
will be below this curve about 84% of the time over the 40 years.) An
equity that was at this value at the end of the 40 years would have
increased its value by a little more than a factor of 50, (to about
$1,100.)

Now, picture the value of the equity wandering randomly about the
median, staying between the negative side and positive side curves
about 68% of the time-as a conceptual model.
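That 68% figure can be sanity checked by simulation, (my own sketch;
again assuming each daily log-return is drawn as a Gaussian with mean
k1 and deviation k2):

```python
import math
import random

random.seed(2)
k1, k2 = 0.0002, 0.02
days, paths = 2000, 500

inside = total = 0
for _ in range(paths):
    log_val = 0.0
    for t in range(1, days + 1):
        log_val += random.gauss(k1, k2)
        # within one deviation of the median, in log terms?
        if abs(log_val - k1 * t) < k2 * math.sqrt(t):
            inside += 1
        total += 1

print(round(inside / total, 2))  # close to 0.68
```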

(As a hand waving show and tell, the numbers used are typical for data
on the daily time scale of things economic; look at the values in the
first equation-it's 20 * e^0 at t = 10000. The chances are a virtual
certainty that it has already failed; its deviation is equal to its
median-each and every day, it is running a 16% chance of dropping into
oblivion-it cannot remain a fugitive from the laws of probability,
forever. Note that at t = 0, the deviation is zero, so it is a virtual
certainty that it would not fail anytime soon, even though its value
on the first curve is the same $20 as at t = 10000. Such is the way
with economic things that evolve into a log-normal distribution.)

John Conover writes:
> BTW, I would really be surprised if the NASDAQ made it back by mid
> 2005. To do that, it would have to gain a factor of 4 in 2 calendar
> years, or 500 trading days. The standard deviation of that would be
> about e^((0.0004 * 500) + (0.02 * sqrt (500))) = 1.91, or about a
> factor of 2.  2 sigma is a 2.27% chance. Not impossible, but not
> likely, either.
> It would be inappropriate to assume that the technology recession will
> be over anytime soon.  Such is the price of the piper when equity
> prices are pumped up in a buying frenzy by a factor of 2.5 over any
> reasonable valuation, (the "new economy" wasn't-its value was as a
> rationalization for PR folks and equity analysts to justify the
> enormous increase in the value of the index, which increased the value
> of the index even more; it was a pyramid scheme.)  Although the
> scenario is not uncommon, it makes it a long way to fall.
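The arithmetic in the quoted paragraph can be checked directly, (a
sketch using the constants quoted there; `NormalDist` from the Python
standard library supplies the Gaussian tail probability):

```python
import math
from statistics import NormalDist

k1, k2, days = 0.0004, 0.02, 500  # constants from the quote above

# one standard deviation of the index over 500 trading days
sigma = math.exp(k1 * days + k2 * math.sqrt(days))
print(round(sigma, 2))  # about 1.91, roughly a factor of 2

# a factor-of-4 gain is about two such deviations;
# one-tailed chance of landing beyond 2 sigma
p = 1.0 - NormalDist().cdf(2.0)
print(round(100 * p, 2))  # about 2.28 percent
```

The quote's 2.27% is the same one-tailed 2-sigma probability, rounded
slightly differently.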

John Conover,,

Copyright © 2003 John Conover, All Rights Reserved.
Last modified: Thu Jan 9 01:17:32 PST 2003