Re: News: Technology Stocks Had A Bad November

From: John Conover <john@email.johncon.com>
Subject: Re: News: Technology Stocks Had A Bad November
Date: 29 Dec 2000 21:48:52 -0000




Well, the NASDAQ had to finish up 83 points today to avoid year 2000
having the dubious distinction of being the worst year of its 30 year
history. It didn't make it, and is down about 50% from its highs of
the year, and about 35% on the calendar year.

The NASDAQ has lost about 50% of its value since mid March, (about 180
trading days,) and the standard deviation of the fluctuations on that
time scale is about 0.02 * sqrt (180) = 0.268, or about 27%; a 50%
drop is about two standard deviations, which has a probability of
about 0.027, or about 3%, or about once in 30 years, (the NASDAQ is 29
years old; it started in 1971.)
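
Here is a quick sketch of that arithmetic, in Python, assuming
Gaussian daily fluctuations with the measured 2% deviation, (just an
illustration of the square root of time scaling, not a market model):

    from math import sqrt, erf

    daily_sigma = 0.02                     # measured daily deviation
    sigma_180 = daily_sigma * sqrt (180)   # deviation on a 180 day scale

    z = 0.50 / sigma_180                   # a 50% drop, in deviations

    # one sided Gaussian tail probability of a drop that large, or larger
    p = 0.5 * (1.0 - erf (z / sqrt (2.0)))

    print ("180 day deviation = %.3f, z = %.2f, P = %.3f"
           % (sigma_180, z, p))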

All the indices turned in negative numbers for the calendar year,
(which, from the attached e-mail, should be about a once-a-decade
chance if there is correlation in the indices; it works out: the last
time this happened was 1990, and before that, 1981.)

So, what is the prognostication for the future?

The current bear market "bubble" started in about the first quarter of
1999, (the S&P and DJIA haven't really moved since then,) and half of
all bear markets last less than 4.3 years, (and half more,) so there
is about a 50/50 chance that the current bear market will bottom in
about early 2001, with the indices being back on track by about late
2002. During that two year interval, there is about a 50/50 chance
that the indices will turn in about twice their average growth, (which
is a little under 10% per year,) i.e., about a 50/50 chance that the
indices will gain about 20% per year in 2001 and 2002, or so.

But such things are not all that rosy; these types of probability
functions have very sluggish tails. For example, there is also a 1 /
sqrt (25) = 20% chance that the current bear market will last at least
as long, (though not as deep,) as the 25 year bear market of the Great
Depression.
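
To see how sluggish, a short sketch in Python, tabulating the 1 /
sqrt (n) chance, (the approximation used above,) that a bear market
runs at least n years:

    from math import sqrt

    # chance of a bear market lasting at least n years, by the
    # 1 / sqrt (n) approximation; note how slowly the tail decays
    for n in (1, 4, 9, 16, 25):
        print ("at least %2d years: %5.1f%%" % (n, 100.0 / sqrt (n)))

Note that 1 / sqrt (4.3) is about 0.48, consistent with the median of
about 4.3 years mentioned above.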

So, how does one use such information?

The chances of 2001 being a bull year are about 1 / sqrt (3) = 0.577,
or about 58%; and one should place about 2P - 1 = 2 * 0.577 - 1 =
0.155, or about 15%, of one's wealth at risk on the chance that it
will be a bull year.

For 2002, it's about a 1 / sqrt (2) = 0.707, or about a 70% chance of
a bull year; so one should place about 2 * 0.707 - 1 = 0.414, or about
40%, of one's wealth at risk on the chance that it will be a bull
year.
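
Or, as a short Python sketch, using the 1 / sqrt (3) and 1 / sqrt (2)
probabilities from above, (just an illustration of the 2P - 1
arithmetic):

    from math import sqrt

    # fraction of wealth to place at risk, f = 2P - 1, where P is the
    # chance of a bull year
    for year, n in (("2001", 3), ("2002", 2)):
        P = 1.0 / sqrt (n)
        f = 2.0 * P - 1.0
        print ("%s: P = %.3f, fraction at risk = %.3f" % (year, P, f))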

        John

BTW, note what is happening; since the markets have been negative,
(i.e., less than their average 10% growth,) for about two years,
(e.g., one should not have been invested, which is another of
mathematics' most profound insights,) money is being moved back into
the market, gradually. One doesn't want to "wager" everything, since
there is a significant probability that the bear market will last
much longer than four years, and all would be lost. However, investing
nothing would preclude taking advantage of the average anticipated
gain of about 20% per year in the indices for the next two years, too.
Obviously, the optimal fraction of wealth to put at risk lies between
these two limits, and the "magic" number is 2P - 1, if one wants to
attempt to exploit "bubbles."

John Conover writes:
>
> What are the chances of any arbitrage system, (like an equity market,)
> plummeting 22% in a month of 20 trading days?
>
> You measure the risk, (e.g., the deviation, which is the square root
> of the variance of the fluctuations,) and multiply that by the
> square root of the number of days. The deviation for the NASDAQ is
> about 2%, (meaning that for 1 sigma = 68% of the time, the
> fluctuations are less than 2%,) so, the standard deviation of the
> fluctuation measured on a time scale of 20 days would be 0.02 *
> sqrt (20) = 9%, or so, (meaning that for 68% of the time, the
> fluctuations on a time scale of 20 days would be less than 9%).
>
> It's that fractal stuff.
>
> So, 22% / 9% = 2.44 sigma, which has a probability, (using the
> normal probability tables, or your handy dandy calculator,) of
> about 0.008, or about once in 125 months, or about 10 years
> between such things, on average.
>
> And just when we had decided that it was a new economy.
>
>       John
>
> BTW, it works out about right. The last such rogue month was
> in 1987, and there were 9 of them in the 20th century.
>
> http://www.computeruser.com/news/00/12/07/news13.html

--

John Conover, john@email.johncon.com, http://www.johncon.com/

