From: John Conover <john@email.johncon.com>
Subject: Re: General Equilibrium Model
Date: 29 Dec 1998 21:58:53 GMT


Donald L. Libby writes:
> jim blair wrote:
> >
> > G o e t z K l u g e wrote:
> > >
> > > Markets at equilibrium and dynamic changes within these
> > > markets are no contradiction.
> > >
> > > Roughly speaking, a system is in equilibrium with regard to
> > > a variable, as long as the average of this variable does not
> > > change.
> > >
> >
> > So what economic variables have not changed their long term averages?
>
> The DEFINITION of "long run" is that short-run variables are constant.
>

I kind of struggled through the numbers, using the DJIA data as an
example. I used the daily closes for the last century as the data
set. In very rough numbers, the statistics are that the day-to-day
likelihood of an up movement is 51%, (and the chances of a down
movement are 49%.) The volatility, (ie., the root mean square of the
day-to-day marginal growth,) is about 1%. The average of the
day-to-day marginal growth is about 0.02%.
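
As a quick sketch of where those numbers come from, the following
Python fragment computes the three statistics from a list of daily
closes. The file name, djia.txt, (one close per line, oldest first,)
is hypothetical, and marginal growth is taken here to mean the
fractional change between successive closes:

    import math

    # Daily closes, one per line, oldest first; the file name is
    # hypothetical -- substitute any daily close data set.
    closes = [float(line) for line in open("djia.txt")]

    # Day-to-day marginal growth, taken as the fractional change
    # between successive closes.
    growth = [(b - a) / a for a, b in zip(closes, closes[1:])]

    up = sum(1 for g in growth if g > 0) / len(growth)
    mean = sum(growth) / len(growth)
    rms = math.sqrt(sum(g * g for g in growth) / len(growth))

    print(f"P(up)      = {up:.2%}")     # about 51%
    print(f"mean       = {mean:.3%}")   # about 0.02%
    print(f"volatility = {rms:.2%}")    # about 1%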

There are two issues. The first is the measurement of a long term
average growth of 0.02% per day that is rattling around at 1%, rms,
per day. If it is desired that the accuracy of the measurement be 1%,
then standard statistical estimation gives a data set size of 12,100
trading days, or about 48 calendar years.
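
The exact confidence coefficient behind the "1% accuracy" is not
spelled out above; one reconstruction that reproduces the 12,100
figure is to require the confidence interval half-width, z * sigma /
sqrt (n), to shrink to the size of the mean itself, with z about
2.2, (which is an assumption for illustration, not necessarily the
original derivation:)

    mu = 0.0002     # average daily marginal growth, 0.02%
    sigma = 0.01    # daily volatility, 1% rms
    z = 2.2         # assumed confidence coefficient

    # Choose n so that z * sigma / sqrt(n) == mu, ie., so the measured
    # mean is just distinguishable from zero at that confidence.
    n = (z * sigma / mu) ** 2
    print(f"{n:.0f} trading days")          # 12100
    print(f"{n / 253:.0f} calendar years")  # about 48, at an assumed
                                            # 253 trading days/year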

The second issue concerns the chances that the data set interval
measured was predominantly up, or down, by serendipity of the choice
of the interval. (The durations of the up and down fluctuations,
assuming a random walk model, have a frequency distribution of erf (1
/ sqrt (t)), which for t >> 1 is about 1 / sqrt (t), ie., it has large
swings, above or below the long term average, that can last for
literally decades.) Again, if a 1% accuracy is chosen as an arbitrary
requirement, the measurement interval is 13,890 trading days, or about
55 years.
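
That distribution is easy to check empirically. The sketch below,
(an illustration, not the original analysis,) simulates a long +/-1
random walk, records how many steps each excursion away from zero
lasts, and compares the tail frequencies to 1 / sqrt (t); the two
agree up to a constant factor of order unity:

    import math
    import random

    random.seed(1)

    # Simulate a +/-1 random walk and record the duration of each
    # excursion, ie., the number of steps between returns to zero.
    durations = []
    position, run = 0, 0
    for _ in range(1_000_000):
        position += random.choice((-1, 1))
        run += 1
        if position == 0:
            durations.append(run)
            run = 0

    for t in (2, 8, 32, 128, 512):
        tail = sum(1 for d in durations if d >= t) / len(durations)
        print(f"t = {t:3d}  P(duration >= t) = {tail:.4f}"
              f"  1/sqrt(t) = {1 / math.sqrt(t):.4f}")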

Taken together, for the arbitrary 1% accuracy requirement, the data
set size jumps to 33,000 days, or about 130 years.

What's the point? Many do not conceptualize what the data set size
requirements are to determine the value of "long run" variables with
reasonable accuracy in random walk models. Note that it is not
necessarily the data set size, per se, but data over a large enough
interval that is the issue; the 1 / sqrt (t) frequency distribution
function decays rapidly at first, then has long tails that decay very
slowly to small values.
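
To put numbers on that shape, tabulating erf (1 / sqrt (t)) for
durations from a day to half a century, at an assumed 253 trading
days per year, shows the fast initial decay and the stubborn tail:

    import math

    # Durations in trading days: a day, a week, a month, a year, a
    # decade, and half a century, at an assumed 253 trading days/year.
    for t in (1, 5, 21, 253, 2530, 12650):
        print(f"t = {t:5d}  erf(1/sqrt(t)) = {math.erf(1 / math.sqrt(t)):.4f}")

    # Falls from about 0.84 at one day to about 0.07 at a year -- but
    # even decade-long excursions retain a frequency of about 2%.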

        John

--

John Conover, john@email.johncon.com, http://www.johncon.com/

