Re: The Standard: Warning: Recession Ahead

From: John Conover <john@email.johncon.com>
Subject: Re: The Standard: Warning: Recession Ahead
Date: 6 May 2001 20:36:53 -0000



Looking at the valuation of a well managed company, (i.e., one where
management knows what it's doing-knows what has to be done, and knows
how to execute on it, optimally, e.g., grow 2X per year, for at least
10 years, before it starts its demise,) and if it starts with a
million bucks of GR the first year, (a reasonable value,) then IPOs in
year 5, (with a GR = 2^5 * 10^6 = 32 million-a reasonable number from
the historical perspective,) then in year 10 it would have a GR of
2^5 * 32 * 10^6, or about one billion bucks.
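
As a sanity check on the arithmetic, a minimal Python sketch-the 2X
growth rate and the million bucks of starting GR are the assumptions
above, and the year-n convention follows the numbers in the paragraph:

    # A rough sketch of the compounding above; GR in year n is taken
    # as 2^n times the initial million bucks, per the text.
    initial_gr = 1.0e6                     # a million bucks of GR
    for year in range(1, 11):
        gr = (2.0 ** year) * initial_gr
        print("year %2d: GR = %5.0f million" % (year, gr / 1.0e6))
    # year 5 prints 32 million, (the IPO,) and year 10 prints 1024
    # million, i.e., about a billion bucks.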

If the company was floated on an initial million bucks investment,
(i.e., it was unprofitable only in the first year,) it would be a
thousand X increase in valuation over the decade, (assuming that the
capitalization vs. GR remained constant, which was 3X in the 20th
century, averaged over all companies-I'm using a conservative 1X.)
Compare that with the 100X the best dot-coms delivered, (and then
didn't.)

Now assume that management is less efficient. It grows the company at
only 10 percent less than optimal, at 1.9X per year-everything else
remaining the same. At the end of the decade, the company's GR would
be 613 million, or its valuation/capitalization would be almost 40%
less. Call it about half, for a consistent 10% inefficiency in
management.

Suppose the management of the company putzed around and didn't execute
on engaging the marketplace for a year, then ran optimally. At the end
of the decade, the company's GR would be 512 million bucks, or its
valuation/capitalization would be about half, for a 10% delay in
execution.

So, roughly, every delay or inefficiency by management of 10%
ultimately costs the company 50% of its valuation/capitalization.
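
The three scenarios compare directly with the same sort of sketch,
(Python again; the growth rates and the one-year delay are the
assumptions from the preceding paragraphs, and decade_gr is just an
illustrative helper):

    # Decade-end GR for the three management scenarios discussed above.
    def decade_gr(rate, years=10, start=1.0e6, delay=0):
        # compound for (years - delay) periods at the given rate
        return start * rate ** (years - delay)

    optimal = decade_gr(2.0)             # well managed, 2X per year
    inefficient = decade_gr(1.9)         # 10% less growth per year
    delayed = decade_gr(2.0, delay=1)    # putzed around for a year

    print("optimal:     %4.0f million" % (optimal / 1e6))      # 1024
    print("inefficient: %4.0f million" % (inefficient / 1e6))  # about 613
    print("delayed:     %4.0f million" % (delayed / 1e6))      # 512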

        John

BTW, it's not a linear relationship. Another inefficiency/delay of 10%
doesn't cost another half billion bucks; it cuts what is left in about
half again.

Note that a delay of one year costs about half the market share in a
competitive environment, so the market would shake out, ultimately, at
a 2:1 market share ratio between a well run company and a not so well
run one, which ends up with about a 30% market share. In other words,
a loss of 20 points of market share would be the cost of the
management mistake-and that would spell the ultimate demise. It
wouldn't make it through the decade.

John Conover writes:
> While we are on the subject of the dot-coms, (which is an interesting
> case study-it is/was capitalism and free-market'ism at its finest-no
> barrier to entry, insignificant infrastructure requirements, ample
> investment or access to capital, and a choice of thousands for
> consumers-what more could anyone want?) it was an industry completely
> mismanaged by our finest young executives with their recent MBAs in
> tow. (Although entropic methodologies are taught as a core competency
> requirement in all university financial curricula, they are not taught
> in business school.)
>
> As an example, what is the chance that a company with 1% market share
> will ever replace an industry leader with 50%?
>
> The answer is 2%, (you just divide the two numbers-it's the gambler's
> ultimate ruin scenario-and if it is an entropic system, it has to be
> that way.)
>
> What that means is that, as an investor, one has to expect returns of
> at least 50X to justify investing in companies with a 1% market share,
> no matter how good the concept or vision is.
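>
> A rough sketch of that arithmetic, (Python; the market shares are the
> ones assumed above, and the takeover chance is just the ratio, per the
> gambler's ruin argument):
>
>     # Gambler's ruin estimate: the chance that a small player with 1%
>     # market share ever displaces a leader with 50%, and the return an
>     # investor has to expect to justify the bet.
>     challenger = 0.01                       # 1% market share
>     leader = 0.50                           # 50% market share
>     p_takeover = challenger / leader        # = 0.02, i.e., 2%
>     required_return = 1.0 / p_takeover      # = 50X
>     print("chance of displacing the leader: %.0f%%" % (100 * p_takeover))
>     print("required return to justify it:   %.0fX" % required_return)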
>
> We would, also, expect about a 1 in 10 survival rate, (i.e., any
> company that has less than a 10% market share-a 90% chance of
> failure-won't make it,) and we are seeing those numbers.
>
> What's the duration of time one would expect for such a company to
> be in business?
>
> About 50 * (50 - 10) = 2,000 days, or about 7.9 years for 253 business
> days per year. (And, we have seen that, too; many of the dot-coms that
> are failing were started as early as 1994, or so.)
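>
> And the duration estimate, using the same convention, (a sketch; the
> 10% survival threshold and the 253 business days per year are the
> figures from the preceding paragraphs):
>
>     # Expected time in business, in business days, per the formula above.
>     share, threshold = 50, 10    # percent market share, survival threshold
>     days = share * (share - threshold)       # 50 * (50 - 10) = 2000
>     print("about %d days, or %.1f years" % (days, days / 253.0))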
>
> That's why so many dot-coms are failing.
>
> It is called management mistakes.
>
>         John
>
> BTW, compare this to a company with competent executive staff that put
> the company in biz early in the industry's development, started with,
> and maintained a 50% market share. Such a company would be expected to
> last about 100 * (100 - 50) = 5,000 days = 19.7 years, (at 253
> business days per year.)  The empirical metrics on the history of US
> commerce in the 20th century say about 22 years, (i.e., about a 10%
> error in the probability estimate.)
>
> Only GE survived the entire century, BTW.
>
> But nothing is eternal in entropic systems. What's the chance of a
> company in a leadership position eventually failing?
>
> It's a virtual certainty. That's the way capitalism works.
>
> John Conover writes:
> >
> > As approximate numbers, (taken from the average of many industries on
> > http://www.johncon.com/ndustrix/,) to exploit the lack of
> > predictability in the US GDP and its constituent industrial markets,
> > the short term predictability of a few months to a 70% accuracy means
> > that about (2 * 0.7) - 1 = 40% of a company's assets should be at risk
> > in the short term, in WIP, capital expenditures, R&D, etc.
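> >
> > In other words, the fraction at risk is just (2 * accuracy) - 1, (a
> > minimal sketch; the 70% accuracy is the figure quoted above):
> >
> >     # Fraction of assets to put at risk, given the short term
> >     # predictability of the GDP/market, per the formula above.
> >     accuracy = 0.70                            # few-month predictability
> >     fraction_at_risk = 2.0 * accuracy - 1.0    # = 0.40, i.e., 40%
> >     print("put about %.0f%% of assets at risk" % (100 * fraction_at_risk))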
> >
> > Note that it is a conceptual framework, and just says what has to be
> > done-not how to do it.  The implementational paradigm is evaluation of
> > "what if we do?", and "what if we don't?" scenarios, with more
> > attention paid to the latter, (since you get more for better
> > mitigation of risk than picking winners.)
> >
> > In the long term, (i.e., more than a few months,) mitigate risk
> > through product diversification, (no less than 8 product lines, 10
> > being about right, 12 probably too many,) and shuffle investment
> > capital around in the product lines to avoid leptokurtotic behavior in
> > the corporate P&L, (i.e., as a first order approximation, the GR
> > generated by each product line should be about equal as a management
> > paradigm; as a better approximation, the GR generated by each product
> > should be proportional to the average divided by the deviation of the
> > marginal increments in the GR. Likewise for the investment of
> > capital.)  That, also, defines investment in new product areas. The
> > long term plan is to put about 2% of the company at risk, per business
> > day, (or, cumulative, about 40% per month,) which is consistent with
> > the short term strategy.
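> >
> > As a sketch of that allocation rule, (Python; the per-line marginal
> > increments are made-up placeholders-only the proportional to the
> > average over the deviation rule is from the text):
> >
> >     # Allocate GR targets, (or capital,) across product lines in
> >     # proportion to the average over the deviation of the marginal
> >     # increments of each line's GR.
> >     import statistics
> >
> >     lines = {                      # hypothetical marginal increments
> >         "line_a": [0.03, 0.01, 0.04, 0.02],
> >         "line_b": [0.06, -0.02, 0.08, 0.01],
> >     }
> >     weights = {name: statistics.mean(x) / statistics.stdev(x)
> >                for name, x in lines.items()}
> >     total = sum(weights.values())
> >     for name, w in weights.items():
> >         print("%s: %.0f%% of the total" % (name, 100 * w / total))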
> >
> > The numbers would support a corporate growth of slightly more than 2X
> > a year, and would maximize growth, while at the same time, minimizing
> > risk exposure.
> >
> > Whether such growth could be managed and executed in the long run is
> > an entirely different issue. (It's tough.)
> >
> >         John
> >
> > BTW, there are other solutions for the same numbers-but they aren't
> > optimally maximal. It is possible to grow a company faster than 2X a
> > year-for a while. It's a "coffin corner" solution that will exhibit
> > high growth, followed by a crash, (no matter what growth metric is
> > used.) About 2X is the maximum SUSTAINABLE growth with those numbers.
> > (Does the dot-com industry ring a bell as a case in point?  Many of
> > those companies exhibited growth of 2X in a single month-for a
> > while. The amount of the company placed at risk was far larger than
> > the optimal 40% per month-most of it in capital investment. Glory has
> > its price.)
> >
> > John Conover writes:
> > > Interestingly, the lack of predictability in the US GDP is
> > > exploitable.
> > >
> > > As a mathematical expediency most corporate strategies are divided
> > > into short term and long term, (short term being a few months, long
> > > term being more.)
> > >
> > > For long term strategies, the size of the window of predictability is
> > > regarded as zero-meaning that the US GDP is treated as an entropic
> > > system that behaves randomly. Strategies are developed that fit into a
> > > framework determined by the average, (the potential gain,) and the
> > > standard deviation, (the risk,) of the marginal increments of the US
> > > GDP or industrial market-usually using monthly or quarterly data.
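> > >
> > > The framework numbers are just the first two moments of the marginal
> > > increments, (a sketch; the quarterly values are placeholders, not
> > > actual GDP data):
> > >
> > >     # Average, (the potential gain,) and standard deviation, (the
> > >     # risk,) of the marginal increments of a GDP or market series.
> > >     import statistics
> > >
> > >     quarterly = [100.0, 101.2, 102.5, 101.9, 103.4]  # placeholder data
> > >     increments = [(b - a) / a for a, b in zip(quarterly, quarterly[1:])]
> > >     print("gain: %.4f, risk: %.4f"
> > >           % (statistics.mean(increments), statistics.stdev(increments)))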
> > >
> > > Short term predictability is an inefficiency, (at least in the sense
> > > of the efficient market hypothesis,) and can be exploited by adjusting
> > > operations to near term GDP/market anticipations faster than anyone
> > > else, by using the likes of JIT techniques to minimize WIP risk, etc.
> > >
> > > The potential advantage is quite significant.
> > >
> > >         John
> > >
> > > BTW, what would happen if every company that contributes to the US GDP
> > > did that? Not much, except that it would grow faster. The US GDP would
> > > become entirely entropic, (i.e., unpredictable,) and would be
> > > efficient, and fair. It would be stable, (in the sense that it could
> > > exist like that, forever.) Further, if the average of the marginal
> > > increments equaled the variance, it would be maximally optimal.
> > >
> > > Unfortunately, it is a politically inexpedient solution. Such concepts
> > > as monetary/fiscal policy to effect consistent full employment,
> > > etc., (which has been the paradigm of the last seven decades in all
> > > industrialized countries,) would have to be discarded. Perceived lack
> > > of influence over economic issues is not an expedient political
> > > platform.
> > >
> > > (BTW, full employment is not necessarily optimal. It has, in general,
> > > an unsustainable cost. Maximal sustainable employment is when the
> > > GDP's average of its marginal increments equals the variance. However,
> > > maximal employment does not, necessarily, mean full employment.)
> > >
> > > John Conover writes:
> > > >
> > > > In case you are curious, the big US economic recessions since
> > > > Independence happened in 1819, 1833, 1837, 1857, 1873, 1893,
> > > > 1929, (using the GNP/GDP numbers, which are not necessarily
> > > > coincident with the stock market numbers, as far as downturns
> > > > go,) and, (possibly-we don't know yet,) 2000.
> > > >
> > > > Note that this is the first generation in US history that has not
> > > > had to endure a famine/depression, (at least yet,) and our
> > > > perspective does not include how ugly they really are.
> > > >
> > > >       John
> > > >
> > > > BTW, the numbers are interesting. To make predictions-like in the
> > > > attached-the US GDP must be a deterministic system. Finding a
> > > > mechanism that gives zero-free paths representing those numbers is
> > > > a formidable proposition. If that can't be done, then predictions
> > > > cannot be made.
> > > >
> > > > In NLDS, (nonlinear dynamical systems, of which the US GDP is
> > > > certainly one,) the influence of the past on the future deteriorates
> > > > rapidly-meaning that
> > > > a small window into the past can be used to predict a small window
> > > > into the future, and that is the best that can be done. The size of
> > > > the window for the US GDP, is, at best, a few months, to a 70%, or so,
> > > > accuracy.
> > > >
> > > > Unfortunately, the prevailing wisdom is that fiscal/monetary policy
> > > > cannot be used to influence the fluctuations in the US GDP-such use
> > > > was the paradigm of the past seven decades, and has since been
> > > > abandoned.
> > > >
> > > > http://www.thestandard.com/article/0,1902,24243,00.html

--

John Conover, john@email.johncon.com, http://www.johncon.com/

