no subject (file transmission)

From: John Conover <john@email.johncon.com>
Subject: no subject (file transmission)
Date: Fri, 19 Nov 1993 19:01:03 -0800 (PST)



Attached please find a group of 7 articles that I wrote
some time ago, to establish a direction for IT in 3 participating
companies (these were the original discussion articles that were to
evolve into the various IT organizational charters/directions.)

The general philosophy being presented was to start by using the Unix
MTA as a carrier for electronic correspondence, and a corporate wide
full text database retrieval system to manage/query the electronic
correspondences, across the functional organizations, for
administration and project management. (The retrieval system began
with a simple egrep type of system operating through the MUA, evolved
into the program "QT," and was finally replaced with WAIS.) The
audience was assumed to be general and non-technical, with only a
limited perception of what modern computing is about.

The difference between content data and context data was outlined
(which implies, although I did not state it explicitly, that the
management paradigm/perception must change from content management to
context management.)  Although not specifically cited (I didn't want
to frighten anyone,) anthropological, eg., social, issues were
addressed in an indirect manner. The relationship of IT to quality
(TQM, etc.) and other contemporary management schemas was discussed.

I wouldn't advise printing this document; it is about 35 pages
long-half of which is the annotated bibliography and a more complete
bibtex bibliographic database (with about 150 references.)

Although these articles are not confidential, I would appreciate if
they were not distributed openly. You may cut and hack them into other
documents, if you are so inclined, however.

        -JCC

**********************************************************************

Let me explain what I am doing with the archive/retrieve/info programs
that are being developed. I am attempting to automate a lot of project
administration/execution. Let me tell you why. I will cite the failure
of IT at General Motors, (IBM is a similar failure,) as a reason not
to do it the way they did.

In the mid 80's, GM decided that IT was the way to go. It purchased
EDS from R. Perot, made Perot a bigshot manager, and put a terminal on
all of the middle managers' desks (all 18 levels of them.) After an
outlay for direct expenses totaling billions of dollars, and untold
fortunes in indirect costs (training, lost data because someone pressed
the wrong button learning to use the system, etc.) nothing really
changed. The org. was still a clumsy, bumbling giant that could not
get out of its own way. This is a classic example of how not to
automate admin. If you stand back and look at why the GM
implementation was a failure, you can see that the problem was that
although they automated memos (and did away with paper) the decision
process that determined the organization's agenda was really the
issue-and they did nothing to enhance the efficiency of that. (A key
failure.)

Let me elaborate. Consider that the IT system at GM reduced the time
of a memo to be routed from 2 days to seconds. But the action required
to address the issues in the memo still required months ... well you
get the picture. The GM admin. was/is a marvel of efficiency. Look at
it this way. (The numbers quoted above are real numbers, BTW.) 60 days
divided by 18 levels of management means that each level responded to
the issues addressed in a memo in a little over 3 days, including
weekends and holidays-very impressive (GM was proud of these numbers.)

The question is, how do you speed up things, and still maintain order
and control of agenda in a large organization?  Obviously, the number
of levels of management must be reduced. But if you eliminate some of
the levels of management, who is going to control/administer things?
Who is going to enforce the discipline of a consistent and continuous
agenda?  Won't important things fall into cracks?  The answer is yes.
(BTW, this number of levels of organization is justified at GM, they
actually, really do, need this many levels to coordinate things.) Note
that the flow of information in management was not the issue (GM's
failure proved that.) GM addressed the wrong issue, and obviously,
other organizations, regardless of their size, have the same issues
to address.

IT must address the issues of agenda/project execution, and the
decision making process (as opposed to being an information provider.)
This is an important concept. Let me elaborate on this.

The traditional reason for memos is the formal documentation of the
decision making process-who decided what and for what reasons. (Memos
have nothing to do with communication-they are inferior to the
telephone-in all respects-except documentation.) The reason that we
document the process is for accountability, so we can hold the
decision makers responsible for their decisions-and make sure that
the decisions are executed appropriately. Or to put it in other words,
really what we are doing is preserving a paper trail, or history,
that we can do a literature search on in the future, to put into
perspective, the organization's agenda. But this can be done
electronically (which is a key point.) It is the organizational agenda
that is the issue, and NOT the information that flowed in the
execution of the agenda.  It is a subtle difference.

The way that you put into perspective what is going on in an
organization, is to search the communication flow for context (ie.,
agenda,) and not for content (ie., information.) This is a key point.

If we make a corporate archive (perhaps distributed throughout the
organization,) that contains pertinent information on all the
decisions that are addressed (ie., an email database archive) in the
organization (ie., we store the information,) AND, we permit realtime
automated literature searches of the database, (ie., we can put the
information into context,) then we can do the same thing-just with
incredible efficiency. (BTW, by context literature search, I mean a
full text retrieval system-one that can be cross-indexed, and collated
on demand.)

The archive/retrieve system that is being developed is such a system.
In point of fact, if you use the archive/retrieve system on machine
"john" you can see the thinking that lead to me writing this. (As a
simple example send an email to retrieve@john.johncon.com, with a
Subject: retrieve my_project.) Kind of neat when you think about it.
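
If you are curious what the back end of such a responder might look
like, the following is a minimal sketch of the idea (in Python)-it is
not the actual retrieve program. The address, the archive path, and
the one-message-per-file layout are assumptions for illustration only;
the retrieval described above evolved well beyond this kind of brute
force scan (into "QT," and then WAIS).

    #!/usr/bin/env python3
    # Minimal sketch of a "retrieve" mail responder, in the spirit of the
    # retrieve@... example above.  Assumptions (not the actual program):
    # the MTA pipes each incoming message to this script on stdin, the
    # archive is a directory of one-message-per-file text files, and
    # replies go out through /usr/sbin/sendmail.

    import email, os, subprocess, sys

    ARCHIVE = "/var/mail-archive"          # assumed location of the archive

    msg = email.message_from_file(sys.stdin)
    subject = msg.get("Subject", "")
    sender = msg.get("Reply-To", msg.get("From", ""))

    if subject.lower().startswith("retrieve ") and sender:
        query = subject[len("retrieve "):].strip().lower()
        # Crude full text search: list every archived message that
        # mentions the query string anywhere in its text.
        hits = []
        for name in sorted(os.listdir(ARCHIVE)):
            path = os.path.join(ARCHIVE, name)
            if not os.path.isfile(path):
                continue
            with open(path, errors="replace") as f:
                if query in f.read().lower():
                    hits.append(name)
        reply = (f"To: {sender}\nSubject: retrieved: {query}\n\n"
                 + ("\n".join(hits) + "\n" if hits else "no matching documents\n"))
        subprocess.run(["/usr/sbin/sendmail", "-t"], input=reply.encode())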

**********************************************************************

John Conover writes:
>
>
> Let me explain what I am doing with the archive/retrieve/info programs
> that are being developed...
>

Allow me to continue with an application of IT, specifically to
program/project management, ie., a "how to" application. This will
make use of the email system to hold asynchronous group conferences,
and the archive/retrieval system to document the activities of the
individuals in the group (or team, if you prefer.) Or in other words,
the email system will be used to decide/communicate what is going to
be done, who is going to do it, and when. The archive/retrieval system
will be used to document what was done, who did it, and when it was
finished. It will exploit the self documenting capabilities of email,
to establish accountability and responsibilities for the activities
addressed by the group. (BTW, what we call email is, technically, not
an email system at all-technically it is an "asynchronous electronic
conferencing system," or AECS, for the record.)

Let me first address the semantics, to avoid confusion.  There are two
essential semantic definitions that concern us. The first is the
difference between supervision and management. The other is the
relation between information, knowledge, and capability.

The difference between supervision and management is that supervision
is the administration of pre-defined activities (usually handled by a
supervisor or foreman.) Management, on the other hand, is the
intellectual process of deciding/defining what activities are to be
executed (and in what order, etc.) Obviously, intellectual processes
are closely related to information/knowledge/capability issues.

Knowledge is information in context. Knowledge in context is
capability (capability being defined as knowing what to do.)
Capability is the objective. This is a key point.  For example,
corporate databases (technically "content databases,") can provide
information on the status of the company, but they can not provide
insight as to why the status is the way that it is-that is left to the
interpreter. To do that, you have to put the information in context,
or perspective. Once you have established the perspective (ie.,
gained knowledge,) then you can proceed to decide what must be done,
and how to do it (ie., establish an agenda, or a capability.)

Although this application has an electronic database, it differs from
content databases in that the context, (or acquisition of
perspective,) phases of interpretation are automatically documented
and, therefore, subject to scrutiny (through context framed
searches)-ie., it automates the documentation of the decision making
process. Note that this directly relates to OD and TQM issues.

Although the concept is consistent with contemporary management
principles, a word of caution is in order. The system documents
appropriate and inappropriate decisions with equal impartiality. It is
important to realize that it is not a question of whether a decision
was "bad" or not, but whether it was made intelligently (ie., the
"quality" of the decision is the issue.) Since this is a "perspective"
issue, it is important that all decisions be rationalized within the
system so that context searches can be initiated to discover the
perspective under which the decision was made. Maturity and
objectivity are in order when judging the appropriateness of a decision.
In an ideal sense, the database contains a collection of the "points
of view" (POV) of the participants. The system "brings together" these
different view points so that management can reconcile appropriate
tactics and strategies (ie., activities) that make up an agenda.

It is important to understand that the essence of the system is that
it documents the process by which a decision was arrived at, and not the
decision itself (although it does that too.) It is equally important
to understand that discipline must be enforced in submitting all of the
justifications/rationalizations for all decisions into the system. It
is a key point. Bear in mind that the system is not for the timid or
weak. When you submit a POV to the system, you are going on record
as saying "in my opinion, we should consider..."

With these issues in mind, I will present "how to" implement and use
such a system:

        1) Management decrees that the system will be used as a
        management tool, and all data has to be entered, or
        transcribed into the system (including the minutes of
        meetings, etc.) If it doesn't exist in the system, it does not
        exist. All discussions, and reasons for decisions have to be
        placed in the system. ALL team members and upper management
        (across functions) have identical access to ALL transactions.
        (Mail can be used for private correspondence, such as
        politicking, etc. but all decisions, and the reasons for the
        decisions have to be placed in the system.) The guiding rule
        is that at the end of the project, the system will contain a
        complete play by play chronology and history of all decisions,
        and reasoning concerning the project, and, by the way, who was
        responsible for the decisions. On each Monday, everyone enters
        into the system his/her objectives for the week, and when each
        objective is finished, she/he mails the milestone into the
        system-ie., all group members and management can thus find out
        the exact status of the project at any time (ie., a "social
        contract" is made with management and the rest of the members
        of the team.) At any time, a discussion can be initiated on
        problems/decisions in the system by anyone. The project manager
        is assigned the responsibility of "moderator," or chair person
        for his/her section of the system. Each Friday, the system is
        queried for project status, and the status piped to a text
        formatter and printed for official documentation (see the
        sketch following this list). This document is discussed at a
        late Friday people-to-people staff meeting. (Note that in some
        sense, it is similar to a very fast, dynamic, MBO scheme.)

        2) Marketing is responsible for acquiring all pertinent data
        on magnetic media, (from services like Data Quest, the
        Department of Commerce, etc.) and each document is "mailed"
        into the system so that the information is available to
        everyone.  They have access to the progress made by
        engineering, and can contribute information on pertinent
        issues as the program develops-ie., this is a "concurrent
        engineering" environment.

        3) Engineering is responsible for maintaining schedules, and
        reflecting those schedules in the system-if slippages occur,
        then the situation can be addressed immediately by management,
        and a suitable cross functional resolution can be arrived at.

and so on for the other teams involved in the project.
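
As an illustration of the Friday query step in item 1, above, here is
a minimal sketch of what the status digest job might look like. It
assumes, purely for illustration, that the project archive is a single
mbox file and that status reports carry "Project Status" in their
Subject: header; the real system would use the full text retrieval
program rather than a linear scan.

    #!/usr/bin/env python3
    # Sketch of the "each Friday, the system is queried for project
    # status" step.  The archive path and the Subject: convention are
    # assumptions; the output is a plain text digest suitable for a text
    # formatter, or for printing directly.

    import mailbox
    from datetime import datetime, timedelta, timezone
    from email.utils import parsedate_to_datetime

    ARCHIVE = "/var/mail-archive/my_project.mbox"   # assumed path
    cutoff = datetime.now(timezone.utc) - timedelta(days=7)

    print("PROJECT STATUS DIGEST, week ending", datetime.now().date())
    print("=" * 60)

    for msg in mailbox.mbox(ARCHIVE):
        subject = msg.get("Subject", "")
        if "project status" not in subject.lower():
            continue
        try:
            sent = parsedate_to_datetime(msg.get("Date", ""))
        except (TypeError, ValueError):
            continue                            # unparsable Date: header
        if sent is None:
            continue
        if sent.tzinfo is None:                 # assume UTC if no zone given
            sent = sent.replace(tzinfo=timezone.utc)
        if sent < cutoff:
            continue
        print()
        print("From:   ", msg.get("From", "unknown"))
        print("Date:   ", msg.get("Date"))
        print("Subject:", subject)
        body = msg.get_payload(decode=True) or b""
        print(body.decode(errors="replace").strip())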

**********************************************************************

John Conover writes:
> Allow me to continue with an application of IT, specifically to
> program/project management, ie., a "how to" application. This will
> make use of the email system to hold asynchronous group conferences,
> and the archive/retrieval system to document the activities of the
> individuals in the group (or team, if you prefer.) ...
>

Continuing on with the application of IT, I will present an example
usage scenario in the day to day operation of the IT system outlined
above.  To reiterate the way things are setup, all members of the
group (or team) have a mail alias set up in their project account that
includes themselves, all other members of the group, and the group's
project archive. (It would be appropriate to think of this as an
ongoing electronic conference.) As things are discussed, decided,
scheduled, and finished, an email concerning any and all issues of the
activity will be distributed to all members of the group, and the
archive. (Don't forget that these electronic correspondences
constitute a "documented social commitment," on the part of the
individuals of the group, to the group, itself.) It would be
appropriate to think of the archive as a "library," where you do
electronic "literature searches" in the conference digests, for
various subjects concerning the project.

Take for example, project scheduling and tracking. The group chairman
(ie., project leader, team leader, group administrator, or whoever has
been assigned to be in charge of scheduling,) issues an email,
periodically (say, every Friday,) with a "Subject: Project Status
Milestones," (or whatever else) to all members of the group. As all
members of the group reply about the week's progress, (simply by
pressing the 'r' key in the standard Unix mail system, and typing the
response) the person in charge of the scheduling system can update the
master project status (either by "cutting and pasting" the sections
out of the returned project status reports into a formal paper report,
or automatically updating the schedule database, if a "forms" manager
is used to compose the project status request.) There is nothing too
different from classical management techniques, so far.  It is very
similar to an automated MBO system, and fits in nicely with other
project analysis/tracking methodologies (ie., like PERT charts, Gantt
charts, Critical Path, etc.) Its only advantage, so far, is that it is
efficient-quick and requires minimal time/overhead from the group's
members.

The system differs slightly in that the original requests and
responses concerning project status are archived in a manner that they
can be retrieved, electronically, within a "context framework," in an
expedient fashion (perhaps by upper management, or the managers of
other related cross-functional departments who have a vested interest
in the status of the project.)  By this I mean that since all of the
project status reports are available on demand, (and can be collated
by a search of something like "Subject: project status something,")
the history of the project status from any previous date to the
current date can be put into "context" or "perspective," (by simply
reading the reports that were retrieved by the "context search.")
This is the part of the system that upper management would use to
evaluate the "state of affairs" in the project. (If the upper level
managers are "information junkies," the machines can be programmed to
issue the "context search" command, automatically, every night, so
that the managers know the exact status of every project, to the
day-every morning.)  What this really enables the managers to do is
"precision management" (a Texas Instruments'ism.) Again, this is not
too much different than using traditional management tools (just
significantly faster,) but note that this is not a traditional
bureaucratic process-in a traditional bureaucracy, the various
organizations that had a vested interest in the project would be
"re'd" with memos (usually on a monthly basis) of status reports
(which are written in the context of the project managers,) and spend
time attending staff meetings (to get a "better" perspective on the
project.)  However, in this system, if other managers have a vested
interest, they have to retrieve the information from the archive
themselves (and thus draw their own conclusions.)  It is probably a key
point that the nature of the bureaucracy has now been changed. It
would be appropriate to think of the electronic bureaucracy as one
that a manager has the option of subscribing/un-subscribing to
(dynamically) on what is considered an appropriate basis. And if a
manager's attention has been focused elsewhere, the progress (or lack
of) made in the interim can be retrieved at any convenient time.  Note
that the objectives are not to replace paper memos, or meetings.
These are still important instruments in the group's dynamics. The
objective is to offer a better means of information/knowledge
dissemination through the organization (both to and from the project
group,) so that meetings, etc. can be "quality time," as opposed to
wading through reams of "show and tell" project status reports.
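
A sketch of the nightly, subscription style "context search" follows.
The subscription file format, the archive path, and the use of
sendmail are assumptions made only for illustration; the point is that
a manager subscribes or un-subscribes simply by adding or removing
his/her own line.

    #!/usr/bin/env python3
    # Sketch of a nightly, automatic "context search" for managers who
    # subscribe to a project.  Assumed conventions: subscriptions live in
    # a plain text file of "address: query terms" lines, the archive is
    # an mbox, and results are mailed out through /usr/sbin/sendmail.

    import mailbox
    import subprocess

    ARCHIVE = "/var/mail-archive/my_project.mbox"      # assumed path
    SUBSCRIPTIONS = "/var/mail-archive/subscriptions"  # "manager@co: project status"

    subjects = [msg.get("Subject", "") for msg in mailbox.mbox(ARCHIVE)]

    with open(SUBSCRIPTIONS) as f:
        for line in f:
            if ":" not in line:
                continue                       # blank or malformed line
            address, query = (part.strip() for part in line.split(":", 1))
            hits = [s for s in subjects if query.lower() in s.lower()]
            report = (f"To: {address}\n"
                      f"Subject: nightly context search: {query}\n\n"
                      + "\n".join(hits[-20:]) + "\n")  # most recent matches
            subprocess.run(["/usr/sbin/sendmail", "-t"], input=report.encode())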

Additionally, and this is another key point, if an issue comes up in
the project schedules, (slippage, etc.)  a manager can formulate
another "context search" for an additional retrieval, that addresses
the specific issues in question from the project status reports. Note
that if the use of the system is enforced, then there is no excuse for
project "execution failures"-too many people would have knowledge of
the "program exceptions," and too many people would be documented as
having the knowledge. (The project may fail for other reasons, but
execution is not one of them-unless the managers are sloppy, and not
"watching the store," in which case their managers can formulate
appropriate "context searches" to check up on how well the project is
being managed, and so on up the corporate hierarchy.) The important,
key point, is that everyone in the project's chain of command is
subject to accountability and scrutiny of the decisions they make.

As a side bar, note that it is not the objective to scrutinize a
decision itself, but how the decision was arrived at (and, hopefully,
all of that information is documented in the archive.) Was a decision
made with integrity? What was the "quality" of the decision? Were all
pertinent issues and interests addressed prior to making the decision?
Note that an external, and presumably unbiased (ie., without parochial
interests,) source could retrieve this information from the archive
(or have it retrieved for them) and form an opinion on the
appropriateness of the decisions, and decision making process (with or
without the knowledge of the people involved in the project.) This is
the traditional realm of OD or TQM organizations (who, presumably,
would monitor such things, "real-time," and keep upper management
apprised of the effectiveness of the project's management-these
organizations have no responsibility or authority in what was decided,
but only in the way that it was decided.)  Note that this,
potentially, presents a means of enforcing the use of the system-the
system would be used as a significant source of information-by,
perhaps, a disinterested party-in the determination of promotions,
bonuses, etc. and fits in nicely with the current concepts of "quality
management."  It is an important, key point, that the system fits in
nicely with the more contemporary management schemas. As the decision
making process is pushed down into a modern, flat organizational
structure (ie., "empowerment" is a fashionable term and "down sized"
is the reality,) this type of system is probably the only way that
management can "stay in touch" with the decision making process.  It
is also probably the only way that management can exercise any sort of
policy influence over the way the organization makes its decisions
(which will be well below upper management, hopefully.) As a
conclusion to this side bar, observe who is the audience of the
author(s) of the "library" (think about it, it is the other team
members and program monitors) which could conceivably evolve into a
largely self-directed management structure-ie., an organization that
does the most, with the least, the quickest. (The keyword here is
agility.)

Probably the system's greatest benefit can be realized when the group
(or team) consists of cross-functional personnel. Here, as an example,
Marketing would be responsible for doing media searches for pertinent
data and opinions (and placing it in the project archive where it
would be available on demand to everyone with a vested interest in the
project.) Ditto for Sales, who would be responsible for making sure
that the interests of the customer base are represented in the group's
decision making process.  And so on for the other functions
represented. Note that the document that defines the various
responsibilities of the functions should also be in the archive
(because someone may want to retrieve that piece of information,
someday.) The objective of the system is to integrate all of the
cross-functional issues in a concurrent fashion. There is no excuse
for engineering designing something that manufacturing can't make and
marketing can't market because sales can't sell it, because the
customer doesn't want it, or doesn't understand it.

Note that at this point, each of the cross-functional organizations
must go "on record" as committing to "this is what we want to do,"
(ie., an agenda has been determined.)  (This commitment is something
that upper level management will be monitoring the system for as a
necessary prerequisite for project sponsorship. They will probably
have to intervene, in the interest of expediency, to keep the
functional organizations' parochial interests from forestalling
progress.) At this point, we have an agenda, a buy-in from the
functional organizations, and a sponsor, and it is all documented and
subject to retrieval at a moment's notice-which will be of use in the
project's execution phase when directions become de-focused and the
functional organizations renege on their commitments. (Like it or not,
the reality is that these two things happen all too frequently-I have
never been involved in a project where these two things did not
happen, to a greater or lesser extent-sometimes with disastrous
consequences.)

I used the example above of schedule/milestone tracking, and how it
relates to the social infrastructure of the organization, and how it
can be monitored in an electronic "bureaucracy." By no means do I
imply that this is the only organizational process that needs to be
monitored in the archive. Another of the advantages of the proposed
system is to be able to relate organizational processes. For example,
resource "burn rate" should be monitored (time cards should be
electronic, also-which are an email into the archive containing the
number of hours spent, during the day, on a certain task.) Now the
"context frame" can be changed to include milestone progress, AND the
cost of development to obtain the milestone. And so on.
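
For example, a sketch of such a widened "context frame"-milestone
progress collated against the hours charged-might look like the
following. The Subject: conventions ("timecard <task> <hours>" and
"milestone <task> ...") are hypothetical, and are used here only to
make the collation concrete.

    #!/usr/bin/env python3
    # Sketch of relating milestone progress to burn rate in one report.
    # The timecard/milestone Subject: conventions and the archive path
    # are assumptions for illustration.

    import mailbox
    from collections import defaultdict

    ARCHIVE = "/var/mail-archive/my_project.mbox"   # assumed path

    hours = defaultdict(float)      # task -> person-hours charged so far
    milestones = defaultdict(list)  # task -> milestone subject lines

    for msg in mailbox.mbox(ARCHIVE):
        words = msg.get("Subject", "").split()
        if len(words) >= 3 and words[0].lower() == "timecard":
            try:
                hours[words[1]] += float(words[2])
            except ValueError:
                pass                # malformed timecard; ignored in this sketch
        elif len(words) >= 2 and words[0].lower() == "milestone":
            milestones[words[1]].append(" ".join(words[1:]))

    for task in sorted(set(hours) | set(milestones)):
        print(f"{task}: {hours[task]:8.1f} hours, "
              f"{len(milestones[task])} milestone(s) reported")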

**********************************************************************

John Conover writes:
> Continuing on with the application of IT, I will present an example
> usage scenario in the day to day operation of the IT system outlined
> above.  To reiterate the way things are setup, all members of the
> group (or team) have a mail alias set up in their project account that
> includes themselves, all other members of the group, and the group's
> project archive. ...
>

The previous applications involved coordinating project management
issues and execution that cross organizational boundaries.  The
concepts outlined are equally applicable to service organizations,
where "customer satisfaction" (whoever the customer may be) is an
important objective.

It is important to understand that email is different from other forms
of electronic communication in that it is a "self documenting" system.
Email is more like a "real" letter, (as opposed to something like a
phone conversation, or voice mail message,) in that the recipient has
something tangible that can be kept, filed, collated, and distributed,
electronically.  Email is also different from facsimile (FAX) in that
email that has been saved (electronically) can be searched for subject
content, and retrieved based on an arbitrary search criteria (which,
obviously, can not be done with a FAX-the keyword here is "arbitrary"
since a file cabinet can be searched by looking at folder names-but it
is not an arbitrary search where retrieval is dependent on the
"content" of the letters contained in the folders.)  Email is also
different than paper, in that it automatically carries information
about how it was routed through the organization, and a "time stamp"
of when it entered and exited each of the organization's machines (it
is also assigned a unique identifying number when it is originated.)
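
To make the point concrete, the following sketch builds a made-up
message and then reads the routing and time stamp information back out
of it, using nothing but the standard header fields (Received:,
Message-Id:, Date:) that every mail machine along the route adds or
preserves. The addresses and dates are illustrative only.

    #!/usr/bin/env python3
    # Small illustration of the "self documenting" header information
    # that email carries.  The message is made up; the header fields are
    # the standard RFC 822 ones.

    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "someone@dept-a.example.com"
    msg["To"] = "project-archive@example.com"
    msg["Subject"] = "milestone power_supply layout complete"
    msg["Date"] = "Fri, 19 Nov 1993 19:03:40 -0800"
    msg["Message-Id"] = "<9311191903.AA00123@dept-a.example.com>"
    # Each relay prepends its own Received: line (newest first), so the
    # whole route, with time stamps, rides along inside the letter.
    msg["Received"] = ("from dept-b.example.com by archive.example.com; "
                       "Fri, 19 Nov 1993 19:05:12 -0800")
    msg["Received"] = ("from dept-a.example.com by dept-b.example.com; "
                       "Fri, 19 Nov 1993 19:03:44 -0800")
    msg.set_content("Layout review of the power supply section closed out today.")

    print("Unique id :", msg["Message-Id"])
    print("Originated:", msg["Date"])
    for hop in reversed(msg.get_all("Received", [])):
        print("Routed    :", hop)               # in the order it travelled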

Usually, email, regardless of subject, is stored in a single
database. (The correct term is "full text database," but it is also
referred to as an "archive.")  It is a key point that all email that
pertains to a specific subject can be collated together, and retrieved,
by framing a search criteria. (The correct term here is "text
information retrieval," but it is also referred to as "querying," or
"electronic literature search.")

In service organizations, usually, the issue is that a lot of tasks
are being handled concurrently, by only a few people, and things "fall
in cracks," which are usually discovered in staff meetings. One
alternative, to aid task tracking, is to construct a database of all
of the organization's incoming email as follows:

        1) When an email enters the organization's network (either by
        being routed there by a manager, or directly from some other
        organization, perhaps outside the company,) the email is
        automatically put in the organization's database and assigned
        a docket number.  A copy of the email is also automatically
        routed to the organization's supervisor, who in turn, will
        dispatch the copy to the appropriate person for action. (This
        routing is all recorded in the email's "header.")

        2) When any action(s) are taken, the action(s) are recorded in
        an email, which is forwarded to interested parties (like the
        supervisor, for example,) and a copy, automatically, placed in
        the database. Any pending issues are also recorded-pending
        issues will hold the docket open.

        3) When all of the issues concerning the email have been
        resolved, an email stating such is placed in the database,
        which closes the docket.

Note that the status of all issues represented in the database can be
queried at any time (by the supervisor or manager, etc.,) and those
issues that have not been resolved, or still have pending issues can
be found and addressed. Reconciliation is accomplished by querying for
docket numbers that have been opened, and have not been closed.
Obviously, the supervisor would be doing this query on an operational
basis. Additionally, the query could be run every night
(automatically,) and the results forwarded to the manager every
morning for evaluation.
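
The reconciliation query itself is simple. Here is a sketch, assuming
(for illustration only) that the organization's database is an mbox
and that docket state changes are recorded with "docket opened
<number>" and "docket closed <number>" Subject: lines; run nightly,
its output is what would be forwarded to the manager each morning.

    #!/usr/bin/env python3
    # Sketch of the reconciliation query: find docket numbers that have
    # been opened but not yet closed.  The mbox path and the Subject:
    # conventions are assumptions for illustration.

    import mailbox

    ARCHIVE = "/var/mail-archive/service_group.mbox"   # assumed path

    opened, closed = set(), set()

    for msg in mailbox.mbox(ARCHIVE):
        words = msg.get("Subject", "").lower().split()
        if len(words) >= 3 and words[0] == "docket":
            if words[1] == "opened":
                opened.add(words[2])
            elif words[1] == "closed":
                closed.add(words[2])

    pending = sorted(opened - closed)
    print(len(pending), "docket(s) still open:")
    for docket in pending:
        print("   docket", docket)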

Note that email is never removed from the database, since it
constitutes a "history" of what the organization has accomplished (and
otherwise.) And, of course, this history could be reviewed
(periodically) by the TQM/OD (or other non-biased) organization to
make a qualitative evaluation of the organization's effectiveness and
efficiency.

I would now like to discuss some of the contemporary moral issues of
using systems like the ones outlined above. A good question is: Isn't
this system a form of "Big Brother" watching? And in some sense, it
is. However, if you look at the question from another viewpoint: If
you were running an organization, and paying for the organization's
resources out of your own pocket, wouldn't you be watching? The two
viewpoints seem contradictory and incompatible.

These questions are part of the dilemma of the "electronification" of
society. They are debated ad infinitum on the Internet and constitute
only a fraction of the general issues involved with the integration of
computers and networks into society.  These are not easy questions,
and there are no easy answers.  Leading one side of the argument is
the "Electronic Frontier Foundation," founded by Mitch Kapor, (who
founded Lotus corporation, and wrote Lotus 1-2-3.) The EFF attitude is
very liberal-supporting privacy. On the other side are organizations
where everything is open and known (or available to be known) via
policy, by all in the organization. Certain government laboratories
(Lawrence Livermore, for example,) are run this way. Even everyone's
salary and benefits are published and available to everyone else.

In some sense, the systems constructed around email are different. Of
course things are tracked and performances evaluated.  (And that is
true of any management tool, like MBO, for example.) But on the other
hand, these systems are a kind of "electronic democracy."  For
example, in the system discussed above, (that would be useful in a
service organization,) if the personnel were overloaded, they would be
foolish not to write an email to the database declaring such. Note
that in doing so, they "went on public record" and "declared a stand"
on the issue. The capability to do this is a very democratic ideal,
(note that the supervisors/managers have no direct authority to remove
any data-this or otherwise.) Those who claimed that they were
overloaded with work (whether they were, or not) have the means at
their disposal to make the alleged conditions an issue within the
organization.

When you really consider what the email based systems do, and how they
operate, they are really not so different from the memo/letter based
systems used in commerce for centuries. What is different is the
efficiency of transmission and distribution, and the ability to
collate information from the database system on demand (by framing a
"context" query.) In a paper based system, you retrieve things based
on how it was stored-ie., by looking at the titles on the folders in a
file cabinet. Cross indexing letters and memos enhances the capability
to retrieve information (albeit with significant allocation of
resources to do it.) There is not very much that is different in any of the
email systems discussed here, except that the process is automated,
inexpensive, and fast. (In case you are curious, these systems also
cross index-but they cross index every word in every letter/memo, in
every file-that is how the internals of the systems work-nothing
more.)

**********************************************************************

John Conover writes:
> The previous applications involved coordinating project management
> issues and execution that cross organizational boundaries.  The
> concepts outlined are equally applicable to service organizations,
> where "customer satisfaction" (whoever the customer may be) is an
> important objective.

The previous applications offered a "how to" "cook book" approach to
the integration of IT into the organizational decision making process.
A good question should be addressed, at this time, as to why one would
want to do so. To answer this question, I will offer a rather pompous
analytical derivation, and then discuss the conclusions, relating the
perspective to a typical organization, in (hopefully) a way that
conceptual conclusions can be drawn as to the applicability of IT to a
specific environment.

The specific environment that will be addressed is the electronic
systems design environment, since it is a straightforward procedure
to evaluate the "complexity" (ie., the amount) of intellectual work
that has to be accomplished to complete a design project.

As a side bar for the un-initiated, a "register" is an electronic
memory element that contains exactly 1 "bit" of information-like a RAM
memory element in a personal computer, or a data flip flop in an IC. A
"bit" is the basic unit measure of electronic information. In most
contemporary digital electronic systems, each "bit" has two mutually
exclusive "states," ie., it can be in "state" 1, or "state" 0,
(representing "true" and "false," respectively) but never both at the
same time, and nothing in between. So one "bit" of information can
contain two "states." Two "bits" of information can contain four
"states," since each of the first "bit's" two "states" can be
associated with each of the second "bit's" two "states," and so on.

Some Theory:

    Consider an electronic system.

    Let B = The number of registers (bits) in the system.

    Let S = The number of states in the system.

    Let I = The information in the system, (ie., Hartley Information.)

    Then from Information Theory:

    B = ln (S) / ln (2) = I.

    Or alternately (by solving for S):

    S = e^(B * ln (2)) = e^(I * ln (2)).

    Therefore, the number of states that the electronic system can
    assume is exponential on the number of registers (bits) in the
    system.

    Again, as a side bar for the un-initiated, you can understand the
    essence of the last equation by observing the following table that
    relates the number of bits in a system to the number of states
    that the system can have:

               BITS   STATES

                0       1
                1       2
                2       4
                3       8
                4       16
                5       32
                6       64

    It is a key concept that the number of states doubles every time
    the number of bits is increased by one-that is all that the last
    equation says.
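
If you want to check the table and the equation for yourself, the
following few lines (Python) tabulate the doubling and recover B from
S; the numbers are exactly the ones above, nothing more.

    # Tiny illustration of B = ln(S)/ln(2): tabulate the number of states
    # a system of B bits can assume, and recover B from S.

    from math import log

    for bits in range(7):
        states = 2 ** bits
        print(f"{bits} bit(s) -> {states:3d} state(s); "
              f"ln({states})/ln(2) = {log(states) / log(2):.0f}")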

Now the question can be formulated:

    Do the engineering resources required to design a system grow
    with the number of registers (bits) in the system, or the number
    of states in the system?

Some Discussion:

    Obviously, a 4 bit counter is more difficult to design than a 4
    bit data latch, because we have the 16 state transitions to
    consider in the counter.

    Consider 4 data latches. If the latches operate autonomously (as
    in latching data on a bus,) adding a 5th latch would take about
    25% more design resources, if the latches were designed
    individually, (ie., the design resources would be linear on
    "complexity.")

    But, if the latches were not operated autonomously (as in a
    counter, where the next state of the latches is determined by the
    current state of the latches,) and we are required to analyze all
    possible state transitions (in addition to designing the latches,)
    then adding a 5th latch would take 100% more design resources
    since we have 32 state transitions to consider instead of 16 (ie.,
    the design resources would be exponential on "complexity.")

I chose the electronic systems design environment as an example,
because it is well documented, and has been the subject of scrutiny of
many studies.  (The studies, in general, support the above theory.)
The next obvious question is: does the same thing happen in other
organizations? To answer the question, we re-phrase the above
theoretical analysis.

Some More Theory:

    Consider a memo based administrative system-each memo is assumed
    to contain subject matter on only one subject, and this subject
    matter can be about only one of the two issues that the
    organization is addressing.  (I realize that this is an
    oversimplification of a how a real organization works. However, it
    should be enlightening to consider that, even in such a simple
    model, the resources required for the organization to operate can
    be exponential on the number of memos in the administrative
    system-as opposed to linear, which is the intuitive deduction.)

    Let B = The number of active memos in the administrative system.

    Let S = The number of states in the administrative system. (If you
    want to conceptualize this, assume that the administrative system
    is in one state, and we then add one memo to the system-then
    certainly the situation has changed, ie., the system is in a
    different state.)

    Let I = The information in the administrative system.

    Then from Information Theory:

    B = ln (S) / ln (2) = I.

    Or alternately (by solving for S):

    S = e^(B * ln (2)) = e^(I * ln (2)).

    Therefore, the number of states that the memo based administrative
    system can assume is exponential on the number of active memos in
    the system.

So we can see that, apparently, the same thing that happens in
electronic systems design organizations can happen in other
organizations as well.

The original question was:

    Do the engineering resources required to design a system grow
    with the number of registers (bits) in the system, or the number
    of states in the system?

Or to re-phrase the question:

    Do the engineering resources required to design an electronic
    system grow exponentially on the "complexity" of the design?

The answer is, no. Not necessarily.

This is the fundamental premise of information theory. Information
theory states that although the number of states in the electronic
systems grows exponentially on the number of registers in the system,
there exists a methodology that will handle the situation with
resources that are linear on the number of registers in the system,
and not the number of states.

I will return to these "information theoretic concepts," but first I
would like to explore the way that the electronic systems design
organizations approached these issues.

In reality, electronic systems design organizations do not achieve the
theoretical lower limit of linear resources on the number of registers
in the system. The best come close. The not so good are closer to
exponential resources on the number of registers in the system.

What the electronic systems design organizations do is to automate the
design process using "Finite State Machine" (FSM) transition diagrams
(perhaps, among other things.) Then using Boolean algebra, the design
process can be made linear on the number of registers in the system-at
least in principle. (If you are a designer, try to imagine the
difficulty of designing a complex digital electronic system without
these two "tools" or concepts.) It is an important key concept that
the way that the issues of complexity are addressed is by automation,
(that is why it is called Design Automation, or DA for short.) It is
an equally important concept that the design automation did not
automate the design, per se, but automated the design process.  It is
a key subtle difference.

As an unrelated side bar, the key question to be asked when specifying
or purchasing design automation software is "does it automate design
or does it automate the design process?" If the design is complex,
then the answer had, obviously, better be the latter.

To this point, we have determined that information theory states that
a project can be completed with linear resources on the project's
complexity (if you know how to do it,) and the best organizations come
close to this limit. The not so good are closer to exponential
resources on the project's complexity (because they don't know how to
do it.) The question is how to avoid the latter. It is obviously a
question of "know how," which was termed "knowledge" and/or
"capability" in the previous applications. The important key point is
that this is an intellectual activity. This intellectual activity
requires knowledge of the organization's dynamic or strategic
"agenda," so that managers can influence changes, as appropriate. As
previously described, the way that we do this is to establish a system
that essentially automates the documentation of the "work flow"
through the organization. (Or more correctly, in an idealized concept,
it automates the "work flow" process.)

There is a problem, though. It would probably be reasonable to assume
that an organization that is requiring exponential resources on
complexity to accomplish its agenda, would also generate information
that is exponential on this complexity.

That is the inherent advantage of the system described in the previous
applications. It exhibits linear characteristics where information is
increasing at an exponential rate. I will illustrate how, with the
child's game of "20 questions." This will explain why "context framed
searches" of the organization's "work flow" documentation (ie., email)
are an effective alternative in this situation.

In the classic "20 questions" game, (which has kept countless children
occupied on long journeys,) the parent would say "I'm thinking of
something, you have 20 questions that you can ask me to find out what
it is." The child would then possibly reply, "is it in the car?" The
parent would answer appropriately. Then the child would narrow down
the search with another question, and so on. This kind of "search for
an answer" schema is a very powerful "tool."  The child's best
strategy is to "frame" the questions such that half of the
possibilities are eliminated with each question.  Technically, this is
what is known in computer jargon as a "binary search," (because you
are always dividing the possible answers in half with each question-or
alternatively, twice the number of possible answers can be handled by
adding only one more question.) It is also a schema that can be
readily automated by computing machinery.
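
The following sketch makes the arithmetic of the strategy explicit: a
binary search over N possibilities needs only about log2(N) questions,
so roughly a million possibilities can be isolated with just 20. The
code is an illustration of the strategy, not of any particular
retrieval program.

    # Sketch of the "20 questions" strategy as a binary search: each
    # question ("is it greater than the midpoint?") eliminates half of
    # the remaining possibilities, so doubling the number of
    # possibilities costs exactly one more question.

    def questions_needed(possibilities, answer):
        """Count the yes/no questions needed to isolate `answer` in 1..possibilities."""
        low, high, asked = 1, possibilities, 0
        while low < high:
            mid = (low + high) // 2
            asked += 1                      # "is it greater than mid?"
            if answer > mid:
                low = mid + 1
            else:
                high = mid
        return asked

    for n in (16, 32, 64, 1024, 1048576):
        print(f"{n:8d} possibilities -> {questions_needed(n, n)} questions")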

To be more precise about it, with each question that the child asks,
more "context" is gained about what the parent is thinking about.  In
the above applications, this is what was called "context framework,"
or to quote from one of the applications outlined above:

    "The system differs slightly in that the original requests and
    responses concerning project status are archived in a manner that
    they can be retrieved, electronically, within a "context
    framework," in an expedient fashion (perhaps by upper management,
    or the managers of other related cross-functional departments who
    have a vested interest in the status of the project.)"

It is an important key concept, that as the amount of information in
the archive doubles, you have to increase the number of questions that
are asked by only one, (ie., as the amount of information grows
exponentially, the number of questions that need to be asked to arrive
at a specific document/conclusion grows linearly.)

Note that the key is being able to formulate a "context
framework." Or to put it simply, the question is "what question do I
ask?" Or again to quote:

    "Additionally, and this is another key point, if an issue comes up
    in the project schedules, (slippage, etc.)  a manager can
    formulate another "context search" for an additional retrieval,
    that addresses the specific issues in question from the project
    status reports."

The formal definition for this process is "relevance feedback," which
is an iterated search technique (similar to the child's strategy,
above.) There are several methodologies to do this, one is to query
the archive for the count of certain words or phrases, and start with
the most likely document, ie., the one with the most occurrences of
the word or phrase.  Another is what is called "proximity search,"
which will search for phrases and words that are "close" together in
specific documents.  Still another, is to print the text that
surrounds specific words or phrases from specific documents-this is
known as a "permuted index."

With this information, new queries can be formulated (possibly using a
"natural language" boolean query) to eliminate unlikely candidate
documents from the electronic literature search, and narrow down the
search until you finally arrive at a specific answer/document/concept.
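
As a concrete (and deliberately crude) sketch of the first of those
methodologies, ranking documents by the count of the query words,
consider the following. The archive layout and the query terms are
assumptions for illustration; a real retrieval system would use an
inverted index rather than re-reading every document on every query.

    #!/usr/bin/env python3
    # Sketch of the word count flavor of "relevance feedback": rank the
    # archived documents by how often the query terms occur, so that the
    # searcher can start with the most likely document, then reframe the
    # query and run the ranking again.

    import os
    import re

    ARCHIVE = "/var/mail-archive/my_project"    # assumed directory
    QUERY = ["power", "supply", "slippage"]     # terms in the current "context frame"

    scores = {}
    for name in os.listdir(ARCHIVE):
        path = os.path.join(ARCHIVE, name)
        if not os.path.isfile(path):
            continue
        with open(path, errors="replace") as f:
            words = re.findall(r"[a-z]+", f.read().lower())
        score = sum(words.count(term) for term in QUERY)
        if score:
            scores[name] = score

    # Most likely documents first.
    for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{score:4d}  {name}")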

And that is the theoretical reason why the system works-and why it is
applicable as a management tool in organizations that have to deal
with "complexity" issues.

**********************************************************************

John Conover writes:
> The previous applications offered a "how to" "cook book" approach to
> the integration of IT into the organizational decision making process.
> A good question should be addressed, at this time, as to why one would
> want to do so. To answer this question, I will offer a rather pompous
> analytical derivation, and then discuss the conclusions, relating the
> perspective to a typical organization, in (hopefully) a way that
> conceptual conclusions can be drawn as to the applicability of IT to a
> specific environment.

This is the research literature bibliography that was used in the
"IT uses" applications.

References relating to the global economy, the post industrial
revolution, and the information age.

1) "The end of History and the Last Man," Francis Fukuyama, Free
   Press, (a Division of Macmillan,) New York, New York, 1992.

   Mr. Fukuyama was the Deputy Director of Planning of the U.S. State
   Department's Policy Planning Staff. The book was authored at the
   Rand Corporation, and is an extension of the work done by Mr.
   Fukuyama while at the U.S. Department of State. This book is very
   difficult to read due to the convoluted presentation of the subject.
   It is rich in the empirical and theoretical directions of the world
   economy, and includes both in historical perspective. The world
   economy, the third world economy, and the U.S. contemporary economy
   are related in their geopolitical sense to capitalism and liberal
   democracy, and a future course of events is anticipated, based on
   the historical perspective. This book should be required reading
   for anyone responsible for strategic planning in a global economy.
   In general, to summarize the book, Mr. Fukuyama claims that the
   world economy is in a state of transition, as the third world
   countries enter the industrial revolution with their cheap labor
   rates (Japan has done it and succeeded, Taiwan is about to do it.)
   He further claims that the U.S. economy is in transition, as it
   leaves the industrial revolution, and enters the post-industrial-
   revolution (ie., the information age,) and joins the countries that
   have already done so, Germany, etc. In this new role, the majority
   of the economy would "add value to goods manufactured in other
   countries," and it is the point of the book, that this can only
   happen in a decentralized economy (ie., capitalism,) and liberal
   democracy. He cites the failure of the USSR, and Mainland China as
   examples of societies that can not break out of the industrial
   revolution, and move forward into a modern techno/informational
   society.

2) "Sunburst: The Ascent of Sun Microsystems," Mark Hall, John
   Barry, Contemporary Books, Chicago, Ill., 1990.

   This is the official history of Sun Microsystems. Of particular
   interest is the reasons for Sun's success in view of global
   economic agenda outlined by Mr. Fukuyama, above. This is a
   non-technical book, and enjoyable reading. It should be required
   for anyone responsible for strategic marketing in a global economy.
   There is a presentation of the personalities involved in the
   company, and the way that they cooperated to form and grow the
   enterprise. Of particular interest is Sun's concept of what
   computing in the present and future is all about. I include this
   reference because it is the history of a prototype company that is
   exploiting the situation of the post-industrial-revolution, ie., the
   information age, see Fukuyama, above. (Today, manufacturing
   constitutes about 10% of the GNP, services the rest, ref. U.S.
   Dept. Interior.)

3) "Hard Drive," James Wallace, Jim Erickson, Wiley, New York, New York,
   1992.

   This book is the official history of Microsoft. Of particular
   interest is how Microsoft was established to take advantage of the
   information age (this was Paul Allen's dream-he had to coerce Bill
   Gates at the time.) It is uncanny how consistent the success of
   Microsoft is with Mr. Fukuyama's work, above. This book is enjoyable
   reading, and should be read by anyone responsible for strategic
   and/or tactical marketing in the information age. Of particular
   interest is Microsoft's concept of what computing in the present
   and future is all about-tactically, it is in contradiction with Sun
   Micro., above, but strategically they are the same. Ditto the last
   two sentences describing the Sun Microsystems entry above.

4) "Father Son & Co.," Thomas J. Watson Jr., Bantam Books, New York,
   New York, 1990.

   This book is the autobiography of the man that took IBM into the
   information age, from the mechanical tabulator age. A history of
   IBM, and how T. J. Watson Sr., built it is presented. A history of
   IBM sales/service is initiated with a detailed account of Mr. John
   Henry Patterson's sales techniques at NCR. (FYI Patterson invented
   the modern sales techniques used in the U.S. today-he is the
   founding father of salesmanship/service.) I include this reference,
   because it is the history of a company that exploited the situation
   at the end of the industrial revolution of a large society, see
   Fukuyama, above. (In IBM's heyday, manufacturing constituted about
   50% of the GNP, services the rest.)

5) "My Years with General Motors," Alfred P. Sloan, Doubleday, New York,
   New York, 1963.

   I include this as a reference solely because of the book's
   historical importance. (See the Foreword by Peter Drucker.) This
   book is the recollections of the man that created the largest
   corporation in the world, and how he did it. Of particular interest
   is his views on management through policy and committee, his
   tribulation with financial interests and control, and the board-and
   how he handled them for 40 years. This book is interesting reading
   to anyone trying to understand the history of American management
   paradigms.  Ditto the last two sentences describing IBM above.

6) "The Virtual Corporation," William H. Davidow & Michael S. Malone,
   Harper Business, New York, New York, 1992.

   Good book on what modern business is like in the 1990's. Probably a
   good book on how the U.S. economy relates to the global economy and
   what American business has to do to survive. Explores the changes
   that must take place in management, organization, engineering and
   the market place in the new intellectual oriented businesses.
   Explores the value of Information Technology to make an intelligent
   enterprise.

7) "Reengineering the Corporation," Michael Hammer & James Champy,
   Harper Business, New York, New York, 1993.

   Good book on customer satisfaction and how to implement it to save
   operations expenses. Very critical of contemporary American
   business.  Claims that we are entering the 21'st century with an
   organizational concept that was designed in the 19'th century. Good
   documentation of successes the authors have had with the concept.
   Goes into detail on what "empowerment" means, and how it is in
   contradiction with the management concepts of the industrial
   revolution, which the authors claim still dominates American
   management.

8) "Intelligent Enterprise," James Brian Quin, Free Press, New York,
   New York, 1992.

   Good book on knowledge based services, and how they are supposedly
   revitalizing the economy. Author supports the view that knowledge
   and service based core competencies are the essence of the future.
   Fair analysis of the economic benefits of this strategy and how it
   can leverage market penetration and share. Analysis of Wal-Mart,
   Merck, Honda, Apple, Boeing, etc. Good detail on radically new
   organizational structures, e.g., inverted, starburst, spiderweb,
   etc. Probably a book to read.

9) "No Excuses Management," T. J. Rodgers, William Taylor, and Rick
   Foreman, Currency Doubleday, New York, New York.

   Good reading if you enjoy T. J. Rodgers' philosophy. Explains
   novel uses of email as a tool for project management. Probably an
   important book with valuable insight if you are in charge of an
   organization that is stuck on high center. Probably practical
   management insight.

10) "Computer Augmented Teamwork," Edited by Robert P. Bostrom,
   Richard T. Watson, Susan T. Kinney, Van Nostrand Reinhold, New York,
   New York, 1992.

   Excellent book on how to make teamwork happen over a network. Very
   authoritative. Contains the Internet addresses of the "who's who" of
   IT. Good on details of implementation. All contributors are from
   the field of team technology. Offers descriptions of commercial
   products available. Good lay descriptions of technical attributes
   of group/team software.

References relating to group dynamics, management, and sociology of
the modern enterprise.

1) "Developing Products in Half the Time," Preston G. Smith, Donald G.
   Reinertsen, Van Nostrand Reinhold, New York, New York, 1991.

   Good book on how to organize and manage an engineering group to
   expedite the concept-to-market cycle. Explains the benefits of
   doing this, but doesn't explain why one would want to. Suggests
   concurrent design methodologies in the appendix, but says that the
   customer should be involved in the conceptual stage. Starting on
   page 134, outlines structure alternatives-this part is worth
   studying.  (We have a copy with highlighted context for quick
   reference.)

2) "Computer-Supported Cooperative Work: A Book of Readings," edited by
   Irene Greif, Morgan Kaufmann Publishers, San Mateo, Ca., 1988.

   A dated book, but worth reading some sections. Primarily a
   justification for the way Lotus set up its development
   organization. This is probably the first book to use the term
   "groupware," (page 9,) and has some implications for working
   together at a distance (section 9.) Some of the sections are on
   (administrative) office procedures. Section 9 addresses the social
   implications of "groupware," and is rather cursory. Section 20 is
   on the implications concerning organizations and management-very
   well done. Section 21 is specifically addressing the organization
   and its value added information technologies to the global market
   situation-probably the first time this was addressed in any tech.
   pub.-required reading and is still current. Section 25 is on social
   context of electronic communications-so-so, but probably worth the
   time spent to read it.  (We have a copy with highlighted context
   for quick reference.)

3) "In the Age of the Smart Machine," Shoshana Zuboff, Basic Books, Inc.,
   Publishers, New York, New York, 1988.

   A book that is still important. Shoshana is an Associate Professor
   at Harvard, with credentials in sociology and business. A truly
   excellent, (and boring) book describing the pitfalls of the
   application of technology through history, and projects, from the
   historical perspective, what information technology is going to do
   to society. Particularly interesting on the sociological
   implications of the industrial revolution on organizations.
   Probably the first book to mention that there is something
   following the industrial revolution, (the "post-industrial
   revolution," fancy that, ie., the information revolution.) Should
   be required reading for anyone with decision responsibilities in
   the information age-strictly non-technical.

4) "Intellectual Teamwork: Social and Technological Foundations of
   Cooperative Work," Edited by Jolene Galegher, Robert E. Kraut, Carmen
   Egido, Lawrence Erlbaum Associates, Hillsdale, New Jersey, 1990.

   Good book on social processes, group dynamics, and organizational
   dynamics in the information age. Non-technical, addressing the
   social sciences-excellent empirical studies and bibliography.
   Suggests ways to measure the effectiveness of information and
   inter-computing.

5) "5th Generation Management: Integrating Enterprises through Human
   Networking," Charles M. Savage, Digital Press, 1990.

   Tom Peters considers this to be the "Book of the Year."  Good book
   on what's wrong with management. Kind of a "hippie" book, (ie.,
   "down with everything.") Truly excellent on the evolution of the
   steep hierarchies in the industrial revolution (chapter 8,) and why
   they don't (and in his opinion, will never) work. This is basically
   a good book, (and is required reading for all of my staff.) My
   problem is that it is a very negative book that harps on what's
   wrong, and then on the way things should be (probably with some
   validity)-but does not address how to get there. Gives an example
   of a corporation that has spent considerable resources on an MIS
   system that is inadequate for the corporation's needs. Outright
   calls the organization an MIS state, run by an information czar.
   (We have a copy with highlighted context for quick reference.)

6) "Enterprise Networking: Working Together Apart," Ray Grenier,
   George Metes, Digital Press, 1992.

   Truly excellent. Required reading for all of my staff. A very
   practical book, that is well written and informative on the social,
   organizational and technical aspects of competing in a global
   economy. Explains why one would want to do it that way, what would
   happen if you don't, and the way to invoke change to get there.
   Based on the quarter century of experience of Douglas C. Engelbart
   in doing it. (FYI, Doug is credited with the invention of 1) the
   mouse, 2) the personal computer-the MAC and PC are copies that the
   courts have said cannot be patented because of previous work done
   by Doug, 3) windows-ditto, 4) pull down menus-ditto, 5)
   hypertext-ditto.) Good section (chapter 14) on benchmarking the
   organization. Good section (chapter 13) on the effect of
   information technology on quality. Very big on capability
   environments, and their application to the global competitive
   situation. The chapter on implementation, (referencing Engelbart
   himself, and the various engineering groups he has had
   responsibility for,) is truly a masterpiece of simplicity.  This
   book is the Bible of concurrent engineering, distributed
   information networks, inter-computing, and simultaneous distributed
   work-the thing that ties marketing, sales, engineering, etc.
   together into a coherent organization.  (We have a copy with
   highlighted context for quick reference.)

7) "Home Work," Phillip E. Mahfood, Probus Publishing Co., Chicago,
   Illinois, 1992.

   A book that interested us in using tele-computing from home to form
   a distributed company. (Rumor has it that because of the
   environmental issues, California will be going to a 3 or 4 day work
   week within the next 5 years, 2 years in L.A.-we were curious as to
   how to manage the situation, and if tele-computing was applicable.)
   Of interest is the so-so success of tele-computing in Europe (they
   are ahead of us in these developments.) Of particular interest are
   the European "Work-O-Tels" that address the same issues, but avoid
   the pitfalls of the implementations that we are experimenting with.
   Excellent on the legal implications of tele-computing, and its
   global implications. Particularly good on who, and why (and who
   not) to select to work from home.  (We have a copy with highlighted
   context for quick reference.)

8) "Leadership and the Computer," Mary E. Boone, Prima Publishing,
   Rocklin, Ca., 1991.

   Good book on managing by information, and how to exercise
   leadership in the information age. Many case studies involving
   sophisticated information systems, and simple ones. Many interviews
   with CEO's from the Fortune 500 list. Not necessarily a "how to"
   book, but does outline the way others have done it. Good reading
   for upper level management in a modern company.

9) "Connections," Lee Sproull, Sara Kiesler, MIT press, Cambridge,
   Massachusetts, 1991.

   Good book on the sociological implications of inter-computing, and
   how to avoid the pitfalls (required reading for my staff.) The
   treatment of electronic group dynamics (chapter 4) is very good.
   That technology (such as inter-computing) always comes as a
   two-edged sword, with good and bad aspects, is well presented in
   chapter 1. Particularly well written chapter on control and
   influence (chapter 6.) Explanation of why one would want more than
   just efficiency is also well presented (chapter 2.) Implementation
   details and strategy are particularly weak.  (We have a copy with
   highlighted context for quick reference.)

10) "The Corporation of the 1990's," Edited by Michael S. Scott
   Morton, Oxford University Press, New York, New York, 1991.

   Excellent book on using electronic media for collaborative
   research.  The editor is Dean of the MIT Sloan School of Management.
   Excellent on the history and uses of IT, and on the organizational
   changes that must accompany the integration of IT into the
   contemporary organization. Well researched. Easy to read.
   (We have a copy with highlighted context for quick reference.)

11) "Paradigm Shift," Don Tapscott, Art Caston, McGraw Hill, New York,
    New York, 1993.

    Another good book on IT. Good for the non-technical. Well
    researched with good details of the various studies of integrating
    IT into an organization. Discusses the work-group concept and how
    technology can be applied to increase productivity and how IT can
    be used to "integrate the organization." Good book for the
    uninitiated on the terminology and concepts of IT.

12) "The TeamNet Factor," Jessica Lipnack & Jeffrey Stamps, Oliver
    Publications, Essex Junction, VT., 1993.

    Good book on establishing teams that are networked together across
    functional organizations. Good on implementation. Studies are
    cited from Europe and U.S., both large and small companies.
    Probably should be required reading for modern managers that have
    to manage through networked technology.

13) "A Model for Distributed Campus Computing," George A. Champine,
   Digital Press, 1991.

   I reference this book because it is the conceptual model of the
   "Internet." (FYI the "Internet" is one of the technological marvels
   of the 20th century-it is a high speed network that links over 2
   million computers, and an estimated 28 million people, together
   with a high speed WAN-10 meg/sec.-and extends from Europe, through
   the Americas, and to the Pac. Rim. Computer resources are shared
   across the network. It is funded by mandate from the U.S. Congress,
   and administered by the NSF, after being developed by DARPA.)  Of
   particular interest are the authentication procedures used, as
   defined in Project Athena. This book is rather academic in
   nature, and probably of no interest to anyone in management.

14) "From Memex to Hypertext: Vannevar Bush and the Mind's Machine,"
    Edited by James M. Nyce and Paul Kahn, Academic Press, New York,
    New York, 1991.

    I include this book because of its historical perspective. It
    re-publishes many of the original works of Vannevar Bush. (Frederick
    Terman was one of Vannevar Bush's students at MIT-Bush had the
    notion that industry and academia should team up, and pressed the
    issue with Terman. Terman is the "Founder" of Silicon Valley.)
    The book's historical value is that it was Bush who first
    proposed (in the mid 1940's) that a computer could be used to
    manipulate a full text database system. The proposed system was
    called Memex, which evolved into the Hypertext system that is
    available on the Apple/MAC. It is an important historical
    perspective.

15) "Engineering Information Management Systems," John Stark, Van
   Nostrand Reinhold, New York, New York, 1992.

   Excellent book on the technical details of specifying a concurrent
   engineering support database. The issues addressed are not, by any
   means, trivial. Book was written in Switzerland, where most of the
   commercial software that addresses concurrent engineering is
   written. Good reading if you are designing an engineering MIS
   system.  (We have a copy with highlighted context for quick
   reference.)

Detailed references relating to the theoretical aspects of the above
listings.

1) "Games and Decisions," R. Duncan Luce, Howard Raiffa, John Wiley and
   Sons, New York, New York, 1957.

   The classic (it is back in print, by the way,) on game theory, and
   optimization techniques as used in the social sciences, by the two
   Rand Corporation theorists who worked under Von Neumann when
   developing the science. The book is a critical survey, on where and
   when such techniques can be applied effectively. It is not a book
   of zealotry, and outlines (liberally) the limitations of the
   science. It specifically states that game theory may be of little
   use to the military strategist, but may become important as a tool
   for the social sciences. It analyzes democracy and tyranny, (and
   BTW the axiomatization of committee decisions is intriguing.) The
   book is well written, and the accompanying description of the
   rather pompous mathematics is easily read by the non-technical
   person. The math is involved: a complete grasp of linear algebra,
   mathematical programming, and the calculus is mandatory, and a
   conceptual grasp of set theory and the principles of
   axiomatization is required for an in-depth study of the book. The
   book starts with the classics, zero-sum games, the prisoner's
   dilemma, etc., and concludes with the multiple player non-zero-sum
   games, and the axiomatization of group decisions, etc. Many of the
   axiomatization principles used were commissioned from the Rand
   Corporation by the Dept. of State and the DOD in the 1950's and
   1960's. (A short worked sketch of a simple two-person zero-sum game
   appears after this reference list.)

2) "Mathematical Methods of Operations Research," Thomas L. Saaty, Dover
   Publications, New York, New York, 1959.

   This book, also a classic, (also back in print,) is almost a
   companion book to Luce (above.) It is a numerical methods book on
   the implementation of the algorithms outlined in Luce, with some
   additional work on queueing theory. A book for the serious person
   involved in operations research-it would take a determined person
   to wade through the pomposity of the mathematics involved.

3) "Introduction to Minimax," V.F. Demyanov and V.N. Malozernov, Keter
   Publishing House, Jerusalem Ltd., 1974.

   I mention this classic reference because it lays the groundwork
   for the book by Luce, et al. It is an extension of Von Neumann's
   original work with the economist Morgenstern in the late 1930's
   which axiomatized the economic principles as we understand them
   today (or don't, depending on your point of view.) This book has
   made many good engineers out of not so determined mathematicians.
   It was a forerunner to Rene Thom's catastrophe theory, (out of the
   Institut des Hautes Etudes Scientifiques, near Paris, France,) which
   is highly regarded as a new way of analyzing things economic, etc.

4) "Mathematical Programming and Games," Edward L. Kaplan, John Wiley
   and Sons, 1982.

   A more recent book on the above theoretical topics that has a
   great explanation of the economic dual in linear programming.
   Duality is the relationship between optimal production and the
   marginal values of the things that go into that production; this
   book spends a lot of time on this important issue, where the above
   references are inadequate (at least in my humble and inconsequential
   opinion.) (A small numeric sketch of a primal/dual pair appears
   after this reference list.)

5) "Introduction to Operations Research," Frederick S. Hillier, Gerald
   J. Lieberman, McGraw-Hill, New York, New York, 1990.

   This is the text for the course at Stanford University, and comes
   complete with a disk of programs so that one can play with the
   science. (Stanford is very big on OR, as a matter of fact, it was
   invented there in the late 1940's under contract to the DOD, by
   George Dantzig, who is still on staff.) I reference this book not
   because of its technical value, but because it is more modern than
   those listed above, and it is a fairly complete compendium on the
   science. It is, however, not very rigorous.

6) "Searching for Certainty," John L. Casti, William Morrow, New York,
   New York, 1990.

   A truly great book. It is probably a fair appraisal of the
   capability of mathematical science to predict things-and why
   mathematics works at all. Casti is formerly of the Rand
   Corporation and, following technology to Europe, was one of the
   first staff members of the International Institute of Applied
   Systems Analysis (IIASA) in Vienna, Austria. He is now on the
   faculty of the Technical University of Vienna.  This book is not
   only readable, but also entertaining-it is a must for any
   scientific zealot to put 20th century science into perspective.  It
   is easy reading and interesting, presenting current scientific
   thought in the historic perspective of science's successes and
   failures in the last half of this century. In a previous book,
   "Alternate Realities," (John Wiley and Sons, New York, New York,
   1989,) he teaches (this is a text book by the way, and more
   technical than his current book) the correct way to axiomatize
   (ie., model) the systems listed above in this section. It turns
   out, that this is a non-trivial exercise, and accounts for many of
   the failures (of things like the theory of chaos, etc.) A must for
   anyone modeling organizations, etc.

7) "Information-Theoretic Incompleteness," G. J. Chaitin, World Scientific,
   New Jersey, 1992.

   Probably one of the most important books of the 20th century. The
   implications of this work are still not generally understood. It is
   easy to read, and the author spends a significant part of the book
   explaining the issues to a lay audience. The formal sections are
   not for the uninitiated, by any means.

8) "Fuzzy Sets, Uncertainty, and Information," George J. Klir and Tina
   A. Folger, Prentice Hall, Englewood Cliffs, New Jersey, 1988.

   Excellent book on the application of information theory. The authors
   are quite distinguished in the field, and the text is interesting.
   Again, like 7) and 6) above, the limits of science (at least as we
   understand them today) are investigated. If you want to explore
   what information theory is "all about," this is an excellent
   choice. Very practical, and well written.
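
Since references 1) and 3) above turn on the minimax idea, the
following is a minimal worked sketch (in Python) of a simple
two-person zero-sum game, intended for the non-technical reader. The
payoff numbers are hypothetical, and the closed form used is the
standard one for 2x2 games without a saddle point-it is an
illustration only, and is not taken from any of the texts above.

    # Minimal sketch: mixed-strategy solution of a hypothetical 2x2
    # zero-sum game, using the standard closed form for games with no
    # saddle point. The payoff numbers are made up for illustration.

    def solve_2x2_zero_sum(a11, a12, a21, a22):
        """Return (p, q, value): the row player's probability of
        playing row 1, the column player's probability of playing
        column 1, and the value of the game. Assumes the game has no
        saddle point, so both players must mix."""
        denom = a11 - a12 - a21 + a22
        p = (a22 - a21) / denom                    # row player's mix
        q = (a22 - a12) / denom                    # column player's mix
        value = (a11 * a22 - a12 * a21) / denom    # value of the game
        return p, q, value

    # Hypothetical payoffs to the row player:
    #            col 1   col 2
    #   row 1      3      -1
    #   row 2     -2       4
    p, q, value = solve_2x2_zero_sum(3, -1, -2, 4)
    print("row 1 is played with probability", p)      # 0.6
    print("col 1 is played with probability", q)      # 0.5
    print("value of the game is            ", value)  # 1.0

Either player can guarantee the value of 1.0 (to, or against, the row
player) by mixing as shown, which is the minimax argument in
miniature.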

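Because reference 4) above leans so heavily on the economic dual, the
following is a minimal numeric sketch (in Python) of a primal/dual
pair. The production numbers are hypothetical, the primal is solved by
a brute-force grid search (adequate for two variables, but not a real
LP solver,) and the dual solution shown was computed by hand-it
illustrates the duality relationship only, and is not any particular
text's method.

    # Minimal sketch of linear programming duality, with hypothetical
    # numbers.
    #
    # Primal:  maximize  3x + 5y
    #          subject to      x       <=  4   (resource 1)
    #                               2y <= 12   (resource 2)
    #                          3x + 2y <= 18   (resource 3)
    #                          x, y >= 0

    from itertools import product

    def feasible(x, y):
        return (x >= 0 and y >= 0 and
                x <= 4 and 2*y <= 12 and 3*x + 2*y <= 18)

    # With only two variables, a coarse grid search over the feasible
    # region is enough for illustration (it is not a real LP solver).
    grid = [i / 10.0 for i in range(0, 61)]          # 0.0, 0.1, ..., 6.0
    z, x, y = max((3*x + 5*y, x, y)
                  for x, y in product(grid, grid) if feasible(x, y))
    print("primal optimum:", x, y, "objective =", z)  # 2.0 6.0 36.0

    # Dual:  minimize  4*u1 + 12*u2 + 18*u3
    #        subject to   u1         + 3*u3 >= 3
    #                            2*u2 + 2*u3 >= 5
    #                       u1, u2, u3 >= 0
    # Hand-computed dual optimum-the marginal values ("shadow prices")
    # of the three resources:
    u1, u2, u3 = 0.0, 1.5, 1.0
    print("dual objective =", 4*u1 + 12*u2 + 18*u3)   # 36.0, equal to the primal

The equality of the two objectives (36.0) is the duality relationship
the book explains: the marginal values of the three resources price
out the optimal production plan exactly.
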
**********************************************************************

@book{Fukuyama,
    address = "New York, New York",
    author = "Francis Fukuyama",
    publisher = "Free Press",
    title = "The End of History and the Last Man",
    year = 1992}
@book{Hall:STAOSM,
    address = "Chicago, Illinois",
    author = "Mark Hall and John Barry",
    publisher = "Contemporary",
    title = "Sunburst: The Ascent of Sun Microsystems",
    year = 1990}
@book{Wallace,
    address = "New York, New York",
    author = "James Wallace and Jim Erickson",
    publisher = "John Wiley & Sons",
    title = "Hard Drive",
    year = 1992}
@book{Watson,
    address = "New York, New York",
    author = "Thomas J. Watson Jr.",
    publisher = "Bantam Books",
    title = "Father Son & Co.",
    year = 1990}
@book{Sloan,
    address = "New York, New York",
    author = "Alfred P. Sloan",
    publisher = "Doubleday",
    title = "My Years with General Motors",
    year = 1963}
@book{Davidow,
    address = "New York, New York",
    author = "William H. Davidow and Michael S. Malone",
    publisher = "Harper Business",
    title = "The Virtual Corporation",
    year = 1992}
@book{Hammer,
    address = "New York, New York",
    author = "Michael Hammer and James Champy",
    publisher = "Harper Business",
    title = "Reengineering the Corporation",
    year = 1993}
@book{Quin,
    address = "New York, New York",
    author = "James Brian Quin",
    publisher = "Free Press",
    title = "Intelligent Enterprise",
    year = 1992}
@book{Rodgers,
    address = "New York, New York",
    author = "T. J. Rodgers and William Taylor and Rick Foreman",
    publisher = "Currency Doubleday",
    title = "No Excuses Management",
    year = 1993}
@book{Bostrom,
    address = "New York, New York",
    editor = "Robert P. Bostrom, Richard T. Watson and Susan T. Kinney",
    publisher = "Van Nostrand Reinhold",
    title = "Computer Augmented Teamwork",
    year = 1992}
@book{Smith,
    address = "New York, New York",
    author = "Preston G. Smith and Donald G. Reinertsen",
    publisher = "Van Nostrand Reinhold",
    title = "Developing Products in Half the Time",
    year = 1991}
@book{Greif,
    address = "San Mateo, California",
    editor = "Irene Greif",
    publisher = "Morgan Kaufmann Publishers",
    title = "Computer-Supported Cooperative Work: A Book of Readings",
    year = 1988}
@book{Zuboff,
    address = "New York, New York",
    author = "Shoshana Zuboff",
    publisher = "Basic Books",
    title = "In the Age of the Smart Machine",
    year = 1988}
@book{Galegher,
    address = "Hillsdale, New Jersey",
    editor = "Jolene Galegher, Robert E. Kraut and Carmen Egido",
    publisher = "Lawrence Erlbaum Associates",
    title = "Intellectual Teamwork: Social and Technological Foundations of Cooperative Work",
    year = 1990}
@book{Savage,
    address = "Bedford, Massachusetts",
    author = "Charles M. Savage",
    publisher = "Digital Press",
    title = "5th Generation Management: Integrating Enterprises Through Human Networking",
    year = 1990}
@book{Grenier,
    address = "Bedford, Massachusetts",
    author = "Ray Grenier and George Metes",
    publisher = "Digital Press",
    title = "Enterprise Networking: Working Together Apart",
    year = 1992}
@book{Mahfood,
    address = "Chicago, Illinois",
    author = "Phillip E. Mahfood",
    publisher = "Probus Publishing Co.",
    title = "Home Work",
    year = 1992}
@book{Boone,
    address = "Rocklin, California",
    author = "Mary E. Boone",
    publisher = "Prima Publishing",
    title = "Leadership and the Computer",
    year = 1991}
@book{Sproull,
    address = "Cambridge, Massachusetts",
    author = "Lee Sproull and Sara Kiesler",
    publisher = "MIT Press",
    title = "Connections",
    year = 1991}
@book{Morton,
    address = "New York, New York",
    editor = "Michael S. Scott Morton",
    publisher = "Oxford University Press",
    title = "The Corporation of the 1990's",
    year = 1991}
@book{Tapscott,
    address = "New York, New York",
    author = "Don Tapscott and Art Caston",
    publisher = "McGraw-Hill",
    title = "Paradigm Shift",
    year = 1993}
@book{Lipnack,
    address = "Essex Junction, Vermont",
    author = "Jessica Lipnack and Jeffrey Stamps",
    publisher = "Oliver Publications",
    title = "The TeamNet Factor",
    year = 1993}
@book{Champine,
    address = "Bedford, Massachusetts",
    author = "George A. Champine",
    publisher = "Digital Press",
    title = "A Model for Distributed Campus Computing",
    year = 1991}
@book{Nyce,
    address = "New York, New York",
    editor = "James M. Nyce and Paul Kahn",
    publisher = "Academic Press",

Copyright © 1993 John Conover, john@email.johncon.com. All Rights Reserved.