Air University Review, January-February 1967
Major William M. Henderson
The word “analysis” in recent years has become a status symbol in the services. A commander feels naked without a special staff group to do analysis. No self-respecting staff officer would write more than a paragraph without using the word at least twice—especially when replying to a proposal from a subordinate organization—“. . .a thorough analysis of your proposal reveals that it is neither practical nor feasible. . .”
Every year hundreds of officers in the professional military schools write theses that use our favorite word in the title—everything from “An Analysis of Civil-Military Relations in Defense Decisions” to “An Analysis of the United States Army’s Enlisted Assignment System.” Yet how many of them—and how many of you—could adequately define what is meant by “analysis”? Must this word forever suggest an elusive, ephemeral concept? I think not.
And yet, if you have done any reading lately in the literature of analysis—management science, decision theory, operations research, systems analysis, etc.—you may have concluded, as I have, that like most specialists the analysts have difficulty communicating with laymen. Of course some of the recent articles in the Air University Review on this subject have been notable exceptions.
The purpose of this article is to propose a structure for commanders and staff officers to use to evaluate the analysis capability available to them. The mystique which surrounds the analysis function is born of the multitude of charts, graphs, formulas, and computer models that seem to equate to “scientific” analysis. These complexities conceal more than they reveal about the quality of an analysis. For analysis—I almost said “in the final analysis”! —is nothing more than thinking very carefully about something. The commander or staff officer is going to have the greatest success using the analysis capability available to him if he recognizes it as an extension of, not a substitute for, his own thinking.
For all the controversy it has aroused in recent years, analysis is as innocuous as motherhood. If we rule out clairvoyance and slavish adherence to the Holy Writ of published directives, the only process left for arriving at decisions is—you guessed it—analysis. For analysis includes everything from the simplest application of common sense— “Should I take my raincoat today?”—to the most complex mathematical formulation imaginable. Indeed, it is really quite ridiculous to argue for or against analysis. The only argument—and it is an essential one—is over the proper degree of precision. One of the toughest decisions for the conscientious manager is determining an appropriate degree of precision for each analysis. It is ridiculous to contemplate using a complex mathematical computer model of rain probabilities based on weather statistics to make the raincoat decision. But it would be just as ludicrous to contemplate deciding whether or not to produce the B-70 bomber by merely thinking about it for a few minutes.
Finding the appropriate depth in this bottomless sea of analysis is probably the most critical function of the manager-commander. As captain of the good ship Analysis, he should not abdicate this function to his crew of analysts. Rather, the decision should be a joint one. The captain can intelligently use the talents of his crew and evaluate how well they perform their tasks without being able to repair an engine or operate a sextant. Similarly, a manager can intelligently use the talents of his analysts without being proficient in the techniques of linear programming, multiple regression equations, or Markov chains. Naturally, the more he knows about these techniques, particularly their limitations, the better. What he must have, however, to evaluate the product of analysis properly is a clear picture of what characterizes good analysis.
For what it is worth, here is a layman’s guide to evaluating analysis. First, there seem to be four characteristics of any good analysis of a complex problem: (1) a systems approach, (2) use of an interdisciplinary group, (3) the scientific method, and (4) the explicit treatment of uncertainty.
the systems approach
In recent months interest has been widespread in attempts to apply the systems approach to analysis, evolved by the aerospace industry and the military, to social problems at all levels of government. Problems such as urban transportation, medical care, waste management, education, water pollution, and government information systems are being scrutinized by the systems analysis teams of the aerospace industry. A recent article in Aviation Week pointed out that one of the primary problems in this effort was the tremendous communications gap between engineers and state administrators: “Contributing to this gap is the lack of any agreed upon nomenclature and definitions for the process of systems engineering and its associated functions of systems analysis or operations research.” Let me try to bridge this communications gap by offering a definition.
What is a system? Try this definition on any “system” you know: A system is a collection of elements defined for a specific purpose. That should fit your circulatory system, the airlift system discussed below, or any other “system” you care to name.
But why a “systems” approach? The good analyst recognizes the necessity of carefully defining the set of things he is interested in and their interrelationships. When he draws a box around his “system,” its parameters are determined by his purpose. We often speak of “a system” as if it had an entity all its own. This is a mistake. Every system is a conceptual creation of man’s mind. For example, if I mention the Pacific Command intratheater airlift system, it undoubtedly conjures up some mental picture of what that system includes. If my purpose in defining that system, however, were to determine its maximum cargo-carrying capacity, I might include the aerial port facilities, crew manning, aircraft maintenance, aircraft characteristics, etc. If my purpose were to determine the optimum method of assigning missions, the definition of my “system” would be quite different: now the important elements would be the operations control centers, communications, command relationships, etc.
The systems approach suggests one other important concept: every system is a subsystem. This sounds like double talk, but it is not. My circulatory system can be a system for a given purpose, yet it is a subsystem of my body, which is a subsystem of the set of all Air Force officers, etc., ad infinitum.
A good analysis, then, starts with a careful definition of the system under consideration. To be manageable, most systems must be defined with a limited scope. A proper question for the manager to ask when evaluating an analysis is, “Have any critical elements been omitted?”
the interdisciplinary group
The old saw that two heads are better than one—unless they are on the same person—certainly holds true in analysis of a complex problem. In the Department of Defense today it is not uncommon to see a study group composed of a political scientist, an economist, a mathematician, a sociologist, and several military officers of varying backgrounds. Why is such a diverse group desirable? Doesn’t this diversity make consensus more difficult? Let me take these questions one at a time.
First, to consider adequately all aspects of any broad question of national security, one needs a wide variety of knowledge. No single discipline, and certainly no single individual, can possibly have the necessary breadth and depth of expert knowledge in all these fields. But there is another even more important reason why almost any analysis—even those at the “working” level—will benefit from a diverse group: the difference in viewpoint. A military officer with a background in operations sees a problem quite differently from one with a background in personnel, or supply, or transportation. Each member of the group should realize that his viewpoint on the total problem, not just his specialty, is desired.
But what about achieving consensus among such a diverse group? That is a natural concern. In any bureaucracy, most groups are composed of individuals with vested interests and prejudices. The product of such groups is usually less than the sum of the parts—a watered-down compromise at the lowest common denominator of agreement. No wonder that the average military officer’s reaction to convening a group for any reason is “Boy, that’s all I need!”
The analytical group should be poles apart in mental attitude from this “committee action” approach. Conflict of ideas should be encouraged; it is the yeast that makes the group product rise above the sum of the parts. The manager must encourage, particularly in the group leader, an attitude that creates this stimulating atmosphere. At the same time the manager, the group leader, and the participants must be conditioned to expect considerable groping and several false starts. These are normal at the beginning of any analysis of a complex problem.
the scientific method
The term “scientific method” has become a synonym for an investigative technique involving painstaking measurement and experimentation. This is unfortunate. No scientist would claim that there is any standardized method by which he arrives at answers, but all would agree that there is a manner of thinking about a problem which separates scientific from nonscientific methods. What characterizes this way of thinking?
First, it is inquisitive. Nothing is taken for granted. All assumptions are carefully scrutinized. No sacred cow is accepted as unquestionably right. What most men commonly accept as immutable “truths” the scientist recognizes as human theories. As such, they are subject to the limitations of human ability to perceive, interpret, and report what we experience. Therefore they are always subject to modification.
When Einstein began his investigations into the nature of matter and energy, he undoubtedly did not set out to disprove Newtonian physics; yet his refusal to take the accepted “truths” for granted led him to modify laws that had stood unquestioned for two centuries. Rudolf Flesch, in his very useful book How to Write, Speak and Think More Effectively, put it this way:
The scientist lives in a world where truth is unattainable, but where it is always possible to find errors in the long-settled or the obvious. You want to know whether some theory is really scientific? Try this simple test. If the thing is shot through with perhapses and maybes and hemming and hawing, it’s probably science; if it’s supposed to be the final answer, it is not.
Second, the scientific way of thinking is objective. The scientist respects only investigation that is bent on discovering true cause and effect. He has little patience for the specious joining of effects to causes that is practiced by the bigots, the fire-breathing zealots, and the advocates of whatever stripe.
Complete objectivity among military officers or any other group in our society is rather rare. The heroic image, the legendary figure, is the one who fights valiantly for his beliefs, come what may—the Billy Mitchells, the Curt LeMays. Indeed, the rewards for complete objectivity are not great. An Air Force general who advocated the Navy’s Polaris weapon system during the late 1950’s would have been putting his neck in the noose. How many Navy admirals championed the B-36 a decade earlier? Possibly this explains the impatience sometimes shown by military officers with the scientist who refuses to make categorical statements, to “take a stand” and defend it. The scientist has long been disciplined to distrust the doctrinaire and dogmatic answer.
None of this, however, suggests that the scientist and the military officer are in opposing camps. Indeed, most military officers have had considerable scientific training. The intent here is merely to suggest that we should expect to have difficulty in finding support in scientific analysis for some of our views that are born at least partially of such homely virtues as patriotism, pride in our service, and belief in air power.
Finally, the scientific method is reproducible. Development of a new scientific theory that stands the test of time is a painful process. Every scientist knows that he must ultimately expose his theory to the scientific community in some form. Therefore, he takes great pains to record his process of investigation, his assumptions, his measurements.
The measurement process, in this age of mechanized information, can easily be overdone. Several planeloads of paperwork were required to support the recent C-5A procurement decision. As with a college thesis, the quality of an analysis is not proportional to its volume. Indeed, the multitude of charts, graphs, and statistics often masks the really key elements of the analysis. Unless the decision-maker can reproduce the essential elements of the analysis, he cannot intelligently evaluate it.
The scientific method, then, is a way of thinking that has three characteristics: it is inquisitive, objective, and reproducible. But how scientific can we be in a business that involves as many unmeasurable and imponderable variables as national defense? Obviously, there are limits to the precision we can achieve.
the explicit treatment of uncertainty
Most writers in this field seem to agree that the quality of an analysis is pretty well determined by the way in which it treats uncertainty. All statements about the future and most statements about the past involve some uncertainty. A great mathematician, C. J. Keyser, once put it this way: “Absolute certainty is a privilege of uneducated minds—and fanatics!”
A great deal of study has been devoted to the treatment of uncertainty since the turn of the century. The highly developed mathematical disciplines of probability theory and statistics are devoted to reducing as much uncertainty as possible to a calculable risk. The responsible analyst uses this body of mathematical knowledge to quantify the degree of confidence he has in his statements about the future. For example, let’s say the accident rate of the F-4C in TAC has been 1.92 accidents per 100,000 flying hours. The mathematician may be able to say, by analyzing the data, that he is 95 percent confident that the accident rate this year will be between 1.53 and 2.21. If the accident rate turns out to be 2.32 at the end of the year, the decision-maker should seek out a cause other than pure chance. Here is where some knowledge of mathematics is helpful. The utility of statistical data and probability estimates is great, but they must be used and evaluated with a knowledge of the limits of their precision and the dangers of their misuse.
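To make the statistician's interval concrete, here is a minimal sketch of one standard way such an interval can be computed—the exact (Garwood) interval for a Poisson rate. The accident count and flying-hour total below are invented to match the 1.92 rate above; the article gives neither its underlying data nor its method, so the interval produced here will not reproduce the 1.53-to-2.21 figures.

```python
# A minimal sketch, assuming invented data: an exact 95% confidence
# interval for an accident rate modeled as a Poisson process.
from scipy.stats import chi2

def poisson_rate_ci(accidents, hours, per=100_000, conf=0.95):
    """Exact (Garwood) confidence interval for a Poisson accident rate."""
    alpha = 1 - conf
    lower = chi2.ppf(alpha / 2, 2 * accidents) / 2 if accidents > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (accidents + 1)) / 2
    scale = per / hours
    return lower * scale, upper * scale

# Suppose 24 accidents in 1,250,000 flying hours (rate = 1.92 per
# 100,000 hours, matching the article's example; both figures assumed).
low, high = poisson_rate_ci(accidents=24, hours=1_250_000)
print(f"95% confidence interval: {low:.2f} to {high:.2f} per 100,000 hours")
```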
For the evaluation of analysis, the key question is, Has the real uncertainty been suppressed? In quantifying and evaluating that part of the uncertainty which can be calculated, has the analyst failed to point out other uncertainties that might change the entire picture? A good analysis will point out these remaining nonquantifiable uncertainties so that a responsible judgment can be made about them.
The really good analysis will have these four characteristics: a systems approach, an interdisciplinary group, use of the scientific method, and the explicit treatment of uncertainty. Maybe if you jot these four characteristics down on your mental blotter, they will provide a helpful check list to evaluate the analysis capability available to you. Better yet, they might help you to examine the way you perform analysis yourself every day.
Completed staff work—Beware!
It is almost an axiom that analysis will be useful for decision-making to the degree to which the sponsor of the analysis has been able to communicate his intent to his analysts. If the problem is worthy of analysis in depth, it is highly unlikely that the sponsor will have anything more than a vague outline of the nature of the problem at the outset. This presents no major problem as long as both the sponsor and his analysts recognize the need for continual communication between them during all phases of the analysis. This “feedback” is particularly vital during the period of time when the parameters of the study are being defined. What are the objectives? What are the critical variables? What are appropriate assumptions? What is an appropriate criterion for choosing among the alternatives?
Unfortunately, this type of feedback is somewhat foreign to many experienced military officers. Most of us have been schooled in an environment in which an officer’s effectiveness is roughly equated with his ability to work without detailed supervision. “Completed staff work” is held up as the ideal; only as a last resort is the superior to be consulted prior to submission of a completed action paper.
An enormous amount of resources can be wasted by this orientation to analysis. Analysis of the type we have been discussing is appropriate for any problem, but it is vital for questions of such complexity that no real “solution” is possible. The intent of analysis should be to explore the range of alternatives open to the decision-maker. Only by a process of successive approximations can a course of action be arrived at that comes close to accomplishing the purpose. Objectives, assumptions, and criteria in such studies are not self-evident. They require a good deal of preliminary study.
Despite all the hullabaloo about decision-making these days, decisions are easy to make if quality is not a primary concern. Where we sometimes get in trouble is in equating decisiveness with courage. Sometimes it takes a great deal more courage to insist that a sound decision cannot be made until adequate time and effort have been devoted to analysis.
The final portion of this article will suggest another way of looking at the analysis process that is useful in evaluating analysis. Some of the jargon of the professional analyst will be dissected, and plain English translations will be offered.
We have suggested that analysis is not the special province of a small group of eggheads. Indeed, every military officer performs “analysis” every day. Most of us lack access to professional analysts to help us structure our thought processes about complex problems, but structure is the key to understanding any problem.
“Professional analysts” from Plato and Aristotle to Hitch and McKean have suggested essentially the same structure for disciplining our approach to thinking about complex problems. I cannot hope to improve on their basic ideas, so I shall boldly plagiarize them.
My intent is to take this basic structure for analysis and show that it is not something vague and mystical but that it is appropriate, indeed necessary, as an everyday tool for all military officers. My basic thesis is that most of us are too willing to dismiss analysis as the concern solely of the military officer or professional civilian with the word “analyst” in his title. We are not critical enough of their work. This is dangerous. For by refusing to track through his study with him to the extent that we can be intelligently critical of it, we delegate to the analyst more power than he can or should wield in the affairs of men. The analyst can and should be expected to help us by providing structure for our thinking. But he cannot and should not be allowed to do our thinking for us.
One of the most serious responsibilities we have as professional military men is to create and maintain within the Air Force the capability for penetrating, objective analysis of possible future courses of action. Whatever energy we have left could not be better spent than in carefully evaluating the product of this analysis capability and putting the results of the analysis effort into practice. By recalling the characteristics of good analysis, the military executive should be able to tell when he encounters a quality product. Now let us look closer at the elements of the analysis itself.
Most professional analysts, in particular those with a RAND Corporation background such as Drs. Hitch, Enthoven, Quade, et al., seem to agree on five elements of analysis: (1) an objective or several objectives, (2) alternatives, (3) costs, (4) a model or several models, and (5) a criterion or several criteria. None of these writers suggests that any particular analysis will be structured in this format, with each of its elements neatly labeled. Quite the contrary. Since analysis is nothing more nor less than thinking very carefully about something, the results of analysis may be as varied as the process of human thought. The knowledge that has a bearing on military problems comes from a wide variety of sources—educational institutions, nonprofit research organizations, individual writings, civilian contractors, and various government institutions. No standardization in format of this knowledge can reasonably be expected. However, an understanding of the five elements of analysis will help you to restructure any analysis and ask the right questions. Let us look at each of the five elements of analysis individually.
In considering any problem, a necessary first step is to carefully define an objective, a goal, a desired level of accomplishment. For many military problems, this is extremely difficult. For example, let us say that a hypothetical study was made to determine whether or not to build mobile, land-based intercontinental ballistic missiles. The objective initially established was to “provide the most effective missile system to deter thermonuclear war.” A noble aim, indeed. But how does the analyst measure how well mobile missiles deter war? Deterrence is a state of mind. And what’s worse, it is a state of the enemy’s mind, not ours.
In this situation, the analyst can only do what you do when you buy a new car. He chooses a way of measuring the effectiveness of any missile system which he hopes will come close to measuring its deterrent value. For example, he might choose as a measure of effectiveness the number of surviving megatons on missile launchers after the enemy’s first strike. (Admittedly this would be a long jump down the abstraction ladder from deterrence.)
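As a rough illustration of how such a measure of effectiveness reduces to simple arithmetic, the sketch below computes expected surviving megatons for a few hypothetical alternatives. Every number in it is invented for illustration; none comes from the article.

```python
# A hedged sketch of the "surviving megatons" measure of effectiveness.
# All force sizes, yields, and survival probabilities are invented.
alternatives = {
    "fixed-silo ICBM":      {"missiles": 200, "megatons_each": 1.0, "p_survive": 0.60},
    "mobile land ICBM":     {"missiles": 150, "megatons_each": 1.0, "p_survive": 0.85},
    "submarine-launched":   {"missiles": 96,  "megatons_each": 0.8, "p_survive": 0.95},
}

for name, a in alternatives.items():
    # Expected megatons still on launchers after the enemy's first strike.
    surviving = a["missiles"] * a["megatons_each"] * a["p_survive"]
    print(f"{name:22s} expected surviving megatons: {surviving:6.1f}")
```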
It is for the decision-maker to decide if there are qualitative considerations that must be weighed along with the quantitative method of measuring effectiveness chosen by the analyst.
What does this have to do with buying a new car? Actually, the thought process is quite similar. When you buy a car, it is probable that your objective—if you ever stopped to define it—would be something like “To provide a means of transportation that will satisfy me and my family.” After giving it some thought, you find it difficult to come up with any numerical index that adequately measures the degree of satisfaction. There are all sorts of statistics on horsepower, acceleration, displacement, wheelbase, leg room, and so on. But which of these features comes closest to describing satisfaction for you and your family? Only you can decide; and only the commander or staff officer can decide whether the analyst’s chosen way of measuring accomplishment of the objective was appropriate. A great deal of judgment is required for this choice, particularly for higher-level, more abstract studies.
But probably even more important than choosing a decent measure of effectiveness is defining the proper objective. Almost any problem of any complexity requires a process of successive approximation to define an adequate objective. Impatience on the part of the analyst or the sponsor of the analysis with this imprecise process of probing for an appropriate objective will invariably result in an inadequate product. The manager who insists on getting answers instead of asking questions is going to be continually dismayed by the low quality of work he gets out of his analysis staff.
It is practically axiomatic that there are always alternative ways of accomplishing an objective. The function of an analysis is to explore alternatives systematically. If the alternatives explored are found to be inadequate, the good analyst invents new ones.
In our mobile missile study, the alternatives might be Minuteman missiles (fixed location), Polaris submarine-launched missiles, Titan II missiles, mobile medium-range missiles, and so on.
The point is that alternatives need not be direct substitutes. They are merely various ways of accomplishing the objective. This stage, the invention of alternatives, requires the greatest amount of creativity in the study process. Free-wheeling creative thinking—brainstorming, if you will—often produces dramatic new concepts.
Every alternative for accomplishing the objective involves the expenditure of some resources—money, materials, time. In recent years the science of determining and estimating costs has become quite sophisticated. However, there are certain fundamental ideas which are generally accepted, and familiarity with these concepts will help the commander or staff officer to evaluate the costing function performed by the analysts.
Cost streams. The first of these concepts is the idea of “cost streams,” sometimes referred to as “total life cycle system cost.” Either term suggests the same thing. The costs to be used in comparing alternatives should not be confined to one arbitrary period of time—a fiscal year, five fiscal years, or whatever. Comparisons should be based on the costs over the entire life cycle of the alternative. And they should be based not merely on total costs but on the cost stream, the way the costs are distributed over the life cycle. For example, car A is a very expensive alternative to car B when only the initial cost is considered, but if the total lifetime of the car is considered and costs of maintenance, operation, and depreciation are added, the picture changes radically: car B may have to be replaced every six years while car A may last for fifteen.
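A minimal sketch of that cost-stream comparison, with invented car figures, might look like this:

```python
# Life-cycle cost streams for two hypothetical cars; all figures invented.
# Car A: high initial cost, low upkeep, 15-year life.
# Car B: low initial cost, high upkeep, replaced every 6 years.
HORIZON = 15  # years considered

def cost_stream(initial, annual_upkeep, life, horizon=HORIZON):
    """Yearly outlays over the horizon, repurchasing when the car wears out."""
    stream = []
    for year in range(horizon):
        outlay = annual_upkeep
        if year % life == 0:          # first purchase, then each replacement
            outlay += initial
        stream.append(outlay)
    return stream

car_a = cost_stream(initial=4000, annual_upkeep=200, life=15)
car_b = cost_stream(initial=2500, annual_upkeep=450, life=6)
print("Car A total over 15 years:", sum(car_a))   # 4000 + 15*200      = 7000
print("Car B total over 15 years:", sum(car_b))   # 3*2500 + 15*450    = 14250
```

Car A looks expensive on initial cost alone; over the whole life cycle the ranking reverses.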
Sunk costs. Another concept which may seem self-evident but which is often ignored is the idea of “sunk” costs. In comparing an existing system to a proposed one, it is inappropriate to include the capital expense of the existing system. These are sunk costs. For example, the costs of the B-52 and the Advanced Manned Strategic Aircraft (AMSA) cannot be meaningfully compared except by excluding those resources expended on the B-52 which can be used by the AMSA. For instance, the AMSA can undoubtedly use many of the same fixed facilities as the B-52—runways, hangars, shops, etc. These should not be charged to either system. Neither should the costs of research, development, test, engineering, or procurement of the B-52 be included in the B-52 system cost. These are sunk costs. In other words, they are spent whether or not we buy the AMSA.
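The same bookkeeping discipline can be sketched in a few lines. All the figures below are invented; the point is only which cost categories are counted in the comparison.

```python
# A hedged illustration of excluding sunk costs (invented figures, in millions).
b52 = {
    "r_and_d": 800,             # sunk: already spent, excluded
    "procurement": 1200,        # sunk: already spent, excluded
    "shared_facilities": 300,   # usable by either system: charged to neither
    "future_operating": 500,    # the only B-52 cost relevant to the choice
}
amsa = {
    "r_and_d": 900,             # still to be spent: counted
    "procurement": 1500,        # still to be spent: counted
    "future_operating": 350,
}

keep_b52 = b52["future_operating"]
buy_amsa = amsa["r_and_d"] + amsa["procurement"] + amsa["future_operating"]
print("Relevant cost of keeping the B-52:", keep_b52)
print("Relevant cost of buying the AMSA:", buy_amsa)
```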
Opportunity costs. The cost of choosing one alternative includes the benefits forgone by not choosing another. In defense spending, these opportunity costs are extremely difficult to compute with any accuracy because of the multitude of alternatives available for spending resources. All that can be done, usually, is to take account of opportunity costs in a gross way when making choices. For example, purchase of the OV-10 aircraft in a particular fiscal year may mean that development of an anti-ICBM system has to be delayed. Sometimes opportunity costs are a powerful argument for delaying a commitment until sufficient study of alternatives has shown that the opportunity costs are not excessive.
Cost effectiveness. Pure cost in terms of expenditure of resources is never a proper basis for making defense decisions. The cost must be related to the gain in effectiveness which those resources buy. This sometimes leads to a rather confusing paradox for the professional military man. The very effective development of our strategic retaliatory might in the 1950’s has brought us to the point where very large expenditures are required to achieve a small increase in retaliatory effectiveness. Consequently, new weapon systems in this area run into terrific headwinds in a national decision process based upon cost-effectiveness considerations. Usually there are other areas where the same resources will buy a much greater increase in effectiveness.
The science of determining costs and comparing the costs of alternatives has become highly developed in the last several years. But a great deal of managerial judgment is required to evaluate the relative cost effectiveness of various alternatives. As aids to the application of this judgment, models are extremely useful.
A model is a representation of reality. It is usually greatly simplified. It eliminates much of the complexity of reality. But if it includes the key factors in their proper relationship, it can be very useful. In fact, the more complex the model, the harder it is to keep it representative of the real world. Also, the usefulness of a model is diminished if the decision-maker cannot appreciate its design. The most ingenious and useful models are those which avoid the complex mathematical formulations of queuing theory, linear programming, and advanced calculus and instead merely show the essential relationships between key variables.
The word “model” suggests many different mental images. Probably all of them are correct. For a model can take on a number of different forms—a picture, a map, an organization chart, a graph, a series of mathematical formulas, a computer program, a PERT chart, and so on. In our mobile missile example, the model would probably be a series of equations representing the various alternatives. Ideally, an operational test would be used to determine how close the model came to predicting reality. In some cases, such as a drawing-board missile system, an operational test is impossible. Then the judgment of the decision-maker becomes critical, for he must determine through the use of his best judgment how close the model comes to predicting what will happen in the real world. The only true measure of a model is its ability to predict.
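A model in this sense can be as modest as a one-line formula. The sketch below is a hypothetical airlift-capacity model—deliberately oversimplified, with every number invented—whose worth would be judged solely by how well its prediction matched operational experience.

```python
# A toy model in the article's sense: only the essential relationship is kept.
def daily_sortie_capacity(aircraft, crews_per_aircraft, sorties_per_crew_day):
    """Hypothetical one-line airlift model: capacity limited by crews."""
    return aircraft * crews_per_aircraft * sorties_per_crew_day

predicted = daily_sortie_capacity(aircraft=40,
                                  crews_per_aircraft=1.5,
                                  sorties_per_crew_day=2)
observed = 105  # an invented operational result, for comparison
print(f"model predicts {predicted:.0f} sorties per day; operations show {observed}")
```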
The model-builder, quite humanly, will tend to be too uncritical of the output of his model. The decision-maker must encourage the analyst continually to check his model against the real world. If there is a conflict, the model must give way.
Let’s take a very simple but real-life example. One very complex war game being used as a model of a hypothetical limited-war situation contained an element (called a “payoff function” by the analysts) which purported to compare the effectiveness of the Red and Blue forces in terms of close-support sorties during the play of the war game. Thus the relative effectiveness of the various air forces available to the Blue commander could be measured, the analyst claimed, by subtracting the close-support sorties available to Red from the close-support sorties available to Blue during the war. Can you spot the fatal flaw in this logic?
It is simply this. Effectiveness of close-air-support sorties depends on their being available in sufficient numbers at the right time. Although the numbers of close-support sorties flown during a war may average out to some figure like 20 or 30 a day, the normal requirement is probably for several hundred a day for a few days at a time to support major ground operations. Without this consideration of the timely availability of large numbers of sorties, the “payoff function” would be worthless.
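The flaw is easy to demonstrate with invented numbers. In the sketch below, two hypothetical forces fly identical sortie totals over thirty days, so the averaged payoff function rates them equal; only one of them, however, can surge to the level a major ground operation demands.

```python
# Why the averaged payoff function fails: all numbers invented.
force_steady = [25] * 30                # 25 close-support sorties every day
force_surge = [10] * 27 + [160] * 3     # quiet weeks, then a three-day surge

REQUIRED_SURGE = 100  # sorties/day assumed necessary for a major ground operation

for name, daily in [("steady force", force_steady),
                    ("surge-capable force", force_surge)]:
    print(f"{name:20s} total sorties: {sum(daily):4d}   "
          f"meets surge demand: {max(daily) >= REQUIRED_SURGE}")
# Both totals are 750, so a payoff function built on totals or averages
# cannot tell the forces apart; only one delivers sorties when they matter.
```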
The process of building a model requires a great deal of patience. Like the analysis process in general, it requires successive approximations. The analyst may go several times around the circle of defining the key elements, deciding what is trivial, what is significant, what can be measured and what can not. This process often benefits greatly from feedback from the decision-maker or a subject-matter expert who is capable of picking out the fatal flaw in the logic of the model.
Most military officers are quite familiar with the word “criteria.” Virtually every attempt to codify the problem-solving process includes the use of criteria to aid in decision-making. Just to be sure that we are on firm ground, however, let me offer a definition: A criterion is a rule for choosing among alternatives.
Criteria may appear in many forms, but they are most useful in an analysis when they can be stated as a rule. The two most common forms of criteria give preference to either the alternative which maximizes accomplishment of the objective within a fixed budget or the alternative which obtains a specified objective or goal at least cost. Both these forms of criteria avoid the most common pitfalls in stating criteria. Without enumerating all the pitfalls, I will merely suggest that you try to reconstruct the criteria of any analysis in one of these two forms. However, if neither a fixed budget (expenditure of resources including men, money, materials, and time) nor a desired level of effectiveness has been established, the criterion is almost certainly worthless.
Let’s look again at your choice of a new car. Suppose you settled on acceleration from 0 to 60 mph as your measure of effectiveness. In order to complete your analysis, you would have to formulate a criterion based on this measurement. For example, a suitable criterion might be “to prefer the car which achieves the best acceleration from 0 to 60 mph and costs less than $4000.” We might say that this criterion is closed on one end.
Useless criteria are those which are closed on both ends (“to prefer the car which accelerates from 0 to 60 mph in less than 9 seconds and costs less than $4000”) or those which are completely open-ended (“to prefer the car which achieves the best acceleration at the least cost”). The first form is useless because it does not necessarily result in a clear-cut choice: there may be ten cars which satisfy this rule. The second form is useless because it requires an infinite search: the “best acceleration” is theoretically an infinitely short period of time, and the least cost is zero. Such criteria will not help in making a choice.
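The useful form—closed on one end—translates directly into a selection rule. Here is a sketch with invented cars, times, and prices:

```python
# The article's "closed on one end" criterion as a selection rule.
# All cars, acceleration times, and prices below are invented.
cars = [
    {"name": "car A", "zero_to_sixty_s": 8.5,  "price": 4500},
    {"name": "car B", "zero_to_sixty_s": 9.2,  "price": 3800},
    {"name": "car C", "zero_to_sixty_s": 10.5, "price": 2900},
]

BUDGET = 4000  # the closed (fixed) end of the criterion

# Rule: among the cars under the fixed budget, prefer the best acceleration.
affordable = [c for c in cars if c["price"] < BUDGET]
choice = min(affordable, key=lambda c: c["zero_to_sixty_s"])
print("Preferred car:", choice["name"])   # car B: fastest of those under $4000
```

Note that the rule always yields exactly one choice; a criterion closed on both ends or open on both ends, as described above, would not.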
This discussion is designed to point up two truisms about selection of criteria. First, the formulation of an adequate criterion depends heavily on the value system of the individual. Second, most criteria based on nontrivial value systems contain elements that are not easily quantifiable.
A more realistic criterion for the choice of a new car would be “to prefer the car which has the best combination of luxury, ride, and performance and costs less than $4000.” Defining adequate quantitative measures of luxury, ride, and performance is difficult enough, but when we specify the “best” combination of these features we are obviously beyond the realm of numerical measurements.
Just this type of value judgment is required in establishing criteria for virtually all decisions in the Department of Defense. Unfortunately, many decision-makers are not expert in explicitly defining their criteria. Many times they themselves are not aware of the value systems they are using.
The decision-maker in today’s complex military environment is confronted with a bewildering maze of graphs, charts, and mathematical mumbo jumbo. His most important function, however, is to bring critical evaluation to every analysis. He need not understand the mathematical mechanics used to derive the values, though it helps if he does. He cannot function effectively, however, without a structure for the application of his judgment. Every analysis of a complex problem, however poorly organized its report may be, should contain the five elements we have considered: objectives, alternatives, cost considerations, a model, and criteria. If these are not clearly identifiable in the report of the analysis, they provide the basis for questioning the analyst: Do we really have a problem? If so, have we properly identified it?
What were the objectives? What measure of effectiveness was used? How were costs computed? What criteria were used to choose among the alternatives?
Three simple guidelines will assist military commanders and staff officers in their trek through the analysis jungle:
· Know what characterizes good analysis.
· If you are going to have to make the decision, apply your judgment during the analysis process, not just at decision-making time.
· Use your knowledge of the structure of analysis in general to evaluate each particular analysis.
The conclusions and opinions expressed in this document are those of the author cultivated in the freedom of expression, academic environment of Air University.