A History of Decision Tables

Presenting decision procedures in tabular form goes back at least to ancient Babylon, where the rules for multiplying cuneiform numerals were baked into clay tablets for students. Tables are a quick and easy way for a human being to read, understand, and execute a complex procedure.

Tabular forms for computer programming date back to the late 1950s, when General Electric, the Sutherland Corporation, and the United States Air Force worked on a complex file maintenance project. Attempts at using flowcharts and traditional narratives had failed to define the problem after more than six labor-years of effort, which was typical of most large projects at that time. Then, in 1958, four analysts using decision tables successfully defined the problem in less than four weeks. Once the right tool was used, the problem was solved almost immediately.

The Sutherland Corporation also used these tables for program specifications, leaving it to the programmers to translate the tables into code by hand. General Electric, realizing that flowcharts and narratives were completely inadequate for complex logic, then automated the method to generate source code directly from the tables.

Automating the tables also formalized the method and made programs available to handle them. The use of tables for code generation became popular in the early 1960s, and most of the research on optimal code generation algorithms was done then. The name decision tables quickly became the standard term for the method. The Canadian Standards Association issued a formal standard for decision tables in 1970 (Decision Tables, CSA Standard Z243.1-1970), which can be obtained in the United States through the American National Standards Institute (ANSI).

Decision tables did not reach widespread use for several reasons. The early decision table software was written for mainframe computers and required a large amount of main storage. Programmers and analysts simply could not get enough computer time to do development work. The early decision table programs were written in either mainframe assembly languages or COBOL, so they could not easily be rewritten to run on early minicomputers (or later, on the first personal computers), even if those machines had had the required storage available at the time. The user interface was usually punch cards, keyed from printed forms and submitted as a batch job. The lack of interactive video terminals was also a major reason that there were no mainframe word processors or spreadsheets in that era.

As you can imagine, it was very costly to run these early mainframe decision table packages, in terms of both programmer time and computer time. Computing a decision table by hand is difficult work, much like trying to keep corporate accounts with a quill pen and a ledger. The result was that most programmers never worked with decision tables or decision table software unless they were developing mission-critical systems for certain government agencies or the military.

But once again times have changed, and decision tables have now evolved into logic processing, just as manual typing evolved into word processing. The changes in the environment that made this possible came in the form of better hardware, better software, and better programming practices. Minicomputers, workstations, and personal computers became big enough, fast enough, and cheap enough to support applications that had formerly been run only on mainframes.

Interactive video terminals with good user interfaces replaced punch cards for input. Software for personal computers also became more powerful, so it became reasonable to develop serious applications on small machines. The typical small-computer user changed from hobbyist to serious developer. Finally, structured analysis and design became popular, and developers began to focus on the underlying methodologies for creating quality software. These methods are all driven by finding correct program logic, with the assertion that it is possible to formally prove the correctness of a program.