Kuhnel's Law reads as follows:
Kuhnel's Law of Estimating Government Computer Projects. Make your best guess and multiply it by two. Then double it.
Back in the late 1960s, I became a computer programmer, working for the California Department of Transportation (then called the Department of Public Works, Division of Highways). I wrote programs in COBOL, FORTRAN, and RPG, mostly for financial systems. These were highly complex systems, used mainly to manage the huge sums of money that were being spent on the construction of the Interstate Highway Program.
I noted early on that nearly all estimates for programming or systems analysis were far too low. Projects nearly always took much longer to complete than originally estimated, and often far longer. Eventually it became apparent that the average "overrun" was about 4 times the original estimate, in both time and cost. To describe this phenomenon I coined what I called "Kuhnel's Law", which is stated above. While I have no reason to believe this was unique to government, that is where I toiled, so that is what the law referred to.
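The arithmetic behind the law is simple enough to write down. A minimal sketch in Python; the function name and the 6-week best guess are my own inventions, purely for illustration:

```python
def kuhnel_estimate(best_guess):
    """Kuhnel's Law: make your best guess and multiply it by two.
    Then double it."""
    return best_guess * 2 * 2

# A hypothetical 6-week best guess becomes a 24-week plan --
# the roughly 4x overrun the essay reports for real projects.
print(kuhnel_estimate(6))  # -> 24
```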
Of course the question that always comes up is why this is true. What causes projects, particularly computer projects, to exceed their estimates so badly? The answer is steeped in complexity, but it can be broken down into two fundamental components. The first is a failure to estimate all the work; the second is a tendency to think of work as a set of parallel processes, rather than as a network, or series, of dependent processes.
The first problem is caused by a number of factors. There is a tendency to focus only on the big or predominant tasks. The clerical work, oversight, testing, re-testing, documentation, meetings, and the multitude of other work efforts required to produce a finished product are simply left out of the estimate. They often add up to far more than the big tasks. In addition, we tend to leave out the extra effort due to errors in planning, improper implementation, or "Acts of God". In the real world, mistakes are made in design, a board is cut too short, or something happens that we did not plan on, such as a fire, earthquake, or flood. Rarely is this allowed for in the original estimate.
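To see how the overlooked work can dominate, here is a toy breakdown. The task names come from the list above, but every figure is invented for illustration, not drawn from any real project:

```python
# Big tasks that typically make it into the estimate (weeks; invented figures).
big_tasks = {"design": 3, "coding": 8}

# Work that tends to be left out of the estimate entirely.
omitted = {"testing": 4, "re-testing": 3, "documentation": 2,
           "meetings": 2, "clerical work": 1, "oversight": 1}

estimated = sum(big_tasks.values())         # 11 weeks planned
actual = estimated + sum(omitted.values())  # 24 weeks in reality
print(estimated, actual)  # -> 11 24
```

Even with these modest made-up numbers, the omitted work exceeds the big tasks, and the schedule more than doubles.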
The second problem seems to relate to a human tendency to think we can do everything at once. This, of course, is not true when the same person is required to do all the work, or when the tasks are dependent, i.e. you must complete task A before task B, and so on. In reality there are often limited resources available to work on a project, and complex projects are full of highly complex dependencies. In computer programming the classic dependency was between a "clean compile" and testing. Before a computer program could be tested, it had to be syntactically error-free. Until it was, a state called a "clean compile", you could not test the program. No exceptions. No matter how badly you wished to begin testing, you could not do so. When the program was finally syntactically correct and was first run against test data, errors of logic would usually be discovered. Corrections would be made, possibly introducing new errors in syntax, which, of course, prevented further testing until they too were corrected. It was no wonder that complex programs sometimes took seemingly forever to complete.
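The compile-and-test loop described above can be caricatured in a few lines. This is a toy model, not a measurement: I am assuming, arbitrarily, that each logic fix has a 50% chance of reintroducing a syntax error that blocks testing, and the function and parameter names are my own:

```python
import random

def runs_to_finish(logic_bugs, p_syntax=0.5, seed=7):
    """Toy model of the clean-compile/test dependency: testing is only
    possible after a clean compile, and each logic fix may put a new
    syntax error back in, blocking testing until it is fixed."""
    rng = random.Random(seed)
    runs = 0
    while logic_bugs > 0:
        runs += 1                    # one compile-and-run attempt
        if rng.random() < p_syntax:  # the last fix broke the syntax
            continue                 # no clean compile, so no testing
        logic_bugs -= 1              # a clean run exposes one logic bug
    return runs

# Typically well above the 5 runs a naive plan would assume.
print(runs_to_finish(5))
```

Each dependency like this turns what looks like one task on paper into an open-ended loop, which is exactly how serial work quietly swallows a schedule built on parallel assumptions.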
So, in short, Kuhnel's Law turned out to be amazingly accurate, and for good reason. That is the way of the world of computers.
------- Ron Kuhnel