Errol Morris’s latest essay is both a moving tribute to his late brother Noel (co-author of one of the earliest email programs) and a fascinating history of the transition of computing at MIT from batch processing to time-sharing. One remarkable footnote stands out:
Marvin Minsky, one of the early members of Project MAC and director of its AI group, provides an account of an early meeting about time-sharing at IBM. IBM was committed to batch processing. It was part of their business model.
In fact, we went to visit IBM about using a computer with multiple terminals. And the research director at IBM thought that was a really bad idea. We explained the idea, which is that each time somebody presses a key on a terminal it would interrupt the program that the computer was running and jump over to switch over to the program that was not running for this particular person. And if you had 10 people typing on these terminals at five or 10 characters a second that would mean the poor computer was being interrupted 100 times per second to switch programs. And this research director said, ‘Well why would you want to do that?’ We would say, ‘Well it takes six months to develop a program because you run a batch and then it doesn’t work. And you get the results back and you see it stopped at instruction 94. And you figure out why. And then you punch a new deck of cards and put it in and the next day you try again. Whereas with time-sharing you could correct it—you could change this instruction right now and try it again. And so in one day you could do 50 of these instead of 100 days.’ And he said, ‘Well that’s terrible. Why don’t people just think more carefully and write the program so they’re not full of bugs?’
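The arithmetic in Minsky's account is easy to check. As a minimal sketch (the function name is ours; the numbers come straight from the quote), each keystroke interrupts the running program once, so the interrupt rate is just terminals times typing speed:

```python
def interrupts_per_second(terminals: int, chars_per_second: int) -> int:
    """Each keystroke on any terminal interrupts the running program once,
    so the total interrupt rate is the product of the two figures."""
    return terminals * chars_per_second

# 10 people typing at 10 characters a second: the "poor computer" is
# interrupted 100 times per second, as Minsky says.
print(interrupts_per_second(10, 10))  # -> 100
```

A hundred context switches a second was trivial even for the hardware of the day, which is exactly why the IBM director's objection looks so misplaced in hindsight.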
As Upton Sinclair once remarked, ‘It is difficult to get a man to understand something, when his salary depends upon his not understanding it.’ What is astounding, though, is that this scarcity mentality—the idea that the solution to the ‘Software Crisis’ was to reduce the load on computers by having programmers act more like machines—somehow maintained currency for decades, when those whose salaries did not depend on it considered it risible at least as early as the mid-1960s. The SEI’s Watts S. Humphrey, himself a former IBM manager of the period, was still pushing the exact same line in his dreadful 1995 book A Discipline for Software Engineering (p. 238):
When you find engineers who regularly produce programs that run correctly the first time, ask them how they do it. You will find that they take pride in the quality of their products. They carefully review their programs before they first compile or test them. If you want a quality product, you must spend time to personally engineer it, review it, and rework it until you are satisfied with its quality. Only then is it ready for compiling and testing.
The principle that using tools is cheating is, of course, not a hallmark of any established engineering field. Any implication that those who do so are not ‘[taking] pride in the quality of their products’ is an insult to all practicing engineers. The true engineering approach would be to understand the limitations of any tools and to use them appropriately to help minimise the consumption of scarce resources.