% fortune -ae paul murphy

APL, COBOL, & Dijkstra

One of the issues that arose in the talkbacks to my blog on the Second IT Commandment (Thou Shalt Honor and Empower thy (Unix) Sysadmins) came from a throwaway comment of mine about the difficulty of teaching a COBOL programmer to use APL. That drew a response from Jorwell, citing Dijkstra as an authority against APL:

Dijkstra on APL: "APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums."

Now, just so you know what we're talking about: APL (A Programming Language) was developed by Kenneth Iverson (Ph.D., Mathematics) to implement a mathematical notation in a computing environment. His textbook, Algebra: an Algorithmic Treatment (Addison-Wesley, 1971), and the matching 1976 APL Press book, Calculus in a New Key by D. L. Orth, demonstrate that essentially all of elementary analysis (i.e. linear algebra and the calculus) can be expressed in APL, demonstrated in APL, thought about in APL, and advanced using APL notation.

That doesn't mean it's easy to understand - like Perl, it lends itself to throwaway code because it's often easier to reinvent something than to debug it.

This thing, for example, finds the unique elements in a vector and is marked as a throwaway because of what it doesn't do - examine its input:
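
The snippet itself isn't reproduced here, so what follows is a reconstruction of the classic one-line idiom rather than the original code; the variable name V and the sample data are mine, and origin-1 indexing is assumed:

      ⍝ keep each element whose first occurrence is at its own position
      V ← 3 1 4 1 5 9 2 6 5 3 5
      ((⍳⍴V)=V⍳V)/V
3 1 4 5 9 2 6

Note what it doesn't do: there is no check that V is a vector at all, which is exactly why it counts as throwaway code.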

Dijkstra was a great man, a leader in the field, but he had a tendency to shoot from the lip, and this was one of his more unfortunate slips along those lines. More to the point, there's reason to believe that he would have regretted the comment ten years later if he'd learnt something about APL in the interim.

The APL comment comes from a note he tossed off in June of 1975 in which he excoriates virtually every programming language then used. Here's pretty much the whole thing, direct from the University of Texas Dijkstra repository:

Sometimes we discover unpleasant truths. Whenever we do so, we are in difficulties: suppressing them is scientifically dishonest, so we must tell them, but telling them, however, will fire back on us. If the truths are sufficiently impalatable, our audience is psychically incapable of accepting them and we will be written off as totally unrealistic, hopelessly idealistic, dangerously revolutionary, foolishly gullible or what have you. (Besides that, telling such truths is a sure way of making oneself unpopular in many circles, and, as such, it is an act that, in general, is not without personal risks. Vide Galileo Galilei.....)

Computing Science seems to suffer severely from this conflict. On the whole, it remains silent and tries to escape this conflict by shifting its attention. (For instance: with respect to COBOL you can really do only one of two things: fight the disease or pretend that it does not exist. Most Computer Science Departments have opted for the latter easy way out.) But, Brethren, I ask you: is this honest? Is not our prolonged silence fretting away Computing Science's intellectual integrity? Are we decent by remaining silent? If not, how do we speak up?

To give you some idea of the scope of the problem I have listed a number of such truths. (Nearly all computing scientists I know well will agree without hesitation to nearly all of them. Yet we allow the world to behave as if we did not know them....)

Programming is one of the most difficult branches of applied mathematics; the poorer mathematicians had better remain pure mathematicians.

The easiest machine applications are the technical/scientific computations.

The tools we use have a profound (and devious!) influence on our thinking habits, and, therefore, on our thinking abilities.

FORTRAN --"the infantile disorder"--, by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.

PL/I --"the fatal disease"-- belongs more to the problem set than to the solution set.

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence.

APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums.

The problems of business administration in general and data base management in particular are much too difficult for people that think in IBMerese, compounded with sloppy English.

About the use of language: it is impossible to sharpen a pencil with a blunt axe. It is equally vain to try to do it with ten blunt axes instead.

Besides a mathematical inclination, an exceptionally good mastery of one's native tongue is the most vital asset of a competent programmer.

Many companies that have made themselves dependent on IBM-equipment (and in doing so have sold their soul to the devil) will collapse under the sheer weight of the unmastered complexity of their data processing systems.

Simplicity is prerequisite for reliability.

We can found no scientific discipline, nor a hearty profession on the technical mistakes of the Department of Defense and, mainly, one computer manufacturer.

The use of anthropomorphic terminology when dealing with computing systems is a symptom of professional immaturity.

By claiming that they can contribute to software engineering, the soft scientists make themselves even more ridiculous. (Not less dangerous, alas!) In spite of its name, software engineering requires (cruelly) hard science for its support.

In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included.

Projects promoting programming in "natural language" are intrinsically doomed to fail.

Isn't this list enough to make us uncomfortable? What are we going to do? Return to the order of the day, presumably.......

The most impressive thing about this list, at least to me, is that you could say all of this today and still be wrong only about APL.

So why do I think he would have changed his mind about APL if he'd taken the time to learn something about it? Because of the programming language desiderata he laid down in a 1985 interview with Rogier van Vlissingen, who, by the way, sometimes contributes to the talkbacks here.

The interview as a whole is worth reading, but here's an important bit, starting with an exchange of particular interest to those who, like me, decry Microsoft's role in both defining and meeting consumer expectations about PC software. (Note: my comments appear in parentheses, like this one.)

[Dijkstra] ... User satisfaction is not a quality criterion for a computer product.

[Rogier] What would make you say that?

[Dijkstra] I was introduced to user satisfaction as a quality criterion years ago and I found it ridiculous. You can achieve it in all sorts of ways. For instance by not educating your customers; telling them that it cannot be any better. Postulating that the fact that software has bugs is a law of nature. You can even achieve it by intimidation. The worst part of it is that the goal is too fuzzy to give any technical aid. No, user satisfaction as a criterion is very typically American, because the American interface between the supplier and his customers, is that a supplier leaves the last stages of control to the customer. That is why you always see "satisfaction guaranteed" and if not you may return it: they'll mend it endlessly.

Same thing has happened with the design of programming languages. Programming languages of course have to be user friendly, programmers have to like them. For a long time features of programming languages were included on account of their supposed popularity, the main criterion was will people like it, again stemming from a rather dismal perception of the user. The great change in programming languages came from the fact that we started giving formal definitions of the semantics of programming language concepts. Something you need if you want to be able to prove things about the programs written in it. And even if you do not intend to do that in a formal way, it was an extremely healthy exercise for programming language designers. Formalization acted as an early warning system: if the formal definition of a feature gets very messy and complicated, then you should not ignore that warning.

(APL was/is the only formally defined programming language to achieve wide usage, and it was first sold as a way to formalize the OS/360 design.)

[Rogier] Can you explain some more about those language developments that have these better features?

[Dijkstra] The versions that computer scientists are experimenting with are languages that are not selected for their potential appeal to the uneducated, but are screened by criteria such as mathematical power in a very rigorous sense, mathematical elegance and that kind of research produced all sorts of things. Some stay within the framework of imperative programming languages, because we know so well how to implement them very efficiently. Others are exploring functional programming languages.

(That's APL - elegance, mathematical rigor, both imperative and functional programming modes.)

[Rogier] Can you explain the difference between the two?

[Dijkstra] I can try in a minute. Others are trying to be in their considerations even less constructive. They define the answers by a number of logical equations, leaving it to the implementation to find the solution to that set of equations. The net effect of it seems to be that a full system for really acceptable programming will be at the same time a full system that will suffice for the description of constructive mathematics. What is happening is that the gap between a program a computer may execute and the mathematical proof that the answer exists is narrowing.

(Sure, in APL, these are indistinguishable:)
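
(The expression that originally appeared here isn't preserved, so here's a stand-in of my own making: the claim that the sum of the first N odd numbers is N squared is, in APL, simultaneously a statement of the theorem and an executable test of it.)

      N ← 12
      (+/¯1+2×⍳N) = N*2    ⍝ does the sum of the first N odd numbers equal N squared?
1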

The simplest way to characterize the difference between imperative programming languages and non-imperative ones is that in the reasoning about imperative programming you have to be aware of the 'state' of the machine as is recorded in its memory. In conjunction with that there is a clear distinction between a program on the one end and the information processed on the other. In the case of functional programming you create a language in which you can write all sorts of expressions and evaluating an expression then means massaging that expression until you have it in the shape that you want to have it in.

(Again, a description of APL)

[Rogier] You are speaking of the leading edge of research of computer languages.....

[Dijkstra] Yes. And it is again in danger of being supported to death, because one of the hopes of functional programming is that in the execution of programs written in a functional style it will be easier to exploit a lot of concurrency.

(And, indeed, if there's hope today for an APL resurgence, it's mainly because the language is perfectly suited to exploiting Sun's multi-threaded SMP on a chip - aka Niagara/CMT - much more efficiently than anything else in the inventory.)

[Rogier] How much of this research is at this point flowing back into industry applications?

[Dijkstra] Little but more than you think. I know of a definitely industrial environment, to wit a huge software house that is part of a large oil company and whose charter is primarily to satisfy the software needs of the parent company, so this is most definitely an industrial environment, where in the case of a new program to be developed they made a prototype using a functional programming language in an amazingly short time. In fact in something like better than one tenth of the time another technique would have required. They have an implementation of this functional programming language, but not a very good one; as a result the prototype was unacceptably slow, but the experience was that it was a very important intermediate step towards the final product.

So we have seen already that, though invisible in the final product, novel programming languages and implementation techniques are beginning to play a role behind the scene. This is going to have a profound impact on the software community as a whole, you see the point is that whenever you try to benefit in the development of programs from the availability of machines then obviously the first candidates for automation are those aspects of the programming process that are more or less routine jobs. As a result the really difficult stuff remains. With all our so-called programming tools the net effect is that programming becomes more and more difficult. The easy parts are automated away and the difficult parts remain and that has now reached the stage that it requires of the software developer quite a degree of mathematical sophistication.

I was very amused when some time ago in a strictly industrial environment I observed a heated discussion. The discussion involved a whole bunch of so-called higher order functions; higher order functions are considered too fancy to even talk about to many mathematicians, they are functions that have functions for their argument and may return a function as a value, it was in a group of industrial computer scientists and they talked about higher order functions as if they were the most normal thing in the world.

Fundamentally, Dijkstra wanted a programming language that can handle both symbolic and numerical computation, is logically equivalent to a mathematical notation, and extends the notion of recursion to functions (and operators, although he doesn't say that) that can produce, parameterize, and execute other functions or operators.
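
In APL terms that's what operators are: they take functions as operands and return new, derived functions. A minimal sketch (the examples and data here are mine):

      ⍝ the operator / takes a function and yields a derived reduction function
      +/ 1 2 3 4
10
      ×/ 1 2 3 4
24
      ⍝ the operator ∘. turns any dyadic function into an outer product
      (⍳3) ∘.× ⍳4
1 2 3  4
2 4 6  8
3 6 9 12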

You could do all that with APL in 1971; by 1980 variants like Dyadic and Sharp APL had made the capabilities explicit, and modern descendants like J directly implement many of the key formalizations, thereby merging theory and practice.

So what happened? Well, how do you think the average COBOL programmer reacts to something like this (highly efficient) GCD function? (Note: I know I didn't write it, but I have no record of who did.)
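
The original snippet isn't preserved here; a representative recursive version, written in modern Dyalog-style direct-definition form (which post-dates the code I had in mind, and the name gcd is mine), looks roughly like this:

      gcd ← {⍵=0: ⍺ ⋄ ⍵ ∇ ⍵|⍺}    ⍝ Euclid's algorithm: ⍺ gcd ⍵ recurs on (⍵, ⍺ mod ⍵)
      36 gcd 84
12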

Right. That, and I think the 80s APL community's general acceptance of the idea that the character set was a barrier to adoption was a serious mistake - one that confused marketing with value.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specializing in Unix and Unix-related management issues.