
Edsger Dijkstra


"Over my career, I wouldn't say that languages as such have been a major influence. Developing language-independent formal coding strategies has proven far more important. I've benefited from the writings of Dijkstra (immeasurably), Stroustrup, Iverson, Backus, Knuth, Plaugher, Stepanov, Brooks, Bertrand Meyer (with reservations), and not a lot else. I haven't learned anything profound from Wall, but thanks for all the Onions.

Machine code gave me a good underlying model of the machine, essential for many debugging situations, especially back in the day when compilers would often generate faulty code.

APL taught me the value and power of carefully reasoned primitives, the power *and* risk of concision.

C taught me how easy it is to write a loop that's impossible to validate mentally (and then I taught myself how *not* to write such code).

C++ taught me most of what I know about software engineering: programming in the large. C++ manages to be simultaneously better and worse than almost any other language one cares to name. There is a deep truth there that hardly anybody in the industry wishes to accept.

SNOBOL and PL/1 taught me that kitchen sinks are best used for washing dishes.

Perl taught me that it isn't at all difficult to write a complex regular expression that's harder to read than any APL program I ever wrote. I once had to program in APL on a teletype that lacked the APL character set, so every APL symbol was mapped to its ASCII counterpart based on key location. Reading APL code on this teletype was comparable to reading a particularly hairy Perl regex.

PHP taught me that useful code can be written in a language with no coherent center whatsoever.

LISP taught me that the human brain is not a stack machine. I grew up with Meccano. I don't understand the Lego people with all those identical bricks, and I don't understand the LISP people with all those identical cricks.

COBOL taught me separation of concerns: code should be code, comments should be comments.

Python taught me nothing at all. To me Python is just the metric version of PHP, which spares you the headache of guessing which function calls are in Imperial or American units (roughly as arbitrary as whether a Wikipedia page uses British or American spellings). To be honest, I learned more from playing around with ChipWits many years ago. But for some reason I find Python enjoyable, for as much as I've used it.

Pascal taught me that the evolution of a complex program occurs along more than a single dimension. I never enjoyed a single minute of Pascal programming.

By far, I learned the most simply from reading Dijkstra (set aside an hour per page) and practicing the art of coding an algorithm in such a way that by the time you are done, your code couldn't possibly be wrong in any profound way, because you have captured the undiluted purity of essence.

Plauger helped to convince me that computers are *especially* fast at doing nothing. Whenever possible, when a precondition is not met, I just let the code continue, mostly doing nothing (if every statement is coded not to execute in the absence of its precondition, this is an automatic consequence). When the routine completes, I check state variables to see whether the desired actions were accomplished.
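Roughly, the pattern looks like this. A sketch with invented names, nothing more: every statement is guarded by its precondition, nothing throws, and the caller reads state afterward to learn what was accomplished.

    #include <cstddef>

    // Guarded-statement style: when a guard fails, the code simply does
    // nothing, very fast. The caller inspects the returned state to see
    // whether the desired actions were accomplished.
    struct CopyResult { bool opened = false; bool copied = false; };

    CopyResult copy_string(const char* src, char* dst, std::size_t cap) {
        CopyResult r;
        r.opened = (src != nullptr && dst != nullptr);  // precondition guard
        if (r.opened && cap > 0) {                      // guarded statement
            std::size_t i = 0;
            while (i + 1 < cap && src[i] != '\0') {
                dst[i] = src[i];
                ++i;
            }
            dst[i] = '\0';
            r.copied = true;                            // state variable set
        }
        return r;   // no exception, no early return: just facts to check
    }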

I hate exceptions and have never conclusively demonstrated to myself why they are necessary. I suppose they exist to permit integration with code that *doesn't* rigorously guard every statement. I feel confident about my C++ code until the moment I enable exceptions in the compiler. Then I think to myself: this program could potentially fail in 1000 different ways depending on which exception paths are taken. It took the wizards of the STL *years* to make the STL fully exception safe. That troubles me. A lot. More than all the other complaints about C++ piled to the moon and back.

Knuth was wrong about premature optimization. The root of all evil is a premature return path.

There's no act of programming I enjoy more than coding a tricky algorithmic loop and having it work the first time, on every possible edge case. If you take Dijkstra's approach seriously, 90% of your statements derive from your invariant, variant, precondition, and postcondition. It's a bit like solving a Sudoku: each necessary statement engenders other necessary statements, and suddenly, without feeling like you ever did anything, the loop is finished, and exactly right.
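To make that concrete (my own small example, nothing sacred): an integer square root, coded so the invariant and variant dictate every statement.

    #include <cassert>

    // Integer square root, derived Dijkstra-style (overflow set aside).
    // Precondition:  n >= 0
    // Postcondition: r*r <= n < (r+1)*(r+1)
    // Invariant:     r*r <= n < (b+1)*(b+1), with r <= b
    // Variant:       b - r, which strictly decreases, forcing termination
    long isqrt(long n) {
        assert(n >= 0);            // precondition
        long r = 0, b = n;         // establishes the invariant
        while (r < b) {            // invariant && !(r < b) => postcondition
            long m = r + (b - r + 1) / 2;   // r < m <= b
            if (m * m <= n)
                r = m;             // preserves r*r <= n, shrinks b - r
            else
                b = m - 1;         // preserves n < (b+1)*(b+1), shrinks b - r
        }
        return r;                  // r == b: exactly right, on every edge case
    }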

From http://en.wikipedia.org/wiki/Edsger_W._Dijkstra [wikipedia.org]: From the 1970s, Dijkstra's chief interest was formal verification. The prevailing opinion at the time was that one should first write a program and then provide a mathematical proof of correctness. Dijkstra objected that the resulting proofs are long and cumbersome, and that the proof gives no insight as to how the program was developed. An alternative method is program derivation, to "develop proof and program hand in hand". One starts with a mathematical specification of what a program is supposed to do and applies mathematical transformations to the specification until it is turned into a program that can be executed. The resulting program is then known to be correct by construction.

Unfortunately, the modern art of programming consists largely of guessing how the library functions you are calling will respond to various poorly documented edge cases. Ah yes, I fondly recall the first time I coded a C++ application on top of the Winsock API. Eventually I just broke down and wrote the code to assert out every time a function returned an error code I'd never seen before. Then I did the same with the ODBC calls to Jet. Yes, it was specified (not by me) to work concurrently over a database engine where that couldn't possibly succeed. Over the course of testing, I had one function assert out twenty different times before I had handled all the different error codes it managed to generate.

You throw a ball at Jet, you get any number of possible error codes about why the ball was dropped: sucking my thumb, hand in my pocket, looking up at the sky, choking my chicken. Are any of these incomplete passes profoundly different to the calling code? I could never figure that out. What is this profession properly called? It sure wasn't programming as I've ever defined the term.
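The assert-out tactic is nothing fancier than this. A sketch only, with a stub standing in for the real Winsock or ODBC call, and made-up error codes:

    #include <cstdio>
    #include <cstdlib>

    // Stub standing in for a poorly documented API call (a socket or
    // database function, say); in real code this would be the library.
    static int library_call() { return 0; }

    // "Assert out on anything new": enumerate every error code observed
    // so far; abort loudly on an unrecognized one, so each test run
    // extends the list until the API's actual behavior has been mapped
    // out by experiment.
    static int checked_call() {
        int rc = library_call();
        switch (rc) {
        case 0:   return 0;    // success
        case -7:  return rc;   // observed in testing: transient, retry works
        case -12: return rc;   // observed in testing: caller gave a bad handle
        default:
            std::fprintf(stderr, "library_call: unexpected error code %d\n", rc);
            std::abort();      // stop the world; classify this code, rerun
        }
    }

    int main() { return checked_call(); }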

Unfortunately, there's *no* programming language out there that can teach you how to formally iterate your program design over partial, incomplete, ambiguous, and incorrect API specifications.

Nor is there a handy manual for working around what your language forgot to provide. Turns out, you actually *can* learn something deep from JavaScript.

http://www.onjava.com/pub/a/onjava/2006/04/05/ajax-mutual-exclusion.html [onjava.com]: While there are classic algorithms that implement mutual exclusion without requiring special support from the language or environment, even these expect some basics that are missing from JavaScript and browsers like Internet Explorer.

Unbelievable. The world's dominant web scripting language contains not a single concurrency primitive worth having. For the sake of his spirit, I hope Dijkstra passed on to a more enlightened plane of existence.

http://nob.cs.ucdavis.edu/classes/ecs150-1999-02/sync-bakery.html [ucdavis.edu]

A thing of beauty. Add Lamport to that list of people worth reading.
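The bakery algorithm itself fits on a page. A sketch in C++, where std::atomic supplies the cross-thread visibility the classic pseudocode quietly assumes (and which browser JavaScript lacked):

    #include <algorithm>
    #include <atomic>

    // Lamport's bakery algorithm for N threads: take a ticket one greater
    // than any outstanding ticket; the lowest ticket (ties broken by thread
    // id) enters the critical section. Thread i wraps its critical section
    // in lock(i) ... unlock(i).
    constexpr int N = 4;
    std::atomic<bool> choosing[N];   // zero-initialized: all false
    std::atomic<int>  number[N];     // 0 means "not standing in line"

    void lock(int i) {
        choosing[i] = true;
        int max = 0;                 // scan for the highest ticket out there
        for (int j = 0; j < N; ++j) max = std::max(max, number[j].load());
        number[i] = max + 1;         // take the next ticket
        choosing[i] = false;
        for (int j = 0; j < N; ++j) {
            while (choosing[j]) {}   // wait while j is still picking a ticket
            while (number[j] != 0 &&              // wait while j holds a
                   (number[j] < number[i] ||      // lower ticket, or an equal
                    (number[j] == number[i] && j < i))) {}  // one with j < i
        }
    }

    void unlock(int i) { number[i] = 0; }   // hand the ticket back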

http://www.budiu.info/blog/2007/05/03/an-interview-with-leslie-lamport [budiu.info]: People fiercely resist any effort to make them change what they do. Given how bad they are at writing programs, one might naively expect programmers to be eager to try new approaches. But human psychology doesn't work that way, and instead programmers will find any excuse to dismiss an approach that would require them to learn something new. On the other hand, they are quick to embrace the latest fad (extreme programming, templates, etc.) that requires only superficial changes and allows them to continue doing things basically the same as before. In this context, it is only fair to mention that people working in the area of verification are no less human than programmers, and they also are very reluctant to change what they do just because it isn't working.

At this point in my career, I'd probably benefit more by reading *any* significant paper by Lamport than learning yet another programming language."
