Credit where credit is due

Ritchie vs. Jobs: Fight!

It’s been a bad week for giants of the computer industry. First Steve Jobs and now Dennis Ritchie (the “R” in K&R, or Kernighan and Ritchie, The C Programming Language). If you don’t know who Dennis Ritchie is then, well, you’re probably not reading this. But in a nutshell, he’s one of the key people behind UNIX and the C programming language. The blogosphere is claiming that C is important because it was the first serious attempt at a cross-platform programming language (presumably for certain values of first, serious, and cross-platform, and provided you remember not to count LISP or Forth), but really C is important because it was used to write UNIX and was part of UNIX. For roughly its first 20 years, C wasn’t really any more “cross-platform” than, say, Pascal. But UNIX, it turned out, became the foundation for all modern operating systems, and it was written in C (which made it portable). Many operating systems that aren’t UNIX, such as Windows, are in some sense beholden to it (Windows was built on DOS, which was a clone of CP/M, which was heavily inspired by UNIX).

There’s no denying the enormous contribution of Dennis Ritchie (et al) to modern technology, but UNIX was successful, in large part, because it was “open source”, which led it to become the basis for university operating systems courses. This led to a free clone (BSD) being developed. Which led to the Free Software Foundation. Which made Linux possible. During the 1970s and 1980s, C and UNIX beat out their technical competition not because they were unspeakably brilliant pieces of technology, but because they were good enough, free, and well-known. If UNIX and C had never existed, we could have built our technology stack on top of, say, Pascal and its descendants instead, and we wouldn’t be appreciably worse off. Or, if you love curly braces, BCPL. Lots of people independently solved the problems UNIX and C addressed, and some did it pretty damn well. Certainly most of the computer language geeks I hung out with in the 80s were hoping C (and C++ in particular) wouldn’t win the language wars.

Just for the record: Apple’s first big success, the Apple II, was designed from scratch by Steve Wozniak and ran a BASIC interpreter and disk OS that Wozniak also wrote from scratch. No C. No UNIX. And the Mac? Designed from scratch by Burrell Smith, it ran code written in 68000 assembler by a bunch of legendary hackers, and its main high-level language until the late 80s was Pascal (later Object Pascal). The Mac was inspired by Xerox PARC’s work with Smalltalk, which was in turn inspired by Engelbart’s work (which predated UNIX) and by Simula, which evolved from Algol, the language that eventually spawned C (and Pascal) as well. Oh, and Smalltalk was more cross-platform than C (think Java).

But sure, the Lisa (and MPW) owed a huge amount to UNIX, NeXT was built on top of UNIX and C, Pixar was probably built on UNIX more or less, the iPod’s OS is probably, at bottom, some kind of UNIX clone, and iOS is based on UNIX. It’s an open question how much of this stuff was made possible by UNIX and how much simply used UNIX because it happened to be the dominant platform through historical accident. To look at it another way: we use AC current to power our homes even though Edison (who backed DC) defeated Tesla (who pioneered AC); once Edison had beaten Tesla, he switched to the superior technology anyway. Edison gets more credit than Tesla owing to business acumen and historical accident.

How to reassign credit for anything Apple does

If you’re determined to deny Apple credit for doing anything, here’s a quick guide:

  1. Denial. First of all, everything Apple does is crap, so who would want credit for it? (For extra points — decide it’s crap before you’ve had a chance to use it.)
  2. Just Marketing. Not crap? Well, perhaps it’s not a technical feature but just marketing!
  3. Mere Usability. Maybe it’s merely an easier-to-use version of a feature something else has but that no-one can figure out how to use. So it really is just marketing!
  4. Gimmick. Perhaps it’s too easy to use. Perhaps it’s a “toy”.
  5. Technically Inferior. OK, so it is a technical feature, but surely it’s not as good as some rival product, or at least trails it on some number (resolution, megapixels, GHz).
  6. Done Before. Is it a mobile device feature? Maybe there’s an Android, RIM, or Palm device with a similar feature that predates it. Make sure that you don’t mention the Newton. For bonus points, mention Grid, 2001, or Star Trek (the original series).
  7. Not as good as imaginary products. OK, there’s no such competing product now, but what about products that have been announced but not released? Rumored? That you just dreamed up?
  8. Acquired from another company. Maybe the product wasn’t invented at Apple but merely acquired when Apple bought the company that really invented it?
  9. Acquired talent. It should be pointed out that while it was invented at Apple, the people who invented it learned their stuff before they came to Apple.
  10. Merely software. It’s only software, which isn’t important like hardware. In the end, it was written in C and runs on something based on UNIX so really we need to give the credit to Dennis Ritchie.
  11. It follows obviously from earlier work. Consider von Neumann, Turing, Gödel, Maxwell, Faraday, Lovelace, Bacon, Euclid, Pythagoras, or Plato. Once those medieval monks invented symbolic logic it was all downhill.
  12. Steve stole the credit. And if it turns out that Steve Jobs actually did do it himself and you can’t deny or reassign credit in any other way, remember that it was probably Woz who did it for him.
  13. Too expensive. Oh, it really is hardware? Don’t forget to mention that it’s overpriced.
  14. Wake up sheeple! Hmm, no-one else can sell one for a competitive price. Ouch. When in doubt, point out that it’s not open, anyone who uses it is a “sheep”, and Apple has enslaved its users.

Postscript: I note that Forbes just posted an article entitled “The Inevitable Steve Jobs vs. Dennis Ritchie Comparison”.