Programming Languages I’ve Learned

Sinclair ZX-80: Image courtesy of Wikipedia

Aside: I apologize for the ridiculous length of this post. Please feel free to skip over it or ignore it (obviously you are exactly this free). It’s something of a brain-dump on my part, and related to my consideration of the new Go programming language, about which I will shortly pontificate.

I’m a lazy person. I even have a degree in laziness (i.e. Pure Math). (And Comp Sci is Applied Laziness.) As a result, I’ve made my career in large part as a programmer, and I’ve always been fascinated by computers. Well, ever since I found out that they exist, anyway. I’ve also tried to learn and acquire expertise in as few programming languages as possible over the years, as a result of which I’m only proficient in twenty or so.

The first computer I ever used was some kind of minicomputer at a CSIRO research facility we visited on a school outing.

HP-65: my first hands-on programming experience was with a friend of the family’s HP-65 calculator. This device — which cost around $800 in 1977 — was a programmable calculator with something like seventy steps of program memory (programs were essentially macros comprising recorded keystrokes, branches, and pauses for user input) and a magnetic card storage device, and it could run a version of Lunar Lander — the hottest game of its era. (I managed to win the HP-65’s Lunar Lander game after a few attempts, which positively boggled the mind of the calculator’s owner. /flex)

Canon Canola: it’s hard to imagine Canon being so daft as to name a product “Canola” today, but the desktop programmable calculator from Canon was in fact named the Canola, and was programmed using punchcards — you punched the cards manually using a pencil. (And corrected mistakes by painstakingly taping “chads” back in place.) The Canon Canola had an immense amount of memory (hundreds of steps) and many nice features (you could insert “named” flags and branch to them instead of jumping to fixed addresses, for example) but wasn’t “scientific”, so if you wanted a sin function — well, have fun implementing a Taylor series (or whatever).

Sinclair Cambridge Programmable: having had a taste of programming, I was obsessed. Soon, I started seeing ads in the Scientific American (which was my equivalent of the Sears catalog) for the unbelievably cheap Sinclair Cambridge Programmable calculator. Eventually I got a Radio Shack branded version ($50 in Australia) and started learning to master it. This was a huge challenge after the comparative luxury of the HP-65. To begin with, there was no storage — once you turned off the calculator, your program was gone — so you needed to carefully write down each program to be able to use it later. Second, it had much less memory — 36 steps — and unlike the HP-65, each keystroke counted as a step (on the HP a number of any supported precision counted as a single step), so the numeric constant 2 was two steps (start numeric constant, 2).

This barely useful product introduced me to the concept of re-entrant code — many sample programs used the trick of jumping into a program at different points to reuse it for different purposes (although each branch was three steps). The lunar lander program provided was so short of memory that it required the user to write down their altitude and velocity and re-enter them at appropriate junctures. I shaved steps from it by reducing the precision of values like gravity (1.6 m/s² became 2 m/s², saving two memory steps, for example) and by using X-Y register swaps to use the Y register as an extra memory (the Sinclair had one memory compared to the HP’s six plus the stack), and eliminated everything except having to enter your altitude at the start.

Apple II BASIC: in 1980 my high school ran a fund-raising drive to buy the school an Apple II. Yes, that’s how expensive they were in Australia (about $3000 for a complete system including floppy drive, printer, and color monitor — a bit less than the price of a cheap new car). In 1981 the school offered its first Computer Studies course, and I spent every spare minute at school (and quite a bit of time after school) messing with it. I learned both variants of BASIC (the MS-derived floating point BASIC, and Woz’s integer BASIC — although I had no idea of the origins of either) and wrote a lot of code, some of it pretty impressive for the time. (E.g. I wrote my own variant of “Adventure” because I found the Adventure distributed for the Apple II to have a lame parser and no combat/damage model — so, without knowing what a parser was, I wrote a parser and implemented rudimentary combat mechanics.) I also implemented a good portion of a tabletop naval simulation I owned and loved (Sea Strike), although I never figured out how to implement decent AI.
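(For flavor, here is roughly the kind of verb-noun parsing those early adventure games did. This is a minimal Python sketch, not the original BASIC; the vocabulary and combat numbers are invented for illustration.)

```python
# A minimal verb-noun parser plus a rudimentary damage model, in the spirit
# of early-80s adventure games. All words and numbers are invented.
import random

VERBS = {"go", "take", "drop", "attack", "look"}
NOUNS = {"north", "south", "sword", "lamp", "troll"}

def parse(command):
    """Reduce free text to a (verb, noun) pair, ignoring filler words."""
    words = [w for w in command.lower().split() if w not in ("the", "a", "at")]
    verb = next((w for w in words if w in VERBS), None)
    noun = next((w for w in words if w in NOUNS), None)
    return verb, noun

def attack(attacker_skill, target_hp):
    """Rudimentary combat: roll against skill, subtract random damage."""
    if random.randint(1, 20) <= attacker_skill:
        target_hp -= random.randint(1, 6)
    return target_hp

verb, noun = parse("attack the troll")
print(verb, noun)        # -> attack troll
print(attack(12, 10))    # troll's remaining hit points
```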

With BASIC on the Apple II you pretty much couldn’t write a decent hi-res graphics game — it was just too slow. To do so you needed to code in 6502 assembler and here I ran into a brick wall. The “best” book on the subject took me two months to get by mail order, and when I read it I found it incomprehensible — I just didn’t have the background knowledge to use it, and as by far the most advanced programmer I knew, I had nowhere to go for advice. (It didn’t occur to me to go to the University and find someone to ask. I was so stupid in some ways…)

Sinclair ZX-80: having been burned by Sinclair once, I persuaded my parents to get me their next incredibly cheap and borderline useless product — the ZX-80 (with 1kB of RAM). The best thing about the ZX-80 (and the ZX-81 — which my machine ended up being kind-of upgraded to) was its editor/language combination. It had an incredibly horrible membrane keyboard which made typing painful. To help make up for this, it figured out which command you were trying to type from context, making most commands a single keystroke. This also let it make efficient use of RAM (commands were stored internally as tokens). The editor would not let you enter a syntactically incorrect line of code (you could still blow up via logical errors, e.g. not terminating loops or dividing by zero), and the cursor would change to tell you what was expected next in the current line. The Adventure game I mention above was written in an afternoon on the ZX-80 (which by then had been expanded to a whopping 16kB of RAM) and then took me a month to port to the Apple II — that’s how much better the dev environment on the ZX-80 was. It sucked in every other conceivable way, though.

Pascal: with College came Pascal. For me, the great things about Pascal were how it handled type structures (BASIC didn’t have those), encapsulation (named functions, procedures, and parameters), and scope (!). But, even though it did so relatively cleanly, Pascal still forced you to deal directly with pointers — something I simply loathe and despise at a visceral level — and had incredibly poor string handling compared to BASIC.

MACRO-10: my academic introduction to assembler was in second semester of Comp Sci. MACRO-10 was PDP-10 assembler (we ran it on a virtual machine on a DEC-10 — presumably to avoid having the system hard crash every five minutes when assignments were close to due). The principal lesson of programming in assembler — for me — was that it’s not so much hard as tedious. It’s very worthwhile to understand exactly what you’re asking the machine to do in a high level language (e.g. when you call a function you’re probably pushing variables and pointers onto a stack, executing a jump to some new — seemingly random — spot in memory, and eventually jumping back and taking the return result off that stack), but I wasn’t interested in doing it myself. I’ve essentially avoided assembler ever since.
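To make that concrete, here is a toy sketch of those mechanics, in Python purely for illustration: the “addresses” and calling convention are invented, but the push-arguments, jump, and return shape is the point.

```python
# A toy simulation of what a compiled function call does under the hood:
# push a return address and arguments onto a stack, "jump" to the function's
# code, then clean up the stack and resume. The labels are invented.
stack = []

def call(function_label, *args):
    stack.append("return-address")   # where to resume afterwards
    for arg in args:
        stack.append(arg)            # arguments go on the stack too
    result = CODE[function_label]()  # the "jump" to the function body
    stack.pop()                      # discard the return address
    return result

def add_body():
    # the callee finds its arguments on the stack (in reverse push order)
    b = stack.pop()
    a = stack.pop()
    return a + b

CODE = {"add": add_body}

print(call("add", 2, 3))  # -> 5
```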

The first year of Computer Science essentially drove me away from Computer Science. For about six months I went Cold Turkey on computers.

One of Apple's sample images shown in MacPaint. Within an hour of using a Mac, I was able to draw pretty decent pictures with a mouse.

Then, in the middle of 1984, Apple presented the Macintosh to a packed room. While everyone else went in to watch the presenters, I took the opportunity to use the demo machine in the lobby for the duration (missing the presentation). Oh. My. God. The first time I closed a file and the computer said to me “There are unsaved changes, do you wish to continue?” I was gob-smacked. I then went back and tried opening a file, NOT making changes, and closing it. Nothing. Then I tried going in — making changes — saving, making more changes. It knew! What about opening a file, making changes, hitting the — miraculous — undo, then closing? It freaking knew. (Only a couple of years later would I find out just how horrible it was to implement this wizardry.)

I don’t know if HP demoed the HP-65 on college campuses back in 1974. If so, it may have had a similar impact. The HP-65 was, in many ways, a similarly revolutionary device. But the Mac was a staggering achievement — the kind of achievement that impressed you more the more you knew. The Apple II, first of all, wasn’t particularly unique; second, it basically gave you an inferior command line, an inferior computing experience, and not much horsepower — in essence something less capable in every way than terminal access to a DEC-10. The Lisa — like the Xerox Star — was an impressive demo with a stupefying price tag — although, unlike the Xerox Star, the Lisa was actually usable. But the Mac was simply better in every way than anything else out there. (Purportedly, one of the professors in the Computer Science department took a Mac into his classroom later that year, put it on the bench in front of the students, and said “every other computer is obsolete; this box has 75% of the computing power of a VAX”.)

So I fell in love with computers again, but this time for what I could do with them (e.g. draw, write, desktop publish) instead of what I could program them to do. In fact, I lost all interest in programming, and became interested in using other people’s programs. (Also, once you’ve used MacPaint, the bar for writing your own programs is raised rather ridiculously high. If you’ve never used MacPaint, let’s just put it this way: Photoshop is merely an obvious extension of MacPaint taking advantage of newer hardware, but with less elegance and usability.)

Modula-2: Alan Kay famously said, “People who are really serious about software should make their own hardware.” Almost by extension, if you’re really interested in using software you should make your own. My love of computers reborn, I started doing Comp Sci again. This time, with Modula-2. But Modula-2 was just Pascal with some minor improvements, and Sun Workstations were just like the old mainframes, but faster (and, I’d have to say, with better editors). OK, I loved computers, but I didn’t love Comp Sci.

What I chiefly did with computers was design games and write articles about games. After I’d graduated, published my pen-and-paper RPG, and failed to find a job doing anything remotely interesting, I posted an ad on the noticeboard at the Math department where I still hung out. The thrust of it: “anyone interested in developing next generation computer games, meet me at X”. Two guys showed up, both comp sci students a little younger than me. I gave an outline of what I was thinking about — one of them immediately decided this sounded insanely complex, the other became a lifelong (thus far) friend. We eventually published a ground-breaking and commercially unsuccessful game. Andrew went on to write one of the top low-level debuggers for the Mac (Classic) platform, REALbasic, and a bunch of obscure but highly profitable text retrieval and display code. (He also wrote, in an evening, a PDF viewer that ran in 1MB of RAM and was hundreds of times faster than Acrobat — this was back in about 1995 — but he was forbidden by the company he worked for from ever releasing it or continuing to work on it.) Andrew’s company was recently acquired by Oracle and he’s some kind of bigwig there.

HyperTalk: when I first got my hands on HyperCard (I was playing with some brand new Mac SEs at ADFA) it was almost as revelatory an experience as the Mac. HyperCard was, in essence, the first real IDE. It had some quirks that differentiated it from modern IDEs, but most of these actually weigh strongly in HyperCard’s favor. E.g. in HyperCard your document (“stack”) was everything. You simultaneously modified your programming environment, your document, and your program — all in a fully integrated environment, with a single — extremely capable — programming language, and everything was persistent. In fact, if you wanted to reset state to a fixed starting point you had to program that — by default, state was persistent.

The big things missing from HyperCard were language-level support for media types (it was a little too early for this to be an obvious feature when it was released, and like many revolutionary Apple products it conspicuously failed to get updated once the magnificent 1.0 was released — which, incidentally, didn’t prevent MYST from being developed in HyperCard), native user look and feel, and associative arrays. Pretty much every serious HyperCard programmer had a library of homegrown routines to implement associative arrays via HyperCard’s incredibly good string support, but using strings as “data soups” is and always will be a kludge.
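To show why that’s a kludge, here is a sketch in Python rather than HyperTalk (the field names and delimiters are invented): faking an associative array with a delimited string, versus just having one.

```python
# Faking an associative array with a delimited string, roughly what
# homegrown HyperCard libraries did, versus using a real one.
soup = "name:Frodo;class:Burglar;hp:12"

def soup_get(soup, key):
    # scan every "key:value" pair; breaks if a value contains ';' or ':'
    for pair in soup.split(";"):
        k, _, v = pair.partition(":")
        if k == key:
            return v
    return None

print(soup_get(soup, "class"))   # -> Burglar

# The real thing: constant-time lookup, no delimiter landmines.
record = {"name": "Frodo", "class": "Burglar", "hp": 12}
print(record["class"])           # -> Burglar
```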

To give you a rough idea of how productive a tool HyperCard was, I wrote a relational database program (from scratch — as in from not having any RDB functionality in HyperCard) for a small government office using HyperCard in a few evenings. (This act of wizardry actually changed my life — I moved from being a near suicidally depressed salaries clerk in a personnel department to an internal database consultant within a few weeks, then from that to a job in a multimedia startup four months later.)

4th Dimension: my three months as a CSO2 in the ACT Government comprised picking a multiuser-database package for a large Mac-only government department (I picked 4D) and then implementing three database solutions using it. 4D’s Pascal-like programming language is an absolute travesty — full of gratuitous differences from actual Pascal that are just infuriating, and full of French typing conventions (e.g. French quotation marks are — or were — syntactic elements). Ten years later I would be offered a very well paid lead programmer job at a tech startup using 4D and then quit the job after three days when it finally sunk in that 4D was just as retarded after ten years of development as it had been in 1991 (e.g. you still couldn’t implement abstract events — such as “record update” — but only form events, and nothing about the language’s syntax had been fixed).

Authorware: Authorware was a groundbreaking multimedia “authoring” tool (what a horrible overloading of the word “author”). Authorware’s internal “programming” language comprised a graphical flowchart combined with an Excel-like macro language that conspicuously contained no loops, branches, or control structures. (For those, you used the flowchart.) This made writing and debugging code an unholy mess (an example of the path to hell being paved with good intentions). The whole reason the startup I went to work for used Authorware was that it promised seamless cross-platform development — code on a Mac, publish to Mac and Windows. But the original assumption of the developers was that any heavy duty coding would be implemented as plugins written in C or Pascal. Cross-platform C and Pascal remains a sore point, so the best way to avoid platform problems (and coding in C or Pascal) was to do the hardcore coding in Authorware itself. I ended up writing a relational database system in Authorware, along with numerous other things that would make the local Authorware rep’s jaw drop (completely custom navigation, global user notes, bookmarks). I despised Authorware, but I was damn good at it.

Object Pascal: while Andrew coded the game engine for Prince of Destruction in C and assembler, he knocked together a graphical content builder using Object Pascal and MacApp 2. In order to lighten Andrew’s burden, I ended up taking over development of Mapper, which involved learning Object Pascal and MacApp 2 — a very nice class library for writing GUI apps. Object Pascal’s chief difference from Pascal was support for class-oriented OOP (as distinct from the nearly classless, prototype-based OOP you see in languages like JavaScript). This did a much better job of avoiding pointers, making Object Pascal the first “real” (as in “compiled”) programming language I actually liked.

AppleScript: this is one of the worst programming languages I’ve ever used. It’s been described as “read only” and this is pretty apt. Superficially, it looks like HyperTalk, but while HyperTalk is loosely typed (everything is internally a string or a number, and you don’t need to care which), AppleScript is strongly but implicitly typed, leading to endless frustration and incomprehensible errors. This was not helped by Apple’s atrociously lacking documentation (I only got anywhere with it by buying a very expensive hardcover book, whose title escapes me). Automator is a brilliant effort at exploiting the underlying system hooks that AppleScript uses, but AppleScript must rank as one of the most epic design failures of Apple’s history.

Visual Basic: a simply stunning achievement, Visual Basic was probably — in a sense — the best thing Microsoft has ever done. Just as Apple took the Xerox PARC GUI and made “obvious” changes to it to create something vastly more capable and better in every way, Microsoft took HyperCard and made “obvious” changes to it to create something better in many ways. It treated media types — well, images — as a nearly native type. It let you build native UIs. OK, it ran way slower than HyperCard, and was far less productive to code in than HyperCard (because you couldn’t customize the dev environment for the project you were working on), but you could produce a professional-looking end result, and the internal programming language was far less horrific than Authorware’s. After three years of multimedia development with Authorware, I went to work for a large consulting company and started developing multimedia and “front office” apps using Visual Basic.

HTML: the chief reason I learned HTML was to create a website for Prince of Destruction (the link is to — essentially — the original HTML). HTML isn’t really a programming language, but it’s worth mentioning as my first foray into web development. My proudest achievement with early web development was figuring out how to produce really nice animated GIFs (versus the revolting ones you usually see) exemplified by the animation in this page here. OK, nothing much to boast about — but it was stolen quite a bit, and I got quite a few emails asking me how the frack I did it. (Hint: if you “render clouds” in Photoshop and your image has power-of-2 dimensions, the result will tile.)

Director/Lingo: developing multimedia with Visual Basic was a bit like teaching a dog to walk. Sure, you could do it, but… Once Director 4 came out, its superiority to Visual Basic (or anything else) for multimedia development was simply overwhelming. I ended up making an end-run around my boss (who was dedicated to Visual Basic development — not so much by temperament as company policy) and showed a mockup — built in Director — of a major product we were building for them in VB. Our next version was built in Director — but not by me! Instead we hired the local Macromedia distributor’s top instructor to write it, and I was only called in when performance and bugs were found to be so bad that the client was threatening to walk out. I ended up “rescuing” the first project and writing a new codebase for follow-up projects.

Toolbook: this was a simple Windows clone of HyperCard but — as with Visual Basic — with the “obvious” deficiencies fixed. It supported color natively, and its controls were native. You could produce decent looking software with it. Like HyperCard it was astonishingly stable (it’s amazing how flaky tools were back then — HyperCard and Toolbook are probably the only 90s multimedia development tools that remotely deserve the adjective “stable”). We ended up developing a remarkably rich proof-of-concept demo in Toolbook (integrated with the VB/VC client software) in the space of a week (albeit a week of 18h days), won a huge contract, and then promptly failed to deliver by switching to VB and Robohelp for development. (Most of my “proof of concept” code tends to be robust and functional enough for deployment — I’m weird that way. I don’t think the “real” system ever got significantly more functional than our hack demo.)

Java: relatively early (as in 1996) I got interested in replacing Director et al with web technologies. I thought the web was The Future and that every technology would eventually be subsumed by the Web. Ironically, I then — stupidly — avoided learning much about web technologies for some time — in large part because I got burned so badly by Java. Java did one thing Object Pascal nearly did but failed to do — it got rid of pointers while being a real language. In every other respect, Java is pretty much mediocre-at-best. From an end user’s point of view it was unbelievably slow (now it’s just slow to load) and ugly — but hey, it was neat that it “just worked” more-or-less everywhere. From a language point of view it was a clumsy design (OK, it was nicer than C++), but it was cool that it had a virtual machine and a lot of libraries. From a library viewpoint the libraries were terrible, but it was nice that the language and VM were there. And from a VM point of view it was a very poorly implemented VM. My first serious Java project involved implementing navigable 3D panoramas (kind of like QuickTime VR, which was our inspiration). At the time, QuickTime VR was really impressive for end-users and a royal pain in the ass to develop for (pretty much par for the course for Apple 1984-2003), so we decided to hack together an alternative. Essentially: render a 360 degree panorama, scroll it within a viewport, define “hotspots” with links in the web page, and have them implemented by the “plugin”. I developed two versions of this — one in Java and one in Director/Shockwave. The Director version was so stupendously superior in every respect that I’ve never used Java again. (The only “multimedia” features I was using were drawing a picture and drawing lines and rectangles.)

Games were always my first and best reason for using computers, and one day — out of the blue — a guy I’d had passing contact with while trying to find a publisher for Prince of Destruction called me and asked if I was interested in looking at a game design project. I met with him and a colleague at a conference, he showed me the project, I — off the cuff — gave my opinion on how I’d redesign it — and I was hired. A few weeks later, I’m pitching my design (for which I was never paid, incidentally) to a boardroom full of strangers at a company named Garner MacLennan. It’s a really nice boardroom. There are state-of-the-art 3D graphics framed on the walls, and industry awards everywhere. I’ve just come from work, gotten lost, and run to the meeting in a suit. I stumble in and get asked to pitch my design to prospective developers and money-people for the project. I give a 30 minute pitch — completely unprepared (I’m working 16h days for the Big Consulting Company and haven’t heard from anyone about the project for weeks or months) — answer a few questions, and am then, basically, dismissed.

A few days later, I receive a call from a gravelly-voiced fellow who would like me to come over to Garner MacLennan for a chat. I end up having a conversation with Stewart MacLennan and Jeff (I forget his surname, but it’s not Garner, who had been bought out some years earlier) — the two top guys in the company. Would I like to join them to run an interactive/games division?

C/C++: I’d dipped my toes in the C/C++ world many times before, but if I was going to do hardcore game development — or even just supervise coders — I needed to learn C and C++. So I did. I can’t say I was ever proficient or comfortable — C++ manages to implement all the obviously needed things from Object Pascal (et al) while achieving zero elegance. We licensed a 3D library (from Virtually Unlimited — a Swiss company that has since disappeared from the face of the Earth and even, it seems, the web) that offered both software and hardware-accelerated 3D, and which cost under $50,000 — making it very compelling.

We built a very flexible OO game engine, and started building our game — but we were crippled by several problems. First, it was very hard to hire decent game programmers in Australia at the time — almost impossible, in fact. We could find them in the US, but their salary expectations were freakishly high by our standards, and US coders tend to be ridiculously specialized by Australian standards. (A typical Australian developer is used to doing absolutely everything; Americans want to be told which position on the assembly line they’ll be sitting in. The US system scales better — well, it scales, period — but it has a very high cost of entry.)

Chiefly, our problem was that we weren’t three talented college dropouts in a garage — that ship had sailed with Prince of Destruction (only we weren’t dropouts, which itself was a disadvantage because we’d wasted several years getting college degrees) — we had serious salaries, serious lifestyles, no desire to work 20h days, and we had something to lose — like the $250,000/year I could bring in from corporate multimedia development with virtually no effort. So as a AAA game developer, I failed miserably. But I did ship several kids’ titles — two very successful — and make a tidy amount of money on corporate multimedia.

REALbasic: despite continuing efforts to bloat it and turn it into C++ without pointers, REALbasic remains the most elegant real language I’ve ever used, and the most pleasant and productive programming environment for writing desktop apps — at least relatively small and simple ones — I’ve ever seen. I’ve written numerous programs in RB over the years; I was user #2 (Andrew Barry being user #1), was a technical referee for the first edition of O’Reilly’s REALbasic: The Definitive Guide, and ran the most popular REALbasic forum until Real Software finally created its own official forum.

Blitz3D: every time I find out about a new 3D game development tool (that doesn’t cost six figures) I’ll give it a shot. Blitz3D was the first such tool that just grabbed me and didn’t let go. It was Basic, but with the “obvious” problems fixed, and a first-class hardware-accelerated 3D engine just sitting there. I got as far as developing the basics of a space shooter and the basics of a dungeon crawler in it before Real Life took over (i.e. my corporate multimedia money tree died) and I had to get a Real Job. I never really got into BlitzMax, chiefly because proper 3D support has yet to appear, and meanwhile the stupendously superior Unity 3D came out.

PHP: my wife is a Social Psychologist, and a lot of her research involves questionnaires. I think using paper questionnaires is ridiculous, so I suggested she might be better off using electronic questionnaires — and of course it fell to me to implement them. I ended up writing RiddleMeThis for her doctoral dissertation (it’s now in use at quite a few universities, and earns me “Chinese Takeout Money”). To handle web deployment I wanted to use the most ubiquitous possible technology with the simplest possible end-user setup. In other words, I had to pick between Perl and PHP. I implemented the basic runtime in both Perl and PHP, and quickly picked PHP. I know there’s plenty of prejudice in both directions, but I think in the end that if you love the UNIX command line, you probably think Perl is clearly superior. If you don’t, you don’t. I don’t. Back when I first worked on RiddleMeThis, PHP4 was current, PHP3 was still common, and OO programming in PHP was a bit of a nightmare. I’ve recently switched to near fulltime PHP development (some of my time is spent in REALbasic and JavaScript) and am using PHP5’s OO extensively — like everything in PHP it’s a lot uglier and messier than it could be, but it’s not terrible. PHP is the Visual Basic of Web Development — I’ve yet to find the REALbasic.

Flash/ActionScript: I got a job in Web Advertising and naturally had to learn Flash and ActionScript. (Most of the ActionScript in the world is so remedial it makes you weep — kind of like how, until about 2004, the chief purpose of JavaScript was to implement rollovers.) ActionScript 2 (and 3, but I’ve never had to do much with 3) is a nice language — very similar to JavaScript — and the class library is quite nice, but Flash itself is horrible. (CS4 supposedly — finally — addresses its major shortcomings, such as its astonishingly poor drawing and animation tools (recall that it is, at its heart, a drawing and animation program — the programming language and VM were added long afterwards), but I have, thankfully, not had to do much of anything with Flash for quite some time now.) I’ve wasted enough of your time already if you’ve read this far, so I won’t provide a litany of Flash’s lousiness — a decent examination would be longer than this ridiculously long post.

Python: the reason I got into Python is that for several years my wife worked in a VR lab, and the tool they worked with was programmed in Python. As a result, the hairier programming problems always fell to me. I really like Python — it’s kind of close to my idea of a perfect language, except for indentation. I think that semantic indentation is perhaps the stupidest thing I’ve seen since C declarations. If ever there were a language feature guaranteed to provide ammunition with which to blow off your own feet, this is it. It’s said that in conventional C programming 50% of code (and hence bugs) is related to memory management. In my experience, the thorniest bugs in Python are almost always related to indentation — which is pretty sad, because Python has eliminated a lot of other common sources of bugs (including most memory management) through good language design. Oh well. (The other major failing of Python is that it’s too dynamic to be easy to compile, and — frankly — producing a reasonably good interpreted language isn’t particularly hard (I’ve written a few interpreters in my day — nothing major); producing something like Python that compiled to small, fast binary code — now that would be cool.)
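A contrived example of the kind of indentation bug I mean; one extra level of indentation and the program is still perfectly legal Python, just wrong:

```python
# Two syntactically valid functions; the only difference is the indentation
# of one line. No compiler error, no warning -- just the wrong answer.
def report_correct(items):
    lines = []
    for item in items:
        lines.append("item: %s" % item)
    lines.append("%d items total" % len(items))       # outside the loop
    return lines

def report_buggy(items):
    lines = []
    for item in items:
        lines.append("item: %s" % item)
        lines.append("%d items total" % len(items))   # one extra indent:
    return lines                                      # total appended per item

print(len(report_correct(["a", "b"])))  # -> 3
print(len(report_buggy(["a", "b"])))    # -> 4
```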

JavaScript: I avoided JavaScript for a very long time, owing almost entirely to prejudice. When you first work with JavaScript there are two things likely to infuriate you — or at least there were in 1998, say, when I first started messing with it. First of all, you have this huge, mostly/badly undocumented thing to code against — the Document Object Model. That’s not JavaScript’s fault, it’s the browser’s fault. But it gets blamed on JavaScript by people who don’t know better — i.e. most JavaScript coders. The other is the absolute lack of good debugging tools. (This is somewhat, but only somewhat, addressed by modern browser debuggers such as Firefox’s Firebug, and Safari’s and IE8’s built-in debuggers.) JavaScript debuggers were still pretty much a pipedream when I was forced to get serious about it — I implemented a bunch of ad unit types for Fastclick.com (which were never released) and then Valueclick Media (some of which are still in use). It was very educational. Today, I’d say JavaScript is my favorite language of all time — even better than REALbasic. Yes, it has lots of ugly stuff, but you don’t need to use it. Yes, the DOM is revolting, but you don’t need to look at it too often. The chief problem with JavaScript — as with Python — is that it won’t compile to small, fast binary code. If it did, it would be The Holy Grail.

Perl: I first grappled with Perl when developing the online runtime for RiddleMeThis, but had to really learn it once I started writing server code at Valueclick Media. My first task was porting PHP code I had written — which dynamically generated HTML/JavaScript ads from an XML feed — to Perl. Later I would write UI code for the admin system in Perl (which made for great demos but, for various reasons, I think never went live). Nothing I’ve seen of Perl has made me regret picking PHP for most of my web development. The things Perl programmers seem to be proudest of are things I think are basically shameful — writing code that makes no sense when first inspected. I also think that any language — including PHP and BASIC — that requires you to type some special symbol (like $) in front of variable names was created by a mental cripple (I realize UNIX shell scripts fall into this category, but they have the — partial — excuse of being an innovative hack from yesteryear). If you need to use different special symbols for different kinds of variables, you’re an even worse kind of cripple.

Unity/JavaScript: Unity’s scripting system is based on Mono, which is itself a “clone” of .NET/CLR, itself a conceptual copy of Java (specifically of the Java VM and runtime libraries — C# is the Java copy). One of the problems with Java was that the Java VM was kind of seen — it seems to me — as a necessary evil to make Java work rather than as a fabulous product in its own right. If Sun had pushed the Java VM as the main product, with Java as merely one example of a language that ran on it among others (e.g. including a less C-like language that didn’t suck), and had actually worked harder to make the Java VM something worth pushing, the world would be different. As it was, Sun released Java after years of internal development, and Microsoft was quickly able to build a better, faster, stronger VM. Microsoft — the guys who normally take three versions to produce a halfway decent clone of a mediocre product. Anyway, Microsoft saw what Sun didn’t — that the idea of a decent “virtual machine” was the valuable component of Java (not that Smalltalk et al hadn’t done this before).

Anyway, Unity uses Mono to support scripting, which means any Mono-compatible language can be used; what’s more, because the CLR lets such languages talk to each other, you can pretty painlessly mix languages in one project without even writing an API. In other words, I can write code in C#, Boo, and Unity’s JavaScript, and code in one language can call functions and instantiate objects implemented in another. Very, very cool. (See a lot of that in the Java world, folks?) It’s pretty clear that Unity made the early and wise decision to provide a relatively simple scripting-language alternative to C#, and to do this they implemented a subset of JavaScript they initially called UnityScript, but then renamed JavaScript (for marketing reasons, creating much confusion thereafter).

Unity JavaScript is — simply put — nearly as nice as regular JavaScript, with the Unity and Mono runtimes to talk to instead of the DOM, and compiled to bytecode — i.e. running at similar speeds to Java, C#, etc. — which is to say in the ballpark of C. This doesn’t make Unity JavaScript the Holy Grail, but it does show how nice a Holy Grail language could be.

Objective-C: everything I know about Objective-C and all my experience with it make me like it. That said, I really see no reason to commit to it. Indeed, the first and only Objective-C program I’ve written from scratch was a mortgage calculator for the iPhone — so I can’t really claim to be either comfortable or proficient with it — but I mention it because I am pretty familiar with it, I think I get it, and I think that of all the major “real” languages out there (i.e. languages that compile to native binary and can achieve C-like performance when it matters) it comes closest to being The Holy Grail. Unlike C++ or Java, it is almost as dynamic as you want it to be without resorting to special libraries, more syntax, and so forth. And it’s possible to “harden” your dynamic code as much as you want or need to — you can convert a dynamically dispatched call into a hard function address to eliminate the overhead of dynamic calls — in other words, you can be as C-like in performance as you want without massive rewriting or refactoring, or suffering the agony of strong typing everywhere. The language is also verbose — but in a good way (i.e. a way that doesn’t involve a huge amount of typing, and does tell you what’s going on). C++, Java, and C# are all verbose in ways that bury what’s going on in a bunch of uninformative type declarations and casts that convey no useful information (99% of the time).

Almost all of the problems with Obj-C are practical — you can’t write cross-platform software (even iPhone and Mac OS X software don’t mix). You can’t really write Windows or Linux software either (unless you target GNUstep or feel like writing your own class library from scratch — neither is terribly practical). So Obj-C remains a very attractive boutique language — more practical than the many even more attractive boutique languages (mostly Lisp variations) but not practical enough. If, ultimately, your software needs to run in lots of places, Obj-C isn’t your friend. (I currently write utilities used interchangeably on Mac, Windows, and Linux — I often don’t even look at the Linux versions before deploying them, but apparently they work Just Fine. Thank you, REALbasic.)

So, by my count, that’s well over twenty languages I’ve achieved “sufficient proficiency to earn money” with since 1977, despite being lazy and avoiding any programming for about five years. And I’m not including things like Word and Excel macros, MaxScript, XSL, CSS, VRML, and various other pseudo-languages that are essentially trivial for anyone with programming experience and the necessary motivation to learn. (OK, I did mention HTML, but I did say it’s not a real programming language.) I’m also not including languages I spent enough time with to familiarize myself (Lisp, Ruby) but was never motivated to build anything real in.

OK, so what about Go?