Serving b8r

In a former life I worked on optimizing delivery of a fairly large website. I won’t pretend I understood a fraction of the detail, but I had a pretty good idea of the big picture and in a couple of places I drilled down to the bottom-most details.

This isn’t new to anyone who pays attention, but scale makes simple things hard.

The basic tricks to getting a web page to load fast are:

Do as little as possible:

  • Make everything as small as possible.
  • Make everything as simple as possible.
  • Be as asynchronous as possible.

Do it as infrequently as possible, as fast as possible, and in parallel:

  • Minimize the round-trip time from client to server.
  • Parallelize everything as much as possible.
  • Split stuff up enough to make it parallel, but not so much as to increase the number of round-trips. (To some extent, SPDY/http2 is solving this for us.)
  • Minimize the number of round trips.
  • Maximize cache utilization.

The grandparent of bindinator was bind-o-matic, which was not designed with all of these things in mind. In particular, it made no real attempt to leverage parallelism or asynchronicity. When Craig, Josiah, and I wrote bind-o-matic, the “state of the art” was:

  • Figure out what your dependencies are.
  • Compile them into a big blob.
  • Minify the blob.
  • Give the client the blob.

Bind-o-matic’s approach was: “be so small and light that you don’t need to do clever shit to get good performance” (during development) because you can always do that later. We actually compiled our LESS on the client and it didn’t cause performance problems (once we forked less.js and sped it up a bit, and cached compiled CSS in localStorage).

While almost any javascript web application architecture can be served as above, more fine-grained optimization (e.g. trying to get the user to interactivity as fast as possible) is a tougher problem, especially when you have little or no ability to do less (e.g. when almost every person in the organization is incentivized to make your application bigger, slower, and more complex).

And you might be committed to using a big, complicated framework that is virtually guaranteed to make anything it touches bigger, more complex, and less asynchronous.

Anyway, I designed b8r to be as small, simple, and asynchronous as possible but left delivery optimization “for later”. I assumed I could just point webpack (or webpack2) or some more sophisticated tool (such as the stuff I had worked on) at anything I built later.

I did do one thing, though.

I wrote my own require implementation because I started reading the documentation for existing implementations and my eyes glazed over. In particular, none of them seemed as straightforward to use as the one I’d gotten used to in my former life (and with which I had deep familiarity).

Now, my require is purely client-side, which means that it is a big performance problem. Consider the following code:

const foo = require('foo.js'); // blocks until foo.js has been fetched and evaluated
foo(17);

The call to require must be synchronous. But what does require do behind the scenes? Roughly the following (see the sketch after this list):

  1. Use an XMLHttpRequest to pull “foo.js” from the server.
  2. Wrap a Function instance around the code inside it.
  3. Pass an object to the function.
  4. Return the “exports” property of the object when the function returns.
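
In code, that looks something like this minimal sketch — my assumption as to the general shape, not the verbatim implementation (the real require also handles relative paths and nested requires):

// A synchronous, client-side require: every cache miss blocks the main
// thread for a full round trip to the server.
const _module_cache = {};

function require (path) {
  if (!_module_cache[path]) {
    // 1. Pull the module source from the server with a synchronous
    //    XMLHttpRequest, halting execution until it returns.
    const request = new XMLHttpRequest();
    request.open('GET', path, false); // false = synchronous
    request.send();
    // 2. Wrap a Function instance around the code inside it.
    const factory = new Function('module', request.responseText);
    // 3. Pass an object to the function for the module to populate.
    const module = { exports: {} };
    factory(module);
    // 4. Return the "exports" property of that object.
    _module_cache[path] = module.exports;
  }
  return _module_cache[path];
}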

It really doesn’t matter how asynchronous your code is if every dependency involves halting execution while round-tripping to your server… recursively in the case of a file that, itself, has dependencies.

This behavior actually throws warnings in Chrome…

Chrome no likee

Chrome only complains about this the first time it happens, but it happens a lot.

Now the solution for this is to compile your javascript code on the server and deliver some kind of optimized blob — site.min.js say. This is exactly what tools like webpack do: they watch your code tree and trigger recompiles on the fly. Webpack offers a dev server that sets up a backchannel to the client and refreshes the browser automagically when there are code changes.
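
For reference, a minimal webpack configuration of that kind might look like the following (illustrative only; the entry point and output names are my assumptions):

// webpack.config.js: walk the dependency tree from one entry point,
// concatenate everything into a single blob, and serve it from a dev
// server that refreshes the browser when the code changes.
module.exports = {
  entry: './source/main.js',
  output: {
    path: __dirname + '/build',
    filename: 'site.min.js',
  },
  devServer: {
    inline: true, // the auto-refresh backchannel
  },
};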

Sounds pretty good, right? — but it’s about 1/10 as responsive as using b8r and just forcing refresh. I fucking hate having to compile my code all the time, even if all the compiler does is walk a tree and concatenate a shitload of files wrapped in function calls and assignment statements and then call uglifyjs.

But that’s on a local dev server. What happens when you stick this code on a real server and the round-trip goes from ~0ms to ~100ms? It turns out that on the project I’m working on it changes my web application’s spin-up (with nothing cached) from ~600ms to ~1500ms. (Aside: this is a real web application with a shitload of functionality talking to production servers with no back-end optimization. In my past life, loading in ~1500ms from a real server would have caused spontaneous orgasms. When I told people that performance like this was achievable I was assumed to be a naive fool. No, I’m not bitter!)

So, how to do all this stuff on the client?

  1. Make b8r’s use of require asynchronous. E.g. b8r synchronously loads b8r.dom.js in the course of its own loading, so load its dependencies asynchronously before loading b8r itself asynchronously (see the sketch after this list).
  2. Get require to warn whenever it loads a module synchronously. Ick.
  3. OK, get require to return a JSON structure of synchronously loaded modules and load them asynchronously before doing anything else.
  4. Repeat step 3 until no warnings. (This took three iterations.)
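
In code, the fix amounts to something like this hypothetical requireAsync (a sketch, not b8r’s actual API; it reuses _module_cache from the earlier sketch): pre-populate the module cache so the synchronous requires inside b8r become cache hits.

// Fetch module source without blocking, evaluate it, and populate the
// same cache the synchronous require consults.
function requireAsync (path) {
  if (_module_cache[path]) {
    return Promise.resolve(_module_cache[path]);
  }
  return fetch(path).then(response => response.text()).then(source => {
    const module = { exports: {} };
    new Function('module', source)(module);
    _module_cache[path] = module.exports;
    return module.exports;
  });
}

// Load the dependency before b8r itself; b8r's internal synchronous
// require('b8r.dom.js') then resolves from cache with no round trip.
requireAsync('b8r.dom.js')
  .then(() => requireAsync('b8r.js'))
  .then(() => { /* start the app */ });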

Loading time from the stage server went from ~1500ms to ~600ms with no server-side optimization whatsoever. Not bad for a late night hack.

Now wouldn’t it be nice if this were all automatic?

I started writing this post on Friday evening, but my first stab at automating this didn’t work and my brain was too fried to fix it.

In order to function, require tracks all the modules it loads, and it already replaces itself recursively to handle nested requires (to allow for relative paths in require statements), so all I needed to do was track one level of dependencies and then generate a list of preload “waves”, where each wave comprises all modules with no unloaded dependencies. (Circular dependencies will be detected and throw errors.)
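
The wave computation boils down to something like this (a sketch assuming dependencies are tracked as a map from each module to its direct dependencies; not the verbatim code):

// Group modules into preload "waves": each wave contains every module
// whose dependencies were all satisfied by earlier waves. A pass that
// makes no progress means a circular dependency.
function preloadWaves (deps) {
  const pending = new Set(Object.keys(deps));
  const waves = [];
  while (pending.size) {
    const wave = [...pending].filter(
      module => deps[module].every(dep => !pending.has(dep))
    );
    if (wave.length === 0) {
      throw new Error('circular dependency among: ' + [...pending].join(', '));
    }
    wave.forEach(module => pending.delete(module));
    waves.push(wave);
  }
  return waves;
}

// preloadWaves({ 'b8r.js': ['b8r.dom.js'], 'b8r.dom.js': [] })
// yields [['b8r.dom.js'], ['b8r.js']]: load wave 0, then wave 1, and so on.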

Oh, and this eliminates the need for b8r to do anything clever internally. The new solution is general and fixes b8r as well as everything else.

const preload_data = [/* data from require.preloadData() */];
require.preload(preload_data).then(() => {
  /* ... */
});

So, now the steps are:

  1. Run require.preloadData() in the console, which spits out JSON data.
  2. Now call require.preload(), passing the data from step 1; it returns a promise that resolves once everything has loaded asynchronously.

If dependencies change, everything will still work, but dependencies that force a synchronous request will generate console warnings.

As a nice bonus, this improves the load time of the b8r demo page by over 80%.

Tempted to switch to Windows

I love this laptop. I love the trackpad on the right. Where do I sign?

Before you decide my blog is suddenly interesting because I’m a hemi-demi-semi-prominent pro-Apple guy who is switching to Windows, hold your horses. It’s not. I’m not.

The correct, non-linkbaity headline should be:

“WTF Apple?! Macs suck compared to Windows PCs that are more expensive and which you can’t actually buy.”

All users of high-end Macs suffer from PC envy because there’s always a PC out there that scratches a particular hardware itch. E.g. nVidia has just released some insanely nice GPUs, and on the Mac, if you’re lucky, you have a fairly recent mobile version of a mid-range nVidia GPU from the last generation. Not even close to the same league. Similarly, the current-generation Apple notebooks use Intel’s chipset from last year and are thus limited to 16GB of RAM. 16GB of RAM is so 2012, for fuck’s sake. (Heck, my 2012 Mac Pro, which is really just a 2010 Mac Pro, has 36GB of RAM, and it’s not even trying.)

So this morning I read another “fuck you and your lame-ass hardware and USB-C ports, I’m switching to Windows” post on Hacker News. I don’t remember if it was on the comments thread of the post or HN itself (since I can’t find either anymore) but there was a discussion of what Windows laptop (“other than a Dell XPS”) to get if you want to switch and don’t want a piece of shit (i.e. pretty much any Windows laptop). The replies were illuminating (including quite a few saying pretty much, ‘what do you mean “other than a Dell XPS”, that’s a piece of shit’ — a sentiment with which I can agree based on first person experience), and pointed at the Razer Blade and Razer Blade Pro.

So I took a look.

These are sold as high end laptops — CNC milled chassis, backlit keys with programmable colors (for which they provide an API, so you can have keys light up indicative of, say, the health of your team-mates in a multi-player game), 1080P or 4K displays, and fantastic specs (e.g. nVidia 1080 in the Pro). In fact the specs were so good I simply wanted to know two things:

  • What is the battery life like?
  • How much?

The answers were:

  • We aren’t going to tell you.
  • More than a Macbook Pro (and, really, fair enough!), and by the way:
  • We don’t have any to sell and can’t tell you when we will.

Aha! Checkmate Apple. I guess there’s a reason why the laptops out there you can actually buy are either worse than Apple’s or cost about the same.

HyperCard, Visual Basic, Real Basic, and Me

When the Mac first appeared it was a revelation. A computer with a powerful, consistent user-interface (with undo!) that allowed users to figure out most programs without ever reading a manual.

I can remember back in 1984 sitting outside the Hayden-Allen Tank (a cylindrical lecture theater on the ANU campus that tended to house the largest humanities classes and many featured speakers) playing with a Mac on display while Apple reps inside introduced the Mac to a packed house. (My friends and I figured we’d rather spend time with the computer than watch a pitch.)

How did undo work? It wasn’t immediately obvious.

When we quit an application or closed a document, how did the program know we had unsaved changes? We checked: if the document had no changes, or the changes had been saved, the computer knew.

We were hardcore math and CS geeks but computers had never, in our experience, done these kinds of things before so it took us a while to reverse-engineer what was going on. It was very, fucking, impressive.

But it was also really hard to do with the tools of the time. Initially, you couldn’t write real Mac software on a Mac. At best, there was MacPascal, which couldn’t use the toolbox and couldn’t build standalone applications, and QuickBasic, which provided no GUI for creating a GUI, and produced really clunky results.

To write Mac programs you needed a Lisa, later a Mac XL (same hardware, different software). It took over a year for the Mac SDK to appear (via pirate copies), and it was an assembler that spanned multiple disks. Eventually we got Consulair-C and Inside Macintosh but, to give you an idea, the equivalent of “hello world” was a couple of pages of C or Pascal most of which was incomprehensible boilerplate. The entire toolbox relied heavily on function pointers, really an assembly-language concept, and in some cases programmers had to manually save register state.

No-one’s to blame for this — Xerox provided much cleaner APIs for its much more mature (but less capable) GUI and far better tooling — but the cost was a computer that ran dog slow, that no-one could afford, and that was functionally far inferior to the Mac.

The first really good tool for creating GUI programs was HyperCard. I can remember being dragged away from a computer lab at ADFA (where a friend was teaching a course on C) which had been stocked with new Mac SEs running HyperCard.

For all its many flaws and limitations, HyperCard was easy to use, fast, stable, and forgiving (it was almost impossible to lose your work or data, and it rarely crashed in an era when everything crashed all the time). Its programming language introduced a yet-to-be-equalled combination of being easy to read, easy to write, and easy to debug (AppleScript, which followed it, was horribly inferior). When HyperCard 2 brought a really good debugger (but sadly no color) and a plugin architecture, things looked pretty good. But then, as Apple was wont to do in those days, Apple’s attention wandered and HyperCard languished. (Paul Allen’s clone of HyperCard, Toolbook for Windows, was superb but it was a Windows product so I didn’t care.)

Eventually I found myself being forced to learn Visual Basic 3, which, despite its many flaws, was also revolutionary in that it took HyperCard’s ease of use and added the ability to create native look and feel (and native APIs if you knew what you were doing, which I did not). With Visual Basic 3 you could essentially do anything any Windows application could do, only slower. (HyperCard was notably faster than VB, despite both being interpreted languages, owing to early work on JIT compilers.)

After using VB for a year or two, I told my good friend (and a great programmer) Andrew Barry that what the Mac really needed was its own VB. The result was Realbasic (now Xojo) of which I was the first user (and for a long time I ran a website, realgurus.com, that provided the best source of support for Realbasic programmers). Realbasic was far more than a VB for the Mac since it was truly and deeply Object-Oriented and also cross-platform. I could turn an idea into a desktop application with native look and feel (on the Mac at least) in an evening.

When MP3 players started proliferating on Windows, I wrote an MP3 player called QuickMP3 in a couple of hours after a dinner conversation about the lousy state of MP3 players on the Mac. By the next morning I had a product with a website on the market (I distributed it as shareware; registration was $5 through Kagi — RIP — which was the lowest price that made sense at the time, I think Kagi took about $1.50 of each sale, and I had to deal with occasional cash and checks in random currencies).

Over the years, I wrote dozens of useful programs using Realbasic, including a few commercially successful ones (e.g. Media Mover 3 and RiddleMeThis) and an in-house tool that made hundreds of thousands of dollars (over the course of several years) with a few days’ effort.

Today, I find Xojo (as Realbasic has rebranded itself) bloated, unstable, and expensive. It has never captured native look and feel in the post-Carbon world on the Mac, and anything that looks OK on Windows looks like crap on the Mac and vice versa, which undercuts its benefits as a cross-platform tool. Also, my career has made me an expert on Javascript and web development.

So my weapons of choice for desktop development these days are nwjs and Electron. While web apps don’t have desktop look and feel (even if you go to extremes with frameworks like Sproutcore or Cappuccino), neither do many desktop apps (including most of Microsoft’s built-in apps in Windows 10). Many successful commercial apps either are web apps (e.g. Slack) or might as well be (e.g. Lightroom).

I mention all of this right now because it closes the loop with my work on bindinator — anything that makes web application development faster and better thus helps desktop application development. I think it also clarifies my design goals with bindinator: I feel that in many ways ease of development peaked with Realbasic, and bindinator is an attempt to recreate that ease of development while adding wrinkles such as automatic binding and literate programming that make life easier and better.

Email & Equality

Since today is inauguration day, my thoughts are turning back to the last eight years and how we came to be inaugurating a Republican president, again, despite the fact that most Americans disagree with the GOP on most matters of substance.

It’s Not About Women

First off, let’s address the claim that Hillary lost because of American sexism. Yes, Donald Trump is an unreconstructed 1950s male stereotype (i.e. a horrible human being), and many Americans — including many women, latinos, and a surprising number of blacks — chose to overlook this. But the sexism claim ignores the fact that the GOP has been consistently lowering the bar for whom they will nominate for office; it always causes outrage on the left, and it never matters.

Ike was a general. Nixon was an alcoholic witch-hunter. Reagan was a stool pigeon and an idiot. Quayle was an even bigger idiot. Palin made Quayle look professorial. Republicans don’t care if the president (or a senator, or a supreme court judge) has brains, or even sound character: they just want tax cuts and they’re pretty sure their guy is more likely to give them than the other person.

In fact, it’s quite surprising to me that the first black president turned out to be a Democrat, and the first female candidate was also a Democrat. It’s actually conservatives who tend to nominate minorities because it lets them ratchet up the crazy elsewhere. (Margaret Thatcher. Clarence Thomas. Heck, Neville Bonner.)

Incidentally, this is also the same reason that things like sexual peccadilloes and shady practices that would utterly destroy a Democrat seem to slide harmlessly off Republicans.

By the way, I should pause here and say that this has nothing to do with parties. When the Democrats were the party of White Supremacy and the Republicans were the party of Management it was the Democrats who were similarly immune to charges of corruption and sexual misconduct. When the Republicans subvert democracy today and argue that it’s something “everyone” does, they invariably point out actions of Dixiecrats — the folks who left the Democratic party after Roosevelt put desegregation into the Democratic Party platform and joined the Republicans.

A Thought Experiment

A very popular experimental template in the social sciences is to take some common process, like applying for a job or testifying in court, and compare how well candidates do if you signal that a participant is male or female, black or white, has a prison record or not, and so forth, find out there’s a different outcome (which I imagine there almost always is given a nearly inexhaustible number of disadvantaged categories of people), publish the results, and inch closer to tenure.

E.g. I heard on Radio Lab, and I have no reason not to believe, that if you apply for a job using a stereotypically black male name (such as “Jamal”) you are much less likely to be called back than if you use a stereotypically white male name (such as “Steve”), even if the white CV adds a criminal record. The white name is equivalent to eight years of experience. (This implies to me that whatever criminal record they invented was pretty minor.)

The same kind of study has shown women to be less credited as expert witnesses, less likely to be promoted, and so forth and so on. There’s no doubt a lot of sexism in our society, but I’m pretty sure women aren’t as far behind men as blacks are behind whites (eight years experience or a prison record…), and Barack Hussein Obama is more than a stereotypically non-white name. His middle name is the same as a guy we went to war with twice, and his surname is one letter away from Public Enemy Number One when he ran for office.

Obama was an exceptional candidate — he didn’t just beat Hillary for the 2008 nomination, he beat Biden (whom most Democrats think would have been a better candidate than Hillary) and Kucinich (who was a better Sanders than Sanders). And then he beat John McCain and Mitt Romney, the best candidates the Republicans have had in my lifetime.

Now, let’s look at Hillary. Imagine for a moment that Hillary Clinton were in fact some random male Democrat you’d vaguely heard of with her exact track record (post First Lady, since it’s hard to imagine a man with Hillary’s baggage from being married to Bill). So, forget Whitewater and Lewinski and just think — New York senator with a typically exceptional Ivy League education and legal background but no great accomplishments or distinction who then served as Secretary of State from 2009-2012. Would you elect him?

What if I remind you that Chelsea Manning released 10M State Department cables in 2010, and that despite this our candidate continued to use outdated and insecure email practices in direct contravention of State Department rules of which, apparently, he remained willfully ignorant throughout? What if I remind you that the 2012 Benghazi attack happened on his watch despite repeated requests for upgraded security? And yes, lots of requests are made, but this was in Libya during the aftermath of a war. And yes, it was a subordinate who turned down the requests, but who hired that subordinate?

Oh, and by the way, what Good Things happened in 2009-2012 that our candidate can point to?

I’m not saying Clinton did anything criminal. I’m saying that in any reasonable political system she would have been held accountable for Benghazi, forced to resign, and her career would have ended. Similarly, the email business reflects three spectacular failures of judgement (first, ignoring security policies; second, continuing to ignore those policies after an epic security breach; third, failing to meaningfully improve said policies after said breach). Again, had she still been Secretary of State when the email business came out, she should have been fired for it, and that alone would probably have ended her political career.

By the way, I choose to give her a free pass on the Iraq war vote, because I think she did it as a political calculation, and it was a reasonable choice at the time. (I’m actually far more critical of the far broader, unthinking support for the invasion of Afghanistan.) But for some of my friends her vote on Iraq, alone, is unforgivable.

Trump’s done a lot of shady and unpleasant things to people over the years — spending other people’s money and saddling them with his debts, stiffing contractors, ogling pageant contestants (for sure), molesting women (most likely) — but there’s no positive evidence of Trump’s ignorance or incompetence in his chosen profession. He may well be an ignoramus (and bigot) in the same mold as Henry Ford (who nevertheless was a great businessman and provided many jobs to blacks). Hillary is a professional politician and civil servant who can’t use a smartphone or a computer and has made spectacularly poor judgement calls in her chosen profession. (Kellyanne Conway points out, in reference to Russian interference in the campaign, that the Russians didn’t make Clinton spend money in Georgia instead of Michigan or Wisconsin.)

Trump is (rightly) decried as intellectually incurious. But how is it OK for Hillary to not learn to use a smartphone, or email, or a computer when both are, or should be, a constant part of her chosen profession? Trump is (rightly) decried for having publicly sort-of supported the invasion of Iraq, but being right about that war wasn’t his job.

Trump’s an asshole and a bigot, but he seems to be good at what he does. Elizabeth Warren is a smart person but she tried to go head-to-head with him on Twitter and failed abysmally. I’m not optimistic about his presidency, but sexism is only responsible for putting Trump in the White House insofar as it was perversely responsible for Hillary being nominated.

How Do We Stop Doing This?

It’s easy to point out the failings of Hillary’s campaign in retrospect. She nearly won despite all of it. The lack of a clear or coherent message. Poor strategy. The weak VP choice. Lousy slogan (“I’m with her”). This should have been easy: the country is in good shape, it’s in far better shape than it was 4 or 8 years ago. Its signal policy is at least an equivocal success. The outgoing president is popular. What. The. Fuck?

The fact that 2012 was even close (despite Romney being a solid candidate) points to a hard truth: the Democrats fucked up Obamacare. They created a barely functional healthcare plan because they figured it would get bipartisan support even when they didn’t need bipartisan support, and ended up with something that barely worked, couldn’t be explained, couldn’t be sold, and then rolled it out slowly and incompetently. And this led to their being annihilated in the mid-terms, which meant little of consequence could be done for the remaining six years.

Remember how exceptional Obama is? He’s been a pretty good, successful president despite Obamacare, not because of it.

The solution is to think of laws as products that have to be sold. Clearly, legislators understand this superficially: it’s why a law enabling a police state is named the “PATRIOT Act”, and why a healthcare law that costs poor people premiums they can’t afford for lousy coverage is called the “Affordable Care Act”. But good products are more than simply clever names (and legislators aren’t even that good at names…). Here’s a hint: if you design a product where the main reason for many people to buy it is that they will be fined if they do not, then you have failed. Design a new product.