I don’t have and haven’t used or even touched a VisionPro, but I am working on building software for it so I can build software with it. I’ve been thinking about augmented reality as a leap forward in human-computer interaction since at least 1986, when I described how people interacted with computers in my science fiction setting, ForeScene. (And VisionPro is a step on the way to what I called Cursor/Recog.)

I’ve read the Verge and Daring Fireball reviews. CNET and Tom’s Hardware seem more enthusiastic. All of them basically say that (1) it’s incredibly impressive and (2) it’s not there yet. So the question is, where is it?

Here were my major concerns—as a potential early adopter—having seen the demos but not touched or used the device.

  • I get eye strain after using the Quest 3 or Quest 2 for less than an hour. On the other hand, I was able to use the PSVR headset (which had far inferior graphics) for hours on end while playing Skyrim VR… to the point where it was my legs that gave out.
  • Interacting with stuff using hand tracking on the Quest 3 is abominable. Using the “laser-pointer” style controls on the Quest 3 is pretty bad.
  • How well does the AR side of things work? E.g. are there APIs for geo-positioning user interface elements (ideally using a combination of spatial analysis, GPS, and any other available data, the way car AIs figure out where they are)?

It seems to me that the reviews I’ve seen answer these questions quite effectively.

  • Eye strain—not an issue.
  • Interacting with stuff—seems like it’s better than using the Quest 3’s controllers, while still requiring you to look at the thing you’re trying to interact with. This suggests that dedicated XR apps will need to surface UI elements contextually (e.g. next to the object of interest), unlike on a Mac, where the menu or palette can be “off to the side”.
  • How well does the AR side of things work? Barely. E.g. the Safari WebXR stuff is still behind flags (a quick feature probe, sketched just after this list, will tell you how much is actually exposed). Here’s the problem: when you’re building something like this in stealth mode, and most of the prototypes were presumably awkward, delicate, and tethered to power, you can’t exactly take it out and test it in the world. You can’t even secretly carry it with you and use it in the bathroom the way you could an iPhone prototype. My guess is that the dev team has been thinking about this, building out the tech platform, and putting off the detail work until they can be out in the world with the devices.
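
If you want to see how much of this is actually exposed to web developers, a quick feature probe is enough. Here’s a minimal TypeScript sketch; the probeImmersiveAR function and the probe-xr button are placeholders of mine, while the feature strings (“immersive-ar”, “anchors”, “hit-test”, “local-floor”) are standard WebXR descriptors. On a browser where WebXR is still behind flags, expect it to report unsupported.

```typescript
// A quick WebXR feature probe. The function name and button id are
// placeholders; the feature strings are standard WebXR descriptors.
async function probeImmersiveAR(): Promise<void> {
  // navigator.xr is absent when WebXR is unavailable or behind a flag.
  const xr = (navigator as any).xr;
  if (!xr) {
    console.log("WebXR is not exposed (unavailable or behind a flag).");
    return;
  }

  const arSupported: boolean = await xr.isSessionSupported("immersive-ar");
  console.log(`immersive-ar supported: ${arSupported}`);
  if (!arSupported) return;

  // Session creation must happen inside a user gesture, hence the button.
  // Optional features the browser can't provide are silently dropped.
  const session = await xr.requestSession("immersive-ar", {
    optionalFeatures: ["anchors", "hit-test", "local-floor"],
  });
  console.log("Got an immersive-ar session.");
  await session.end();
}

// Hypothetical trigger: <button id="probe-xr">Probe XR</button>
document.getElementById("probe-xr")?.addEventListener("click", () => {
  void probeImmersiveAR();
});
```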

This last point is crucial to me and may mean I skip the first generation VisionPro. If I want to experiment with geo-positioning content and UI affordances, it doesn’t seem like the VisionPro is any further along than the Quest 3, and perhaps in some ways behind it. No-one cares about this right now, but it’s going to be crucial. And to get anything like this to work right now, I need strong AR and XR functionality more than I need the best displays.

The Verge’s Nilay Patel asks a bunch of questions I hadn’t even thought of:

  • Do you want a computer you can’t use in the dark? Frankly, using a laptop in bed in the dark isn’t a major use case. If you’re alone, why not turn on the light? If you aren’t, just get out of bed.
  • Do you want a computer that messes up your hair and makeup every time you put it on? I’m bald and don’t wear makeup. This is certainly a question I hadn’t considered and it’s definitely going to be an issue for some.

An iPad that runs Mac Apps for your Face

John Gruber points out that the VisionPro is an iPad for your face, not a Mac for your face. On the one hand, this means it has a huge library of software that already works, but it also means that it can only subsume the Mac (at least as far as Apple is concerned) when and if the iPad subsumes the Mac. Xcode won’t run on the VisionPro any time soon. On the plus side, iPadOS is already a “de-crufted” macOS for large-screen devices.

But, like the iPad, the VisionPro has a text input problem (without a tethered keyboard). Interestingly, the iPad could also be the solution to the VisionPro’s text input and pointing issues with some simple software additions.

Mac software is still designed for users with a keyboard and mouse or trackpad. However good Vision Pro’s eye-tracking is, it’s still a long way behind the precision of the Mac pointer.

The original 44px minimum size for tap targets on the iPhone has effectively increased as devices with physically smaller logical pixels, like the iPad Mini and numerous Android devices, have appeared. Google now recommends 48px as a minimum target size, which is what I personally use, and both Safari and Chrome will ding you for smaller tap targets in their built-in audit tools.
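
To make that concrete, here’s a rough TypeScript sketch of the kind of check those audit tools run against a page. It isn’t Lighthouse’s (or Safari’s) actual implementation; the 48px threshold, the selector list, and the findSmallTapTargets name are all my own simplifications.

```typescript
// A rough sketch of a tap-target audit; not Lighthouse's actual check.
// The 48px threshold and the selector list are my own assumptions.
const MIN_TARGET_PX = 48;

function findSmallTapTargets(root: Document = document): HTMLElement[] {
  const candidates = root.querySelectorAll<HTMLElement>(
    "a, button, input, select, textarea, [role='button']"
  );
  const tooSmall: HTMLElement[] = [];
  candidates.forEach((el) => {
    const rect = el.getBoundingClientRect();
    // Hidden elements report a zero-sized rect; skip them.
    if (rect.width === 0 && rect.height === 0) return;
    if (rect.width < MIN_TARGET_PX || rect.height < MIN_TARGET_PX) {
      tooSmall.push(el);
    }
  });
  return tooSmall;
}

// Example: log offenders with their measured sizes.
findSmallTapTargets().forEach((el) => {
  const { width, height } = el.getBoundingClientRect();
  console.warn(`Tap target below ${MIN_TARGET_PX}px:`, el, `${width}×${height}`);
});
```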

I’ve always hoped we were heading to a time when a single device running iPadOS replaced the Mac, and could run macOS in a “compatibility box”. VisionPro is basically that device, which also explains why iPadOS was split off from iOS. Oh yeah, and the “compatibility box” is just a Mac running in a virtual display.

I wonder if Apple will end up selling headless portable Macs.

Where VisionPro needs to go…

Right now, Apple sells the following devices with screens, roughly speaking:

  • Watch
  • iPhone
  • iPad
  • MacBook
  • Mac with external displays
  • AppleTV
  • Vision Pro

Tablets are a natural companion for the VisionPro. Drawing via hand-tracking isn't really ever going to be viable, and tablets are acceptable in social settings where a headset is never going to be.

So basically, I want/need a bunch of key capabilities from the devices I use.

  • Wearable: can be worn pretty much all the time—e.g. while swimming—doesn’t look embarrassing, and doesn’t need constant charging, etc.
  • Connected: has a fast, shareable cellular connection. Everything can use WiFi, so that doesn’t count.
  • Text Input: I can write this blog post on it without getting frustrated.
  • Precise Pointing: I can work on 3d models or 2d graphics on it.
  • Drawing: I can sketch a design or diagram on it as easily as with pencil and paper.
  • Social: I can use it in a meeting or while having a conversation with friends.
  • Immersive: I can play a serious game or watch a movie on it.
  • Compute: compile software, edit photos, model, animate, and render in 3d.

[Capability matrix: the ⌚️ Watch, 📱 iPhone, iPad, reMarkable, 💻 MacBook, 🖥 Mac, AppleTV, and 🕶 VisionPro rated against Wearable, Connected, Text Input, Precise Pointing, Drawing, Social, Immersive, and Compute.]
What I need and where I can get it. Asterisk indicates that a peripheral is required.

To qualify as wearable, in my opinion, the VisionPro needs to get under 700g including onboard power good for several hours. (DC charging is available pretty much everywhere, so battery life isn’t the real problem; the big issue right now is that the headset is 600-650g, the external battery is another 353g, and you need to shut down to switch batteries.) Whether or not wearing a VisionPro in public looks silly is not just up to Apple. The hardware needs to improve, but people will either accept it or not. We’ll see.

Apple seems to have pretty much solved the display problem, but the input problem is only half solved and the social problem is a new frontier.

Meanwhile, as with Apple’s entire product range—except perhaps the Watch—connected is simply something Apple can choose to add to a product. We could have connected MacBooks today if Apple wanted to sell them to us, but I suspect they’re afraid it would cannibalize iPhone/iPad sales.

As you can see from my matrix, the Watch and iPhone/iPad fill the gaps in the VisionPro’s set of capabilities, assuming that Apple provides better seamless support for using your iPhone or iPad as a keyboard and/or trackpad for your VisionPro.

So, if it’s not there yet, where is it?

Shiny vs. Useful is my favorite 2D comparison—with thanks to Suck.com

The way I look at it is that Apple has released serious products (i.e. not the iPod Hi-Fi or the 20th Anniversary Mac) that have sat at various places along the useful dimension, while being pretty consistently very, very shiny. Note that this is how shiny and useful each product was at launch, not how it is today.

Apple's new products have almost always been shiny. Useful is another question…

Now, useful is doing a lot of heavy lifting here. I’m not talking about whether the device works, or its overall level of functionality. I’m talking about ecosystem, whether it solves problems people actually have, whether it does the job it is hired to do, etc. Sure, iPod Firewire is rated as more useful, but its job was playing music, and it did it. If we didn’t know that Apple would launch the iTunes Store (which was completely mind-blowing as a business maneuver) and support USB immediately afterwards, it would probably be a lot further to the left.

If you look at these products, and accept my assessments, then the two products that have failed—the QuickTake and the Newton—are at the bottom of the useful spectrum.

The Newton was potentially awesome, and definitely showed the future of computing, but consider the tech landscape when it was created. No-one had internet access outside of universities, government, and research institutions; the web was in prototype form; cellular modems were barely a thing. The Newton could “dial phones” by playing tones! There was never a connected Newton device. Inconsistent handwriting recognition was fixed with software (MP110) and hardware (MP120). In today’s world, the software fix would have been a patch one month after release.

Similarly, the QuickTake was potentially awesome, but it took pretty crummy photos, and most people’s PCs couldn’t actually display them even if you could somehow transfer them to your computer. There were no SD cards, and most people had 16-color, monochrome, or black-and-white displays. And Apple didn’t own the core technology.

Edit: upon reflection, I think I’ve been a bit harsh on the VisionPro. It probably deserves to be more like 70% of the way towards Useful. Say, more useful than the Watch or iPod Firewire, but not quite up there with the Mac. Here’s a revised chart…

With this adjustment, the VisionPro is actually better placed than the Watch or iPod Firewire, both of which quickly evolved into smash hit products. On the other hand, it’s not as good as the Mac, which was a smash hit in terms of influence, but didn’t make Apple a huge amount of money. If you, like me, remember the late 80s, you’ll know that virtually every personal computer after the Mac came with some kind of graphical UI, usually mouse-driven, and they were all utterly awful until Microsoft Windows 95.

But Apple today is a far better-positioned company than the Apple of 1985. With VisionOS, it looks to have a technical lead over the competition that dwarfs its lead in multi-touch or WIMP interface tech, and the technologies VisionOS will need to interoperate with are either friendly to Apple or effectively dominated by it.

If VisionPro isn’t quite ready, will it ever be?

Until I get a VisionPro, I'm using a Quest 3 to understand VR, XR, and AR

Bottom line, the VisionPro needs to sort out its UI issues (which seem pretty minor and easily solved), it needs to shed 300g (a 700g device with no cord tether would be no harder to wear than a 650g device with a cord), and it needs to fill the whole requirements spectrum with as few other devices as possible—in other words, reduce the number of things a person needs to carry and keep charged.

It took five years for the iPhone 2G to evolve into the iPhone 5. Compared to the former, the iPhone 5 weighed 112g vs 135g, had two 1.3GHz cores vs one 412MHz core, got roughly 24h of battery vs 6h, and pushed way, way more pixels. But by 2012 it was clear that all phones were soon going to be iPhones or iPhone knock-offs.

The VisionPro is already competitive with Apple’s other non-pro platforms in terms of performance. It hardly needs any more pixels. Shaving 30% off the weight seems doable, given that we do not (and should not) expect the same crazy level of performance or battery-life improvement.

Based on my armchair analysis, the VisionPro looks pretty well placed to succeed. Exactly how well a product does over the next five years isn’t necessarily reflective of its utility at launch. The iPad went from “it’s just a big iPhone” to something most parents with the means to afford one considered compulsory within three or four years, and a big part of that was that it replaced backseat DVD systems that were just as expensive, generally worse, and useless for anything else.

And, just as with the iPod Firewire, for a certain subset of users it’s perfectly fine as it is right now. And those users are people who work for big companies that fly them around business class.

The VisionPro costs $3900 with 1TB of storage. A non-stop business-class flight from Sydney to SF costs ~$11,000 return. A similar flight from NY to London can be had for $3400. If you work for a big company and you have to travel trans-ocean for work, you are probably entitled to business class. If I had a choice between switching to Economy once or twice and buying a Vision Pro with all the conceivable trimmings…

Similarly, if you work at one of the Big Tech companies or somewhere similar, you probably need to use a crummy privacy screen over your laptop to work on planes or in airports. Imagine being able to use giant screens in complete privacy. Imagine if it’s literally the same setup you use at work and at home.

For such users, the Vision Pro is just an obvious buy today. And that’s a much bigger and generally wealthier group than the people who knew how to rip CDs and had Firewire Macs.

Open Questions

  • The outward-facing display (EyeSight) seems like it’s not great. How much lighter and smaller would these devices be without it, or with something simpler?
  • Is Apple determined to keep the Mac as its primary platform? Now that Vision Pro is out, presumably everyone at Apple will get to use one and the goal will be to make it do everything. But iPads have been out for 14 years and we still can’t develop apps on them.

What Next?

  • Personas come out of beta. I wonder if Apple will ship this or abandon it.
  • VisionOS 2. Hopefully announced, and perhaps beta, at WWDC.
  • VisionPro 2. Teased at WWDC?
  • iOS and iPadOS support. Might we see a keyboard + trackpad function on the iPad?
  • Mac support. It would be nice to see more display options (vs. a single virtual retina 27″ display).

Anyway, this article is a work-in-progress. I’ll update it over time.