Last June, Apple unveiled its first major new product line in years: the Apple Vision Pro. This combination of virtual reality and augmented reality is Apple's first attempt at making "spatial computing" attainable for normal people, or at least for normies with a lot of disposable income. It costs $3,499 and requires a hefty battery with a wire dangling down to your belt, so this isn't really something your grandpa is going to embrace.
It's been about eight months since we were introduced to the concept, and now the proper release date is just three days away. Is anyone actually going to drop this kind of money for a first-gen product? Let's check the reviews first.
The magic here is how you interface with the Vision Pro
Will the Vision Pro be the first step toward mixed-reality spatial computing as we'll know it from now on? Maybe. What really makes the Vision Pro seem futuristic isn't the display or the apps, it's the input. Eyes and hands. Other headsets have eye tracking and hand tracking, but none have the combination working as smoothly, subtly and intuitively as the Vision Pro.
The first few times you use hand and eye tracking on the Vision Pro, it's awe-inspiring: it feels like a superpower. The Vision Pro's external cameras just need to see your hands for it to work, and they can see your hands in a pretty large zone around your body. You can have them slung across the back of the couch, resting in your lap, up in the air with your elbows on a table, pretty much anywhere the cameras can see them. It actually takes a minute to realize you don't have to gesture out in front of you with your hands in the air, and once you figure it out, it's pretty fun to watch other people instinctively reach their hands up the first time they try the Vision Pro.
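If you're curious what that "look, then pinch" model means on the software side, here's a rough, hypothetical SwiftUI sketch (the view and its labels are ours, not anything from the reviews or Apple's sample code). On visionOS, a standard button already picks up a gaze highlight, and an indirect pinch arrives as an ordinary tap, so an app doesn't track your eyes or hands itself:

```swift
import SwiftUI

// Minimal, hypothetical sketch of a visionOS window whose controls respond
// to the system's look-and-pinch input. The app never sees raw eye or hand
// data: looking at the button highlights it, and a pinch is delivered as a
// normal tap.
struct StoveTimerPanel: View {
    @State private var isRunning = false

    var body: some View {
        VStack(spacing: 16) {
            Text(isRunning ? "Timer running" : "Timer stopped")
                .font(.title2)

            // Gaze supplies the target; a pinch anywhere the external
            // cameras can see your hand fires the action.
            Button(isRunning ? "Stop" : "Start") {
                isRunning.toggle()
            }
        }
        .padding(32)
        .glassBackgroundEffect() // standard visionOS window material
    }
}
```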
The way Apple tries to keep users connected while using big honkin' goggles is kind of off-putting
The first innovation is EyeSight, which lets others see a digital version of your eyes when they try to engage you while you're wearing the Vision Pro. So while you're watching content and someone walks into the room, they might see a shimmer of light on the front display; as they get closer and start to chat with you, you'll see them break through into your view and they'll see your eyes. It's a subtle effect and a bit creepy looking, but it's effective. Still, there's no substitute for, you know, briefly taking off the headset.
FaceTime works well. You see a clear video of the person you're calling on a screen in front of you. But they don't see you. Or, not the real you. They see a 3D-rendered version of you called a digital Persona. It's still in beta, and mine looked like a much older version of me. My colleague thought I looked like an 80-year-old man. My wife laughed.
It can feel like you're paying a premium price to be a beta tester, but it is undeniably cool
Apple's headset has all the characteristics of a first-generation product: It's big and heavy, its battery life sucks, there are few great apps and it can be buggy. And come on, have you seen what this thing thinks I look like?
Yet so much of what the Vision Pro can do feels sci-fi. I'm flicking apps all over my home office. I've got multiple virtual timers hovering over my stove. I'm watching holograms of my kid petting a llama. It's the best mixed-reality headset I've ever tried, way more advanced than its only real competition, the far cheaper Meta Quest Pro and Quest 3.
TL;DR
There is so much technology in this thing that feels like magic when it works and frustrates you completely when it doesn't.
So my bottom line on the Vision Pro is that it's definitely revolutionary, but it's a revolution very much in progress.
The Apple Vision Pro launches in the US on February 2, 2024.
[Image: Apple]