My Apple Vision Pro Demo Experience
When I was running errands I stopped into an Apple Store and did the Vision Pro demo. I doubt my impressions of a product released weeks ago, and thoroughly reviewed by experienced professionals, are of any interest, but I provide them free of charge for those who have nothing better to read.
As per usual, the flow in the store is really unintuitive. I arrived with my QR code and the greeter sent me to Angela. Angela scanned the QR code and then told me to wait by the MacBook Pros along the wall. That immediate wave of anxiety about being put in an invisible DMV line overcame me, but my wait was really brief. Eric was there to do the demo with me, but Angela was around supporting Eric. Seemed like he was being trained on the process.
Eric handed me an iPhone to do a face scan, and for some reason it didn’t want to read my face when I turned my head to the left unless I moved the iPhone. I’m not sure but I think the backlit store display behind my head might have caused a problem, because every other direction worked both times I did the scan. It did not instill confidence.
Before my appointment I had indicated in the app that I wear eyeglasses, and so I presented them to be scanned. My prescription is not especially strong, and I can see fine without my glasses but experience eyestrain over time, so I might as well make sure things are as sharp as sharp can be for the demo. The machine that scanned my eyes produced a QR code. Angela was standing to the side during this, and mentioned that it was so Apple didn’t save or know my prescription. Which I guess is comforting, but seems kind of surface-level privacy if I buy prescription inserts? Honestly, I’m just in awe of the QR codes to get more QR codes to get more QR codes.
Then back to the stools where we sat. Eric made idle chit chat about what made me want to do the demo, then the device was brought out on a tray with the inserts. The insert pairing didn’t happen until after calibration, which I thought was weird, but whatever.
Part of the reason I didn’t want to do this was because the thought of a shared face-product creeped me out, but the device appeared to be clean, and not like one of those abused demo products tethered to a display at a Best Buy where the only concern is theft. I’ll update this if I get pink eye.
Eric told me, and then demonstrated, how to pick up the device: thumb under the nose bridge, four fingers on the top, and then you flip it. Which… this is just too precious a gesture for a device with this heft, but when I went to pick it up a finger brushed the light seal too hard and it popped off, so I guess those magnets really are as weak as everyone says. We can’t have clips? Tabs? A little space-age velcro? It has to be weaker than those flat, business-card fridge magnets?
I’m not convinced that the face scan picked the right light seal for me (it felt like all the weight was on my brow, and the nose and cheeks kept having light poke through if I raised or furrowed my eyebrows). I tried moving it up and down on my face and tightening the head strap knob. It was never comfortable, but also never felt rock solid on my face. When it was higher on my face things seemed to vibrate because of how insecure the lower portion was; if it was too low on my face it was impossible to see clearly.
The inserts needed to be paired, and there was the calibration stuff, and that was all fine. I had expected the passthrough to be dim, compared to the brightness of the inside of an Apple Store, and it sure was. I didn’t see any video artifacts, but it felt like I had sunglasses on that were tamping down the luminance.
What I thought I was prepared for, and I wasn’t, was the intense tunnel effect. In the video reviews I watched, attempts were made to create an approximation of the binocular-like effect of having one’s field of view constrained, but in my opinion the simulated views have been too generous. What really added to it was the chromatic aberration and how strongly the sharpness fell off from the center. How much of that falloff was from the foveated rendering, and how much was softness from my own vision, or these particular Zeiss inserts, I can’t say. It just wasn’t sharp outside of this cone in the center of my vision.
This was most notable when we got to the part of the demo where Eric had me open Safari and he relayed the scripted response about how sharp and readable the text was, and how it could be used for work … and I did not share these feelings. The text I was looking directly at was clear enough, but three lines up or down was fuzzy, and likewise side to side. I never felt like everything was blurry, but it definitely made my view feel smaller than if I were looking at an equally large real-world billboard of text.
The chromatic aberration was very distracting when watching media, but I can’t say how much of that is my sensitivity to such visual phenomena, or if maybe I would get used to any of it.
The Spatial Videos also bothered me, but I know they’ve had rave reviews from almost everyone. I see the shimmer and sizzle in stuff that doesn’t appear to be matching, and it’s not “ghostly” like I’ve seen some describe. That video where the kids are crouched in the grass and we see the underside of the kid’s sneakers had a glow on the sole of the shoes and the grass. According to Eric that was shot with a Vision Pro, which means matching cameras, so I have no good explanation for the artifacts I saw, just that my eye was drawn there. Without applying a rigorous round of tests, I don’t know if the artifacts I felt like I was seeing were in the recorded content, or maybe something else. My first step would be to compare left and right eye views, but alternating blinks weren’t enough in the demo.
Similarly the video shot on the iPhone 15 had a lot of the expected shimmer in bits and pieces throughout. I know that this is unobjectionable to every other human that’s looked through these lenses, so I’m just cursed to notice it.
Mount Hood was impressive, but that ever-present green dot for the Control Center really ruins the mood. Also, unlike the static panoramas, everything that’s moving has an air of artificiality to it where it seems a little too perfect. Better in appearance than a video game, but the kind of thing that underlies a video game environment, with looping stuff. I wish when I moved my head around that there was something closer to me that would parallax so it felt less like a nodal camera move. No matter how much I moved my head it felt like everything was on a dome and all I was doing was rotating at the center of the dome, never moving from the infinite point of the center.
The Super Mario Bros. movie clip in stereo wasn’t a good showcase of stereo or of the cinema viewing experience. It felt like a very large TV 5 feet away from me. I didn’t feel like I was in a theater, and nothing in the clip wowed me. I’m not sure why this particular movie was selected for the demos, as it doesn’t really shine.
My demo did not include the dinosaur. I don’t know why, and didn’t want to derail the scripted experience by asking. Maybe the dinosaur was down for maintenance?
The sizzle reel of immersive content was as impressive as everyone has said, the sharks in particular. I wonder how realistic it is to invest in shooting material like this just for these headsets, but that’s not my problem, and hopefully not a deterrent to Apple recording more of this.
Eric wrapped the demo and started to tell me how to take it off, but I asked if it would be OK to see some of the other environments beforehand. He graciously let me (no one was waiting for a demo Wednesday morning and we’re well beyond the crush of launch day) so I pulled up Haleakalā, Joshua Tree, and Yosemite.
Haleakalā is visually very impressive, but interestingly didn’t make me feel the way I felt when I was at Haleakalā many years ago. I had a similar experience with the Yosemite one, but that might have been more seasonal. There was snow in parts of Yosemite on my trip there, but it wasn’t snow-covered like this. I just didn’t associate it at all with my memory of the place.
Joshua Tree was the most evocative of the real place, and I’ve been there many times, but I couldn’t really get my bearings. It felt like somewhere in the park, for sure, but one of the things I always think about is the road cutting through the park that’s always very visible (Joshua trees do not have lush, view-blocking canopies). Still, it rang the most true, so if I were to pick one to use a minimalist markdown editor in, it would be Joshua Tree.
I took the Vision Pro off like how I had been instructed to put it on, and Eric asked if I was considering buying one or if I was just looking. He didn’t do it in a car salesman kind of way, but the way in which he needed to know if he had to have someone fetch one from the back. I declined, and he offered me the QR code that had the configuration from my demo in case I ever wanted to come back and do another demo, or buy a Vision Pro later. I accepted this final exchange in the QR code ballet and thanked him once more before I headed out.
It’s a very interesting product, and I’m glad that I experienced it first-hand instead of just speculating endlessly about why it wasn’t for me. I don’t envision myself ever purchasing this iteration of the product, but I don’t think anyone’s a fool for buying one if they have the disposable income. Perhaps if Apple adds just one more step in the Apple Store process that uses a QR code I’ll reconsider.
Category: text