(CNN) – It’s rare to find a new technology that feels groundbreaking. But last night, as I sat on a couch in a private demo room on Apple’s campus wearing its recently announced Vision Pro mixed reality headset, I felt like I’d seen the future — or at least an early, very expensive prototype.
In the 30-minute demo, a virtual butterfly landed on my finger, a dinosaur with detailed scales tried to bite me, and I stood inches from Alicia Keys’ piano as she serenaded me in a recording studio. When a tiny bear cub swam beside me in a calm lake during another immersive video, it felt so real that it reminded me of an experience with a loved one who recently passed away. I couldn’t wipe away the tears inside my headset.
Apple unveiled a mixed-reality headset, its most ambitious and risky new hardware offering in years, at a developer event that same day. The headset combines virtual reality and augmented reality, a technology that overlays virtual images on a live view of the real world. At the event, Apple CEO Tim Cook touted Vision Pro as a “revolutionary product” with the potential to change how users interact with technology, each other and the world around them. He called it “the first product that you look through, not at.”
But it’s clearly a work in progress. Apps and experiences remain limited; users must remain tethered to an iPhone-sized battery pack with just two hours of charge life; and the first few minutes using the device can be daunting. Apple also plans to charge $3,499 for the device when it goes on sale early next year — more than previously rumored and far more than other headsets on the market that have previously struggled to gain wide adoption.
With its loyal following and impressive track record in hardware, Apple can convince developers, early adopters and some enterprise customers to pay for the device. But if it wants to appeal to a more mainstream audience, it will need a “killer app,” as the industry often calls it — or several.
Based on the testing I was able to do, Apple still has a long way to go, but it’s off to a convincing start.
Hours after the main event, I arrived at a building on Apple’s sprawling campus in Cupertino, California, built specifically to host demonstrations and briefings for the new glasses.
I was greeted by an Apple employee who scanned my face to help customize the fit of the headset. Then I entered a small room where an optometrist asked me if I wore glasses or corrective lenses. I had Lasik surgery years ago, but others around me had their glasses scanned so the headset could feature their specific prescription. It’s an incredible feat that sets Apple apart from the competition and ensures that no frames need to be inserted into the headset. But it’s unclear how they plan to handle this process at scale if millions buy the device.
The initial setup process was somewhat unpleasant: I felt a bit nauseous and claustrophobic as I adjusted to the device. It tracked my eyes, scanned my hands, and mapped the room to best accommodate the augmented reality experience.
But Apple has taken steps to reduce the motion sickness that has plagued other headsets. Vision Pro uses a custom R1 chip that cuts down on the latency that can cause nausea in similar products.
As many viewers were quick to point out on Monday, the Apple headset itself looks like a pair of designer ski goggles. It features a soft, adjustable strap at the top, a “digital crown” on the right side — a larger version of the one you’d find on an Apple Watch — and another button at the top that serves as a sort of home button. There’s also a cable that connects to a power bank.
The glasses themselves felt light enough at first, but even with Apple’s considerable design chops, I never quite got over the idea that I had a computer on my face. Thankfully, unlike some other computing products, the glasses stayed cool on my face throughout the experience, thanks in large part to a quiet fan and the airflow that runs through the system.
Unlike other headsets, Apple’s new mixed reality hardware also displays its users’ eyes on the outside, so “you’re never isolated from the people around you — you can see them and they can see you,” Alan Dye, vice president of human interface, said during the keynote address.
Unfortunately, I never got to see what my own or anyone else’s eyes looked like through the headset during the demo.
After putting the device on, I was greeted with an iOS-like interface. I could easily move in and out of apps, such as Messages, FaceTime, Safari, and Photos, using only my eye movements and tapping my thumb and index finger together to act as the “select” button. This was more intuitive than expected, and worked even when my hands were resting on my lap.
Some app experiences were better than others, though. It was lovely to see images in the Photos app laid out before me in a larger-than-life way, but it’s hard to imagine feeling the need to do this often on a couch at home. Vision Pro also offers a spatial photography option, which lets users view images and videos in 3D so you feel like you’re right in the scene. Again, cool but unnecessary.
During another demo, an Apple employee wearing a Vision Pro headset FaceTimed me from across campus. Her “persona” — a digital representation that didn’t show her wearing the Vision Pro — appeared in front of me as we talked about the event earlier in the day. She looked real, but it was clear she wasn’t; she was some sort of pseudo-human. (Apple didn’t scan my face to create my own persona, which would otherwise be done during the setup process.)
The Apple employee then shared a virtual whiteboard — dragging, dropping and highlighting interior design images. Cook has focused on AR’s potential to foster collaboration, and it’s clear how this tool could be used in meetings to deliver on that promise. What’s less clear is why most employers would spend $3,499 per employee to make this happen rather than just use Zoom.
Like so much else about the product unveiling, this launch felt ill-timed. Early in the pandemic, more people might have jumped at the chance to create these virtual experiences as we worked and socialized almost entirely from home. Now, with more employees back in the office and companies looking to cut costs amid broader economic uncertainty, the rationale for this expensive device seemed less clear.
The real magic of the Vision Pro, however, is in the immersive videos. Watching an underwater scene from Avatar 2 in 3D, for example, was surreal and seemingly placed me right in the ocean with these fictional creatures. It’s easy to imagine Hollywood filmmakers embracing the chance to create experiences just for the headset.
Apple is also uniquely positioned here to power the device with these experiences. It has close relationships in the entertainment industry, including with former Apple board member and Disney CEO Bob Iger, who announced in a pre-recorded video during the event that Disney+ will be available on the headset. Apple also previewed new National Geographic, Marvel, and ESPN experiences for the headset.
Nearly every new Apple product, from the iPhone to the Apple Watch, promises to use different-sized displays to change the way we live, work, and interact with the world. Vision Pro has the potential to do all of that in an even more striking way. But unlike the first time I picked up an iPhone or smartwatch, after 30 minutes of using Vision Pro, I was pretty happy to ditch the glasses and get back out into the real world.