I spent the morning with the Apple Vision Pro

I spent about an hour with the $3,500 headset and got to experience firsthand what everyone's been talking about.

"Avatar" arrived in theaters in 2009 and was a technological marvel that gave the audience one of the most immersive cinematic experiences in film history. However, if you believe contemporary online forums, it also led some viewers to feel something entirely unexpected: depression.

Not long after the film's release, CNN reported a strange new phenomenon that some called "Avatar Depression." The film was so engaging that some viewers reported feeling a sense of emptiness when they left the theater and exited the world of Pandora.

With the increasing prevalence of extended reality experiences through headsets from companies like Meta, HTC, and Sony, many have experienced similar phenomena. The more immersive the experience, the more jarring it can be when you finally take off the headset.

After all, at their core, these types of headsets effectively try to trick the brain into believing that what it's seeing is real. That same cognitive dissonance is the cause of the motion sickness some people experience in virtual reality: your body and your brain, in essence, inhabit different realities.

The Vision Pro isn't a virtual reality headset, at least according to Apple. If you go by the company's promotional materials, it's a spatial computing device. In practical terms, it's mixed reality. While most of the applications so far are experienced as augmented reality, thanks to the integrated pass-through technology, the device can also go fully immersive with a quick twist of the Apple Watch-style digital crown on the visor.

This week, Apple is giving select members of the media the chance to try the Vision Pro. I spent some time with the headset today. It was my first hands-on experience with the hardware, as Matthew had the honor when it was unveiled at WWDC over the summer. The idea was to cram in as many facets as possible in about 60 minutes, from the initial face scans to apps, the spatial desktop and movie viewing (no games this time, sadly).

The company has paid a lot of attention to serving both ends of the immersion spectrum with the Vision Pro, from full pass-through to Environments, naturalistic, immersive scenes that feel a bit like stepping into a live photo on an infinite loop. An hour spent exploring various apps probably isn't enough to bring on "Avatar"-level depression (it certainly wasn't in my case), but it does offer a glimpse into a world where such phenomena are a concrete possibility, especially as display resolutions become capable of rendering increasingly lifelike images.

In the case of the Vision Pro, the screens are everything. As cell phones have reached a point where 4K resolutions and 120Hz refresh rates are no longer novelties, headsets have taken their place at the cutting edge. Much of the Vision Pro's ability to do what it does comes down to its pair of micro-OLED displays, which pack in 23 million pixels between them. For context, a 4K TV has roughly 8.3 million pixels, so the headset effectively puts more than a 4K screen's worth of pixels in front of each eye.

Of course, being an Apple product, every aspect of the hardware has been carefully considered. It all starts with the fitting process. Starting on February 2, Apple will have Geniuses available in all of its stores in the United States to walk buyers through the process. The exact nature of the in-store experience has yet to be outlined, but a portion of the store layout will be dedicated to this purpose, rather than trying to fit it all within the confines of the Genius Bar.

Of course, not everyone lives near an Apple Store. So, the company will make the process available through the app as well. In fact, the at-home version is based on the same app that employees will use in-store. The first step is nearly indistinguishable from setting up Face ID on an iPhone. Hold the phone close to your face and then move the phone in a circle as it scans from various angles. You'll do it twice.

From there, the system will determine which components fit your face shape the best. All faces are different, of course. There's a wide range, and getting the wrong fit could significantly impact the experience. We had some trouble with mine (not the first time these words have been spoken). The Light Seal, which magnetically attaches to the visor, is designed to keep ambient light from getting in.

I couldn't get it to fit perfectly. We ultimately ran out of time, and I had to live with light leaking in around the bridge of my nose and my cheekbones. If you've ever had a similar experience with a headset, you know it can be bothersome at first, but your brain eventually adjusts and you forget it's there. There were some darker demos, however, where it became noticeable again.

I've recently read some hands-on reports of discomfort after wearing the hardware for a full hour. Personally, I didn't experience that, but it obviously varies from person to person. To distribute the device's weight of a bit more than a pound more comfortably, Apple includes a couple of straps in the box. There's the Solo Knit Band, the big, padded one you see in all the photos. Apple also includes the Dual Loop, which is narrower and has a second strap that goes over the top of the head.

I wore the latter during the demo, assuming it would do a better job of weight distribution. The straps attach magnetically and feature Velcro for adjustments. And then, of course, there's the battery pack. My guess is that Apple's designers struggled hard to find a way to avoid it.

Ultimately, though, doing so would have meant a significant loss of battery life or a considerable increase in the headset's weight. For better or worse, the world of hardware is full of compromises; there are, after all, limits to physics. In its current state, the battery pack is a bit of a vestigial organ and not particularly elegant. It feels like a first-generation compromise to be addressed in subsequent versions.

The battery's cable is long enough to snake behind you while you're sitting, or the pack can ride along in a pocket. I have no doubt that in the coming months we'll see numerous solutions from third-party accessory makers, like battery belts promising an augmented reality element.

Once you're up and running, though, you'll end up forgetting it's there. This can become a problem in and of itself, if you decide, as I did, to stand up halfway through the demo. I got a slight jolt from the pack when I did. The moral of the story is, if you plan on standing while wearing the headset, find a good spot for the battery.

The user experience is largely gesture-based. You'll do more pinching than an overly enthusiastic St. Patrick's Day reveler in this thing. The secret is a combination of eye tracking and pinching. Look at an icon, and it will pulse subtly. Now you can pinch to select. Pinch your fingers and swipe left or right to scroll. Pinch your fingers on both hands and move them apart to zoom in. There's a bit of a learning curve, but you'll adapt quickly. I believe in you.
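
For a sense of what that interaction model looks like from the developer side (a minimal sketch of my own, not something shown in the demo): on visionOS, the system handles the eye tracking, and a pinch on whatever you're looking at is delivered to a SwiftUI app as an ordinary tap, while the two-handed pinch-and-spread arrives as a magnification gesture.

```swift
import SwiftUI

// Hypothetical illustration of the look-and-pinch model described above.
// The system resolves where you're looking; the app only receives
// familiar gesture events.
struct GestureDemoView: View {
    @State private var scale: CGFloat = 1.0
    @State private var pinchCount = 0

    var body: some View {
        VStack(spacing: 24) {
            Text("Pinches received: \(pinchCount)")

            Image(systemName: "photo")
                .resizable()
                .scaledToFit()
                .frame(width: 200 * scale, height: 200 * scale)
                // Look at the image and pinch: delivered as a tap.
                .onTapGesture {
                    pinchCount += 1
                }
                // Pinch with both hands and pull them apart: delivered
                // as a magnification (zoom) gesture.
                .gesture(
                    MagnifyGesture()
                        .onChanged { value in
                            scale = value.magnification
                        }
                )
        }
        .padding()
    }
}
```

By default, apps don't receive raw gaze data at all; they simply get the resulting taps, swipes and zooms, which is part of why the learning curve stays short.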

Hand tracking is very good here. You don't need to raise your hands (though you probably will instinctively); just make sure they're not obstructed from view. I mostly kept mine resting in my lap.



