Xreal’s booth was pretty busy when I got there, and I had an appointment for their spatial demo. Using the Xreal Air 2 Ultras, I was able to manipulate various items on a large projected display with my fingers.
One of the demos had me choosing between a few videos, and then the rep expanded the scene into a large display with 3D effects happening around and in front of it. I picked The Last Jedi, and 3D-rendered Star Wars ships flew around the screen. In front of and around the screen, little flecks of salt drifted through 3D space, as if I were standing on Crait.
Most of the actions I performed were done by looking at the item I wanted and pinching with my left hand to select it; that seems to be the standard way to select things. But one demonstration made use of not only two hands but depth sensing as well.
A panel was presented to me for creating an AI character, with buttons on it. Rather than doing a pinch gesture to select one, I reached over and virtually pressed a button with my finger. The glasses sensed the distance my hand traveled and pushed the button in my virtual environment. Once a character was created, I could pinch with both hands and pull them apart to make the character bigger, or bring my hands together to shrink it. It reminded me of the hand gestures you can do on a Quest 3, but here we’re doing it all in the smaller form factor of the Air 2 Ultras. Pretty nifty!
But what I was really interested in seeing were the Xreal One Pros. I haven’t picked up any new Xreal glasses, as the only pair I purchased and reviewed was the original Nreal Air, but the feature list of the Xreal One Pro seems really impressive.
The big thing for me wasn’t the newer Sony micro-OLED panels, although the picture quality on these was impressive. It was the 57-degree FOV, which doesn’t sound like much of an increase over the Air, but in practice it is. It’s getting really close to the point where I’m not annoyed by the virtual screen being cut off when I move my head, and I’m definitely more interested in anchoring the screen in place and moving my head around to view it. I might still prefer smooth follow over anchoring, but the larger FOV makes it a much tougher choice for me.
With the onboard X1 chip, the Xreal One line is capable of many functions that previously required an external product like the Beam Pro. You can now adjust the size of the image and use 3DoF features with any connected device. At the booth, I used a Steam Deck with the Xreal One Pro and was delighted to find I could grow or shrink the screen, anchor it in space, or have it follow me slowly as I turned my head. That’s such a nice set of quality-of-life improvements, and it’s all built into the glasses.
The X1 chip also helps reduce the blurriness that happens when you turn your head quickly. I was told the chip has gotten the response time down to about 3ms, which is really solid. Turning your head no longer produces as jarring an effect from a blurry screen, and it’s a much more comfortable experience with the help of the chip.
I can’t remember whether the new optics engine reduced the reflections that can happen in the bottom part of birdbath glasses, but I do really love the larger FOV it provides. It’s a nice step up from where most AR glasses are right now, and based on my limited time at the booth, I think the Xreal One Pro succeeds in creating a better visual experience with the new optics.
As someone who hasn’t experienced many Xreal products since their initial offering, I was really impressed with the Xreal One Pros. They seem like a solid next-generation product in the line of AR glasses, and this could be my next daily driver for wearable media consumption. I’ve always thought Xreal had good hardware; the Xreal One Pro shows that, and I’m pretty excited for it to come out in March.