Apple Vision Pro Features Virtual Typing, Hand Gesture And Eye Movement UI Navigation
Apple’s Vision Pro AR/VR headset was finally announced at the WWDC 2023 event on June 5, answering many of the questions that we’d had during the years of rumors and leaks.
One of those questions was about how the headset would be controlled and whether there would be any handheld controllers.
Now that Apple has announced the headset, we know that, unlike other headsets offering AR and VR capabilities, the Vision Pro won’t require handheld controllers at all. Instead, it’s all about using hand gestures and your eyes.
Apple’s demonstrations showed that its headset will use the way the wearer’s eyes focus on an item as a way to select it, while cameras will track hand movement and allow people to simply interact with the interface using gestures. Pinching two fingers together will simulate a click, for example, while a similar gesture can be used to expand the size of a window.
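For developers, this gaze-plus-pinch model maps onto ordinary tap handling: in SwiftUI on visionOS, looking at a view and pinching is delivered as a standard tap event. The sketch below is illustrative only; the view name and label text are made up for the example.

```swift
import SwiftUI

// Minimal visionOS sketch: the system translates the wearer's
// gaze-plus-pinch into a standard tap, so an ordinary tap handler
// responds to the headset's gestures without any controller code.
struct GestureDemoView: View {
    @State private var selected = false

    var body: some View {
        Text(selected ? "Selected" : "Look at me, then pinch")
            .padding()
            // Fires when the wearer focuses on this view and pinches
            // two fingers together; visionOS reports it as a tap.
            .onTapGesture {
                selected.toggle()
            }
    }
}
```

Because the system handles eye tracking and gesture recognition itself, existing tap-based interfaces can pick up the new input model with little or no change.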
There is even support for typing in thin air, and those who have tested the headset have already spoken about how quickly they became accustomed to the new interface.
That bodes well, and it will be interesting to see how the wider public gets on with the headset once it is released in early 2024. For those who want physical input, there is also support for connecting an iPhone or a keyboard if required.