Apple Vision Pro Supports Virtual Typing, Navigation Through Hand Gestures and Eye Movements
Rumors about Apple’s Vision Pro headset have been circulating for years, but it was never quite clear how the device would be controlled. Early rumors pointed to dedicated control devices, but those faded as leaks increasingly focused on gesture control, and as it turns out, that’s the route Apple took.
The Apple Vision Pro uses hand gestures, eye movements, and spoken commands for navigation. You can target an item on the display by looking at it, for example, and then select it by tapping your fingers together. Scrolling is done with a quick flick, and only small movements are required. Looking at the microphone button in a search field and then speaking dictates text, and Siri voice commands can be used to open and close apps, play songs, and more.
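For developers, this look-and-pinch model maps onto ordinary input handling rather than anything gaze-specific. The sketch below is a minimal, hypothetical SwiftUI example assuming visionOS conventions; the view name and layout are illustrative and not drawn from the article. Standard controls such as `Button` respond to the indirect pinch automatically, while custom views can observe the same tap with a `SpatialTapGesture`.

```swift
import SwiftUI

// Minimal sketch (assumption: SwiftUI on visionOS; names are illustrative).
// Standard controls respond to look-and-pinch automatically; custom views
// can capture the same indirect tap with SpatialTapGesture.
struct GazeAndPinchDemo: View {
    @State private var tapCount = 0

    var body: some View {
        VStack(spacing: 24) {
            Text("Pinches: \(tapCount)")

            // A plain Button is selected by looking at it and tapping fingers together.
            Button("Select") {
                tapCount += 1
            }

            // A custom view highlights while the user looks at it and
            // handles the pinch as an ordinary tap gesture.
            RoundedRectangle(cornerRadius: 16)
                .frame(width: 200, height: 100)
                .hoverEffect()
                .gesture(
                    SpatialTapGesture()
                        .onEnded { _ in tapCount += 1 }
                )
        }
        .padding()
    }
}
```

Because the system resolves where the user is looking and delivers the pinch as a normal tap event, existing SwiftUI controls generally work without any gaze-aware code.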
Reviewers have said that the Vision Pro's navigation experience takes some getting used to, so expect an adjustment period. Most other headsets on the market rely on handheld controllers, so the gesture-based control system will be unfamiliar to almost everyone.
Typing can be done with a connected iPhone or Bluetooth keyboard, but there is also a virtual keyboard, and dictation works as an alternative.
The interface won’t be easy for many to get used to, but on the plus side, the app layout and navigation will be immediately recognizable to anyone who has used an iPhone or iPad. Apps are arranged in a “Home View” that’s similar to the Home Screen, so not everything will be unfamiliar.