One of the most innovative features of Apple Vision Pro is its gesture-based control system, which lets users interact with digital content using their hands, eyes and voice. This means you don't need any external device to browse apps, watch movies, take photos and more.
Apple Vision Pro: hands and eyes are enough for control
Apple's presentation at WWDC 2023 and first impressions from reviewers showed that the Vision Pro's new controls work. To select an option on the display, just look at the desired item and then tap your fingers together. To scroll through a web page or list, just make a quick flick with your hand. To dictate text or use Siri, simply look at the microphone button and speak.
Gestures are recognized by an array of cameras and sensors on the headset, which track the position and movement of the hands and eyes. Gestures can also be customized to suit user preferences.
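For developers, this interaction model comes largely for free: on visionOS, standard SwiftUI controls respond to the look-and-pinch selection gesture and scroll views respond to the hand flick without any extra gesture code. The sketch below is a minimal, hypothetical example (the view name and photo list are made up for illustration) of how an ordinary SwiftUI screen picks up these controls.

```swift
import SwiftUI

// Minimal visionOS sketch: standard controls already respond to the
// look-and-pinch gesture, and ScrollView responds to the hand flick.
struct GalleryView: View {
    // Hypothetical sample data, for illustration only.
    let photos = ["Sunset", "Mountains", "Beach"]
    @State private var selection: String?

    var body: some View {
        ScrollView {                        // scrolls with a quick hand flick
            VStack(spacing: 16) {
                ForEach(photos, id: \.self) { photo in
                    // Activated by looking at the button and tapping
                    // the fingers together; no custom gesture handling needed.
                    Button(photo) {
                        selection = photo
                    }
                }
            }
        }
        .navigationTitle(selection ?? "Photos")
    }
}
```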
Early reviewers said the Vision Pro's control system takes some practice to get used to, but that once learned it feels intuitive and natural. The app interface is also similar to that of the iPhone and iPad, so anyone who has already used those devices will feel at home.
To type text, you can use the virtual keyboard that appears on the display, a Bluetooth keyboard, or a connected iPhone. Alternatively, you can use voice dictation, which works well even in noisy environments.
Vision Pro aims to revolutionize how augmented reality is used, without the need for extra accessories. If the technology succeeds, it could change the way people work, play and communicate, although that probably won't happen with this first product, which is aimed primarily at developers.