How to Control the Apple Vision Pro with Your Eyes and Hands


The Apple Vision Pro is controlled with eye tracking and hand gestures, but how exactly does that control work?

In a WWDC23 developer talk, Apple design engineers explained how your eyes and hands work together to control visionOS on Vision Pro.

Use your eyes to make selections and expand menus.

In visionOS, your eyes are the targeting system, just like moving a mouse or hovering your finger over a touchscreen.

User interface elements respond when you look at them, indicating that they are targeted. Looking at the menu bar expands it, and looking at the microphone icon immediately triggers voice input.
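For developers, visionOS surfaces this gaze targeting as hover effects rather than raw eye data, so apps never learn exactly where you're looking. A minimal SwiftUI sketch of the idea; the control names and labels are illustrative:

```swift
import SwiftUI

struct GazeTargetedControls: View {
    var body: some View {
        VStack(spacing: 12) {
            // Standard controls like Button highlight automatically
            // when the user's eyes rest on them.
            Button("Play") { /* confirmed with a pinch */ }

            // Custom views opt in to the gaze-driven hover effect.
            Text("Custom control")
                .padding()
                .background(.thinMaterial, in: Capsule())
                .hoverEffect(.highlight)
        }
    }
}
```

Keeping eye data out of apps is a deliberate privacy choice: the system draws the highlight itself, and the app only hears about the element once you pinch to select it.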

Pinching your index finger and thumb together is the equivalent of clicking a mouse or tapping a touchscreen. The "click" is sent to wherever your eyes are looking.
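Because the system delivers that look-and-pinch "click" to apps as an ordinary tap, existing tap handlers work unchanged. A minimal sketch; the counter is purely illustrative:

```swift
import SwiftUI

struct PinchToClick: View {
    @State private var count = 0

    var body: some View {
        // Looking at this text and pinching index finger and thumb
        // fires the same handler as a mouse click or touchscreen tap.
        Text("Clicked \(count) times")
            .onTapGesture { count += 1 }
    }
}
```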

Click, scroll, zoom and rotate.

Clicking is the obvious part, but how do you perform other essential tasks, such as scrolling?

To scroll, you pinch your fingers together and flick your wrist up or down. To zoom in or out, you pinch with both hands and pull them apart or push them together. To rotate something, you do the same, but move your hands up and down relative to each other.

All of these gestures are guided by where you are looking, so you can precisely control the user interface without having to hold your hand in the air or use a laser pointer controller.
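In SwiftUI on visionOS, these manipulations map onto standard gesture types: a ScrollView picks up the pinch-and-flick scroll for free, while MagnifyGesture and RotateGesture3D cover the two-handed zoom and rotate. A rough sketch, assuming a visionOS target; the "Globe" asset name is a placeholder:

```swift
import SwiftUI
import RealityKit
import Spatial

struct ZoomAndRotate: View {
    @State private var scale: CGFloat = 1.0
    @State private var rotation: Rotation3D = .identity

    var body: some View {
        // "Globe" is a placeholder .usdz asset name.
        Model3D(named: "Globe")
            .scaleEffect(scale)
            .rotation3DEffect(rotation)
            .gesture(
                // Pinch with both hands, then pull apart or push together.
                MagnifyGesture().onChanged { value in
                    scale = value.magnification
                }
            )
            .simultaneousGesture(
                // Pinch with both hands and move them relative to each other.
                RotateGesture3D().onChanged { value in
                    rotation = value.rotation
                }
            )
    }
}
```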

Zoom in and out of images using eye tracking and gestures.

It's a very intuitive way to interact with AR/VR, and Vision Pro combines the TrueDepth hand-tracking sensor suite with precise eye tracking to make it possible. In theory, Quest Pro could have taken a similar approach, but its camera-based hand tracking may not be reliable enough, and Meta likely didn't want to spend time on an interaction system that couldn't work on Quest 2 and Quest 3, which both lack eye tracking.

Those who tried Vision Pro reported that this combination of eye selection and hand gestures made the interaction system more intuitive than on any headset they'd tried before. As we put it in our hands-on: "It's great to see Apple do it so right. It truly embodies Apple's 'it just works' philosophy."

However, some tasks are better accomplished by using your hands directly. For example, you can type on visionOS by tapping a virtual keyboard. Other scenarios in which Apple recommends "direct touch" include inspecting and manipulating small 3D objects and recreating real-world interactions.
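On the RealityKit side, an app can even declare which input style an entity accepts. A sketch, assuming a visionOS RealityView, that marks a small cube as direct-touch only via InputTargetComponent and lets a fingertip drag it; the cube size and the force-unwrapped parent are illustrative shortcuts:

```swift
import SwiftUI
import RealityKit

struct DirectTouchCube: View {
    var body: some View {
        RealityView { content in
            // A small cube the user manipulates with their hands.
            let cube = ModelEntity(mesh: .generateBox(size: 0.08))
            cube.generateCollisionShapes(recursive: true)

            // Accept only direct hand input, not gaze-and-pinch.
            cube.components.set(
                InputTargetComponent(allowedInputTypes: .direct)
            )
            content.add(cube)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Follow the fingertip as it drags the cube.
                    value.entity.position = value.convert(
                        value.location3D,
                        from: .local,
                        to: value.entity.parent!
                    )
                }
        )
    }
}
```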

Meta is also moving the Quest system software toward direct touch. But Meta uses it for all interactions, mimicking a touchscreen, rather than reserving it for specific tasks. That approach doesn't require eye tracking, but it does require holding your hand in the air, which can get tiring over time.
