Analyzing the Apple Vision Pro as an Accessibility Engineer


Apple's accessibility menu in visionOS

February 2, 2024 will go down in history - it just isn't clear yet whether that's because of a disruptive paradigm shift in computing or because of one of the most expensive failed products of all time.

Before I get into my experience using the Apple Vision Pro, I thought it would be worth briefly describing my experience actually acquiring the device. As a seasoned first-day iPhone buyer, I can say that this experience was completely different in almost every way.

Whenever I pre-order a new Apple product, I place my order the moment the online store opens. Usually, as long as I move quickly through checkout, I have no trouble grabbing a midday pickup slot at my local Apple Store in Maryland. This time, though, after the extra steps required to get the right-sized bands for the Apple Vision Pro, my payment failed three times because the midday pickup slot I had selected was no longer available.

To my annoyance, my fourth attempt landed me a 7 p.m. appointment - still launch day, at least. Fast forward to February 2: when I arrived at the Apple Store and turned the corner, I expected to see a crowd outside waiting for free demos. There was no one; in fact, I had never seen the store so empty. As an employee walked me to a table to give me a personal demo and a quick orientation on the device, I asked whether it had been like this all day. From his comments, it seemed it had: they had run only a handful of demos and sold fewer than ten units.

Now I know things would be very different if I lived in Manhattan or San Francisco, but I do live in a city, so what I saw is probably more representative of a wider range of people than the scene on Fifth Avenue in New York.

Let's talk about accessibility

I came across a quote from James Cameron describing his demo with Apple's Vision Pro as "a religious experience."

I couldn't agree more. This device is so immersive it will almost transport you to a world where you don't have to sell a kidney to pay for it. All kidding aside, this really is an incredible device that offers a very fluid user experience. The way you interact with the interface feels fresh and exciting, yet familiar and easy to understand at the same time. That said, it's obviously a first generation device.

There are a lot of inefficiencies and issues, and this is typically when accessibility gets overlooked in favor of other, more compelling features. But true to Apple's style, that's not the case with the Vision Pro. It packs a wealth of accessibility features that are clearly designed to address many of the issues that can be problematic for people with disabilities. When I started using this device, I realized how critical it was that they included these features from the start. I am mildly visually impaired, and the only assistive feature I have ever used on my phone is slightly larger text. But from the moment I put on the Vision Pro, I realized I was going to need more help.

The image I saw through the Apple Vision Pro looked like what a fully sighted person would see if they put on someone else's glasses: the clarity of images and text was low, and everything was just blurry enough to be difficult. As someone who had high hopes because I've been using cameras and digital displays to enhance my vision, I have to admit that, at least for now, I'm going to have to live with this blurriness. Then again, I'm used to the world being blurry, so this experience has actually left me better equipped to deal with it than the average person who, say, spends an entire day wearing a friend's glasses.

Part of my visual impairment is a condition called nystagmus, which is characterized by rapid, involuntary eye movements. I saw a blind creator on TikTok who also has nystagmus talk about how worried he was about the Apple Vision Pro's eye tracking, so he immediately switched to a setting that lets the cursor be controlled by head tracking instead. My nystagmus isn't too pronounced, so I decided to leave the device as-is and see whether it would work for me - and to my shock, it did, at least most of the time. I did notice that the cursor jumped when I tried to focus on smaller or more distant "buttons," so I went into the settings and turned on a secondary setting, Ignore Rapid Eye Movements, which seems to have been created just for me. Unfortunately, it didn't change my experience much, and the jumping cursor remained. I plan to try a number of settings and combinations to get the most out of the Apple Vision Pro with my vision, but I thought it was worth sharing what I experienced out of the box and what I managed with my current vision based on the best information available at the time.

Nuts and bolts

Visually impaired or not, as a trained engineer I would be remiss if I didn't mention something else I noticed: this thing is very easy to drop.

It's impossible to tell from the initial product photos or demos, but the headset is divided into five parts: the device itself, the strap, the eye piece, the light seal, and the battery. The eye piece and light seal attach to the device with magnets only. That would be fine if the magnets were strong enough to support the device, but they are not. This means that if you pick the device up the way you naturally want to (by the eye piece), the parts separate and the device nearly hits the ground. When I made this sad mistake during my Apple Store demo, you could tell from the looks on the employees' faces that I wasn't the first. Getting it positioned correctly is a much more challenging accessibility issue. The device must sit in exactly the right spot on your face for everything to align properly with your eyes; when you first put it on, it automatically calibrates to your pupil spacing, but that only works if the headset is seated precisely.

Because my eyes don't always look straight ahead - one of them sometimes turns in toward my nose, a bit like a lazy eye - this is especially challenging for me. And that's only alignment on one axis: the headset also has to sit at exactly the right vertical position on your head, or you will again lose image clarity. I couldn't use the stylish strap they show in the marketing. The Vision Pro comes with a second strap in the box that provides extra support over the top of your head and can be cinched snugly around it. I'm not sure what about the shape of my face makes me need this extra support, but without it, the device sags.

Entering text is a nightmare - I've seen a lot of people talk about this, and I'll go into one of the reasons in more depth below, but typing in mid-air on the Apple Vision Pro is a disaster. A friend of mine spent his first 15 minutes trying to enter a password on my device. Those of you who are visually impaired will understand: it's about as accurate as typing with your nose on an Apple Watch. I know we all do it. (Editor's note: verified.)

Very uncomfortable for the first 10 minutes - You adjust to the extra head weight fairly quickly, especially when using the strap that adds vertical support. That feels more natural, since your body is used to holding your head upright all day, but the first few minutes are noticeably uncomfortable.

I can't drink coffee - the cup bumps into the device, which makes drinking from it impossible.

Higher-level challenges

For completeness, it's worth digging into some of the bigger challenges the Vision Pro faces:

Our brains aren't designed to always be staring at what we're doing - and that's probably the biggest fundamental flaw in the whole Apple Vision Pro experience. Because your eyes act as the cursor, you must always keep them on whatever you're doing. This isn't natural, especially when typing text: having to stare at each letter as you press it so that it registers is exactly the kind of behavior that would earn you a failing grade in a (21st-century) typing class.

Does this actually solve a problem? - While the Vision Pro is undoubtedly the coolest piece of technology I've ever interacted with, there's still the question of whether it solves a big enough problem, or offers significant enough upgrades, to really matter. I think the device could bring a certain ease and fluidity to collaborative meetings across time zones, but only if it's unobtrusive enough that I don't end up with lines pressed into my face from overuse.

A call to action

As I conclude my exploration of Apple's Vision Pro, the journey has been enlightening from an accessibility standpoint.

The device not only demonstrates Apple's innovative spirit, but also underscores its commitment to inclusivity by making the technology accessible to people with disabilities from the start. That commitment is a beacon of hope, pointing to a future where technology adapts to the needs of the user rather than the other way around. I have similar goals when it comes to exploring technology, which is why I founded my company, ReBokeh. When I went into college, the only options for visually impaired students were a giant digital camera/magnifier or an audio recording device. I realized then how far these technologies had fallen behind and how much room there was for improvement. We are exploring a future where new technologies can not only be used by people with disabilities, but can add value to our life experiences.

When it comes to the Apple Vision Pro, I see real potential for ReBokeh's visual enhancement tool, which is currently available for iPhone and iPad and lets users adjust the appearance of the real world to their needs. The technology is now used and loved by people in over 100 countries around the world.
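To give a rough sense of what this kind of real-time visual enhancement involves on iPhone and iPad, here is a minimal Swift sketch of a live camera feed run through an adjustable contrast and brightness filter. The class name, the choice of the CIColorControls filter, and the parameter values are illustrative assumptions, not ReBokeh's actual pipeline; a real app would also handle camera permissions, device rotation, and far richer enhancement options.

```swift
import AVFoundation
import CoreImage
import UIKit

// Hypothetical sketch: a live-camera "vision enhancement" view that boosts
// contrast and brightness on every frame. Requires NSCameraUsageDescription
// in Info.plist. Filter choices and values are illustrative only.
final class EnhancedCameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let ciContext = CIContext()
    private let previewLayer = CALayer()

    // User-adjustable enhancement parameters (assumed example values).
    var contrast: CGFloat = 1.4
    var brightness: CGFloat = 0.05

    override func viewDidLoad() {
        super.viewDidLoad()
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        configureSession()
    }

    private func configureSession() {
        session.sessionPreset = .high
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        // Start the capture session off the main thread.
        DispatchQueue.global(qos: .userInitiated).async { self.session.startRunning() }
    }

    // Apply a simple contrast/brightness boost to each frame and display it.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let frame = CIImage(cvPixelBuffer: pixelBuffer)

        let filter = CIFilter(name: "CIColorControls")!
        filter.setValue(frame, forKey: kCIInputImageKey)
        filter.setValue(contrast, forKey: kCIInputContrastKey)
        filter.setValue(brightness, forKey: kCIInputBrightnessKey)

        guard let enhanced = filter.outputImage,
              let cgImage = ciContext.createCGImage(enhanced, from: enhanced.extent) else { return }

        DispatchQueue.main.async {
            self.previewLayer.contents = cgImage
        }
    }
}
```

Nothing in this sketch is exotic - which is exactly the point: the hard part on visionOS isn't the image processing, it's getting access to the camera frames at all.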

There's just one problem. Like Meta before it, Apple has yet to grant developers access to the built-in cameras. While that decision is understandable from a privacy and security standpoint, it hampers the growth of companies like ours, and it limits consumers to only the visual enhancements Apple sees fit to provide. In the spirit of competition - or perhaps just fair play - I urge Apple to consider granting camera access to companies that can present a reasonable use case and agree to abide by privacy and security standards, so that we can keep extending the value of the incredible hardware they've built while continuing to make life a little easier for people with disabilities.

Rebecca Rosenberg is an accessibility engineer and the founder of ReBokeh, a startup dedicated to "enhancing vision, not suppressing it." You can find the iOS version of ReBokeh on the Apple App Store. On visionOS, ReBokeh is currently only usable through the virtual camera.

Source: UploadVR
