In her childhood bedroom, Maxine Collard had a PC connected to a cathode-ray tube monitor so massive it bowed her desk into a smile that grew deeper every year. Collard has oculocutaneous albinism, which means that her hair is naturally bleach white, her complexion maximally fair, and she has uncorrectably low visual acuity with limited depth perception. In order to see the screen, she had to crane her neck until her face was two inches from the monitor.
When Collard was in middle school, her mother bought an iMac for the family. Collard spent hours messing around on the new machine, her nose pressed almost to the glass. One day, deep in the computer’s accessibility settings, she discovered that if she held down the control key while spinning the mouse’s scroll wheel, she could instantaneously zoom the entire screen to whatever magnification level she wanted. There was a rudimentary magnifier app on her Windows computer, but she found the interface difficult to use, and the low-res image on the zoomed-in PC screen, she said, was pixelated, hard to read, “disgusting.” Her experience on the iMac, which allowed her to magnify the entire screen into a much clearer image, came as a revelation.
Earlier this year, Collard had a similar aha moment when she tried the Apple Vision Pro for the first time. Some critics of the AVP were skeptical of a device that pressed two high-resolution micro-OLED screens within millimeters of one’s eyes for hours at a time. But to Collard, the ability to (as she put it) “strap an iPad to my face” was instantly appealing.
Collard is now in her sixth year of a combined M.D.-Ph.D. program in neuroscience at UC San Francisco. When I visited her at her lab in late May, she showed me her workspace: a standing cubicle in a small hive of carrels she shared with her colleagues, a pair of 27-inch monitors on her desk. Zooming her entire screen has its liabilities in a social setting like this: One day she was reading her DMs on Slack, magnified so much that the words were two inches tall. A co-worker sent her a spicy message, something she would have preferred to keep private, or at least in 11-point type, but instead it was broadcast for all her colleagues to see.
Once she got an AVP, she gained unprecedented control over her visual environment. She took her lab’s Slack channels, enlarged them to the size of a refrigerator, and set them off to her right. Then she opened her code editor and set it in front of her — inches from her eyes, like usual, but five times the size of her external monitors — and her posture was ramrod straight. No more craning. Finally, she opened a browser window, stretched it to the size of a door frame, loaded the documentation for a tricky data-analysis function she could never remember, and set it off to her left.
Collard has strabismus — her eyes don’t align the way typical eyes do — which confuses most eye-tracking algorithms. But in the AVP’s accessibility menu, she turned on single-eye tracking, so the device follows just one eye rather than getting confused by two that point in different directions. The device can also lessen the effects of her nystagmus — involuntary eye “wiggles” that have confounded eye-tracking devices she’s used in the past.
The AVP has a range of accessibility features for other disabilities as well. Blind users can turn on VoiceOver, a screen reader that speaks text aloud, and navigate through apps with a custom set of hand gestures. People with mobility disabilities can make selections through a variety of alternative methods: with their voice; with a switch or joystick (easier for some users with motor disabilities); or with a feature called Dwell Control, which lets a user make a selection simply by “dwelling” their gaze on an item. With Sound Actions, a user can make a selection with a custom noise (like a cluck or a pop). In lieu of eye gaze, the pointer can be controlled with one’s head, wrist, or finger, and most of the accessibility features users are familiar with from other Apple products — reduced motion, color filters for color-blindness, and hearing-device support — are included.
Because of her reliance on large monitors, Collard could never comfortably join her colleagues to debug code in a coffee shop or in the shared kitchen one level down from their sixth-floor lab. That’s all changed with the AVP. “As a disabled person,” she wrote in a blog post, “the ability to finally sit back with my feet up on a bench out in the sun while working on my laptop — or more accurately, while working on a 30-foot-wide 4K screen floating in exactly the perfect ergonomic position, one that I can reposition anywhere I want it to be in any moment — is the answer to decades of prayers to the accessibility gods.”
Mission Bay was warm and breezy in late May, and Collard led me down to Koret Quad, where she now loved to sit and work. Inside her headset, a code editor the size of a garden shed floated above the grass. As she worked, she saw the window begin to shimmer and a shadowy figure troubled the lines of code. Then a man, smiling and looking right at her, strode through the window of her workspace and stopped. This sort of thing happens to Collard whenever she takes her AVP out in public — she has caught numerous people taking surreptitious selfies with her in the frame.
“Hi there,” she said preemptively to the smiling man, who was clearly drawn by the novelty of seeing an Apple Vision Pro en plein air. He looked at Collard and said, in a lightly mocking tone, “How’s that working out for you?”
Collard has struggled with her identity as a disabled person, resisting alien-seeming assistive tools like the monocular lenses that low-vision specialists tried to get her to use in school. But she sees the AVP as a liberatory device, and no arch tech skeptic on the quad could dampen that feeling. She fixed him with the digital eyes of her avatar, displayed on the headset’s outward-facing screen, and answered him with emphatic cheer: “Really great, in fact!”
This excerpt is from a New York Magazine article; the full piece is available from the original publication.