Here's what a quadriplegic thinks of Apple accessibility in iOS 15
Apple has been working hard to extend its accessibility offerings to users with various disabilities, and iOS 15 (along with watchOS 8) brought a new array of features designed to improve those users' experience with Apple tech.
Colin Hughes, a quadriplegic and a vocal advocate for people with disabilities, has long been an avid user of Apple tech big and small, including Apple's HomeKit, which allows him to enter and move through his house simply by talking to Siri.
With iOS 15 publicly released, Hughes has spoken out (as reported by 9to5Mac) about his opinion on the usefulness (or lack thereof) of some of the updated operating system's accessibility features for people who are unable to physically interact with a touch screen:
It is a bit unnerving to have auto-answer calls kick in when the Watch is off my wrist and in a bedroom. It has privacy implications for people who forget to turn off auto answer when off wrist and charging...
I’ve dealt with important Outlook emails, WhatsApp messages, and more besides hands-free, responding and actioning important things promptly, with only what I’m hearing in my ears through the AirPods, and this is really liberating and productive.
Auto-Answer
As you will no doubt know, a significant part of my advocacy over the past three years has been calling on Apple to introduce Auto-Answer calls to the Apple Watch. At last, in watchOS 8, it has arrived!
This brings a level of convenience, security and accessibility that is so important to people like me with severe upper limb disabilities. It was a special moment when I first had the chance to try it.
Turned on the functionality for the first time and received a phone call on my wrist that was clear to me, and the caller, and – as my disability means I can’t touch the Watch screen – I didn’t need to do anything to handle the call effectively.
Hughes' Apple gadget setup is as modern as it gets: an iPhone 13 Pro, an Apple Watch Series 7, and AirPods 3, which he uses daily for productivity and for interacting with the world. As a dedicated user, he is well placed to offer both praise and critique on behalf of other people whose disabilities make touch screens difficult to use, and critique he does offer:
It is beyond ironic that Auto-Answer, a feature designed for people who can’t touch the screen, still requires you to touch the screen to toggle it on and off.
Please can users toggle Auto-Answer on and off by Siri voice commands, “hey Siri, turn on/off Auto-Answer”, and by setting up Siri Shortcuts. For example, turn on Auto-Answer when I leave home, at a certain time, when I get out of bed in the morning, and so on.
Hughes also points out that the Apple Watch should automatically disable Auto-Answer calls when it's off your wrist—the alternative poses a serious privacy issue.
I can’t overstate how massive this has been for me. Every day, when I am out and about in the city, I am answering calls, sometimes really important calls, effortlessly, hands-free with just “Answer”.
Unless I had Auto-Answer on (which unfortunately still requires me to remember to ask my carer to switch it on for me) I was never able to answer calls. This really increases independence for people like me.
One remaining complaint, Hughes adds, is that you can't yet ask Siri to hang up a call for you, but he feels sure that Apple is listening and that more solutions are coming.
Announce Notifications
iOS 15's new Announce Notifications feature lets users who are driving, busy, or otherwise unable to check notifications have them read aloud through their Beats or AirPods wireless earbuds. Hughes finds this a hugely positive change:
Similarly, it has been a joy to have notifications from third-party apps like Facebook Messenger and WhatsApp read out to me while wearing AirPods for the first time with Announce Notifications in iOS 15. As someone who can’t pick up and open my iPhone to read messages and notifications, this new functionality makes me feel really connected like never before.
Speech Recognition and Voice Control need a lot of work
Unfortunately, says Hughes, it is still nearly impossible to dictate long text with Voice Control, and speech recognition has a very long way to go before it effectively improves users' productivity. iOS 15 could have done much better on this front; maybe next time.
Speech recognition on desktops and laptops generally, both Apple and Windows, is in a bad place at the moment. My productivity is hanging by a thread thanks only to Dragon Professional, the Firefox browser, and being able to run Dragon with Parallels on my Mac.
Sadly, Voice Control has hardly improved this year and remains only good for dictating short, (often error strewn), sentences or two: you couldn’t write a 1000 word blog article, run a business, or write a dissertation with its dictation capabilities. It would take you hours of frustration compared to Dragon.
As an Apple user I am looking enviously at what Google is doing on the Pixel 6 at the moment with the Tensor chip. That’s the kind of sophistication I would like to see Apple provide users with severe physical disabilities who rely on speech recognition on the Mac for work, education and keeping in touch.
I believe access to technology and communication is a human right and speech recognition is my only means of access to communicate with the world, and do grown up things that go much further than dictating “happy birthday” with a heart emoji. Disabled people who rely on voice access deserve better than that.
AssistiveTouch leaves much to be desired
Hughes says that AssistiveTouch for Apple Watch (introduced in May) is basically useless for him, and may prove the same for many people with similar disabilities. Despite supposedly being designed for people with mobility difficulties, it simply requires movements that are too fine to trigger:
I am unable to activate it with the limited muscle power in my lower arms and hands. Apparently I just don’t have enough physical movement in my arms and hands to trigger the accessibility feature. It’s made me question who Apple designed this for because on paper it should be tailor-made for people like me, I am not completely paralysed, and have just enough lower arm and hand movement to wake the screen and clench my fist but apparently this is not enough to make use of AssistiveTouch.
I would imagine a lot of people with upper limb disabilities won’t be able to make use of this technology. I am sure there are ways that it could be tweaked and improved to extend access.
iOS 15 did remove some seemingly useful accessibility features for the visually impaired in September, and Hughes hasn't yet commented on whether this has left him worse for wear. In particular, iOS 15 did away with Siri's ability to send e-mails for you (something Hughes uses regularly), and the following Siri commands no longer exist:
- Do I have any voicemails?
- Play my voicemail messages
- Check my call history
- Check my recent calls
- Who called me?
- Send an email
- Send an email to [person]
Do you happen to use any of Apple's accessibility features yourself? If so, how would you say Apple could improve (or has improved) to make your life easier in that regard? We'd love to hear some of your thoughts in the comments below.