The Accessibility menu is where you’ll find the features Apple added to iOS to help people with hearing and vision problems use an iPhone. From this menu you can enable hearing aid compatibility and activate the Sound Recognition feature, which sends an alert when certain sounds, such as a fire or smoke alarm, go off. Other sounds that will trigger a notification include breaking glass, a crying baby, and more.
Other features found under Accessibility include Closed Captions, and there is also a listing for Siri. However, some of the things Siri did to help blind or visually impaired users were removed following the iOS 15 update. Apple’s digital assistant will no longer send emails, and it will also not respond to the following commands:
- Do I have voicemail messages?
- Listen to my voicemail messages
- Consult my call history
- Check my recent calls
- Who called me?
- Send an email
- Send an email to [person]
AppleVis, an online forum for blind and visually impaired iPhone users, recently posted a comment from a user named Brian Negus, who wrote: “I tried on my iPhone SE with iOS 15.0 and Siri’s response was ‘Sorry, I can’t help with that.’ Subsequently, I had the same response on a phone running iOS 12. If this is a deliberate removal of features, it is certainly a loss for some visually impaired users who find that this is a convenient way to send a brief email.”
Negus later revealed that he had heard from Apple Support, which told him, “we are aware of this issue and it is currently under review.” This could indicate that Apple will send out a software update to squash the bug. That certainly makes more sense than Apple randomly deciding to stop letting Siri help people with vision problems.
Interestingly enough, while the first assumption is that the problem started with the recent iOS 15 updates, the same accessibility features that no longer work with Siri in iOS 15 also don’t work with Siri in iOS 14. Siri will still play the last voicemail a user received or read a voicemail message from a specific person, but the digital assistant will no longer read out the list of all available voicemail messages.
Siri received something of an overhaul in iOS 15, with speech processing now done on the device rather than through an Apple server; the Neural Engine can process speech just as well as the off-device servers did. The digital assistant can now handle some requests offline, including setting alarms and timers, messaging, launching an app, controlling audio playback, and opening the Settings app.
Siri can now share onscreen content, including photos, web pages, content from Apple Music or Apple Podcasts, map locations, stories from Apple News, and more. All you have to do is say, “Send this to Giancarlo.” If the content on the screen cannot be shared, Siri will offer to send a screenshot instead. And while Google Assistant already does this, Siri will now maintain context when you ask a follow-up question, making interactions with the digital assistant more of a conversation.
Is it possible that somehow, in the midst of these changes to Siri in iOS 15, some of the accessibility features were accidentally disabled? No matter what caused this, the most important thing is that Apple restores Siri’s lost capabilities so that visually impaired users can continue to use their iPhones.
Apple already has a backlog of fixes it needs to send out, so we could see all of these bundled together in iOS 15.0.1, or added to the iOS 15.1 update that Apple is currently testing, having just released its second Developer Beta.