Apple introduces new accessibility features for iPhone, iPad and Mac

iOS, iPadOS and macOS already offer numerous accessibility features that help people with disabilities use the iPhone, iPad and Mac. In addition to these, Apple has now presented a whole collection of new aids aimed at users with cognitive, visual, auditory or motor impairments. Tools were also presented for people who cannot speak or who may be about to lose their ability to speak. The newly introduced accessibility features for iPhone, iPad and Mac run directly on the devices and are enhanced by on-device machine learning.

Apple has introduced numerous new accessibility features for iPhone, iPad and Mac. Here you will find details about the new functions as well as information about the accompanying support offerings across various Apple services.

The new accessibility features for iPhone, iPad and Mac

"At Apple, we've always believed that the best technology is technology that's made for everyone"Says Tim Cook, Apple's CEO. "Today we're excited to announce new features that build on our long history of making technology accessible so everyone has the power to create, communicate and do what they love“, you can in the press release read on the topic. This lists the individual new functions and tools, including explanations and screenshots. These are the innovations:

  • Assistive Access
  • Live Speech
  • Personal Voice
  • Point and Speak for the Magnifier app's detection mode
  • Pairing MFi hearing aids with the Mac
  • New phonetic functions for Voice Control
  • Switch Control
  • Easier adjustment of the text size
  • Turning off animations for people who are sensitive to fast movements
  • VoiceOver voice feedback now sounds natural even at higher speeds
  • "Remember This" as a shortcut for a visual journal in the Notes app
  • And more (see press release)

"Accessibility is part of everything we do at Apple' said Sarah Herrlinger, senior director of global accessibility policy and initiatives at Apple. "These groundbreaking features have been developed with feedback from communities of people with disabilities to support a wide range of users and help people connect in new ways."

Assistive Access: reducing apps to the essentials

The new Assistive Access feature is intended for people with cognitive disabilities. It reduces selected apps and their content to their essential functions. As an example, Apple showed the Photos app: its icon and name sit at the top of the display, below that the photos are arranged in a large grid, and a large back button leads to the folder overview. There are also large tiles for the Camera, Calls, Music and Messages apps.

These are Live Speech and Personal Voice

Live Speech is a function that converts typed text into speech on Mac, iPhone and iPad. It allows people with speech impairments, or without the ability to speak, to let their device speak for them during phone and FaceTime calls. Frequently used phrases can be saved so they do not have to be typed again and again, which also enables quick responses in fast-moving group conversations.
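Under the hood, such a feature relies on on-device speech synthesis. As a rough illustration of the underlying concept, not Apple's actual Live Speech implementation, here is a minimal Swift sketch using the long-available public AVFoundation speech API; the PhraseSpeaker class and its saved phrases are hypothetical names for this example:

```swift
import AVFoundation

// Minimal sketch: on-device text-to-speech with saved phrases.
// NOTE: This is not Apple's Live Speech implementation, just the
// public AVFoundation speech synthesis API it conceptually resembles.
final class PhraseSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    // Frequently used phrases, saved so they need not be retyped.
    var savedPhrases = ["Thank you!", "I'll call you back."]

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Usage:
// let speaker = PhraseSpeaker()
// speaker.speak(speaker.savedPhrases[0])
```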

Personal Voice, on the other hand, is a speech function trained with your own voice. It allows you to preserve your own voice in case the ability to speak might be lost, for example due to a condition such as ALS (amyotrophic lateral sclerosis). On iPhone and iPad, speaking predefined text prompts for 15 minutes should be enough to create the Personal Voice through on-device machine learning.

"At the end of the day, the most important thing is being able to communicate with friends and family' said Philip Green, board member and ALS advocate at nonprofit Team Gleason, who has experienced significant changes in his voice since his ALS diagnosis in 2018. "Being able to tell them you love them in a voice that sounds like yours makes all the difference — and being able to create your artificial voice in just 15 minutes on iPhone is extraordinary."

Point and Speak for the Magnifier app's detection mode

The Magnifier app on the iPhone already offers a number of detection modes. With Point and Speak, these are extended so that recognized text is selected by pointing at it and read aloud by the device. If you hold the iPhone in front of a microwave oven with labeled buttons, for example, the individual elements and their labels are recognized; if you then place a finger on a button, its label is read out. Here the camera, the LiDAR scanner and on-device machine learning work together.
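Apple does not detail how Point and Speak is implemented. As a hedged sketch of the kind of on-device text recognition it presumably builds on, here is a minimal example using the public Vision framework; the recognizeText function is a hypothetical name, and the real feature additionally fuses LiDAR depth data and pointing detection:

```swift
import CoreGraphics
import Foundation
import Vision

// Minimal sketch: on-device text recognition with the Vision framework.
// NOTE: This is not Apple's Point and Speak pipeline; the real feature
// also uses the LiDAR scanner and finger-pointing detection.
func recognizeText(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate string for each detected text region.
        let strings = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(strings)
    }
    request.recognitionLevel = .accurate // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```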

New Mac accessibility features and more information

In its press release on the subject, Apple lists even more new accessibility features across the various devices. On the Mac, for example, there is pairing with MFi-certified hearing aids. System sounds, calls, music, films, series, podcasts and more can be streamed directly from the Mac to the hearing devices.

Voice Control will also be expanded in the future to include a word picker for terms that sound phonetically similar, so users who enter text with their voice do not have to over-enunciate. As examples, Apple uses English terms in the press release, such as "do", "due" and "dew". A screenshot also shows similar-sounding words like "peak", "peaked", "peace", "pea", "beak", "pick" and so on.

It will also become easier to adjust the font size in some of Apple's own apps via System Settings and the Accessibility section there. For users with visual impairments, larger text can be set for Calendar, Finder, Mail, Messages and Notes, and the text enlargement can be configured per app.
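For context, apps pick up such text size changes through Dynamic Type. Here is a minimal Swift sketch of how an app honors the user's preferred text size; the new per-app control itself is a system setting, not a public API:

```swift
import UIKit

// Minimal sketch: supporting Dynamic Type so a label tracks the
// user's chosen text size (including per-app adjustments).
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
// Resize automatically when the user changes the text size setting.
label.adjustsFontForContentSizeCategory = true
```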

Timetable mentioned in the press release

The fact that Apple uses English terms for the new accessibility features even in the German press release, along with English-language screenshots, suggests that the integration of these new functions will still take some time. They may not be rolled out until WWDC23 in June; a precise schedule is not given. It is also possible that the new functions will only appear in autumn 2023 with iOS 17, iPadOS 17 and macOS 14. Apple could be announcing them now to free up time at the WWDC keynote for other content, for example the mixed reality headset and the new xrOS.

Despite the lack of information on the release of the new accessibility tools, the announcement does contain one concrete date: the new accessibility efforts under iOS, iPadOS and macOS are accompanied by a framework program spanning Apple's services. Starting tomorrow, May 18, 2023, sign language support for the Apple Store and Apple Support will launch in Germany, Italy, Spain and South Korea. There will also be information sessions about the new functions in selected Apple Stores worldwide. The Podcasts, Apple TV, Books and Music apps will get collections and playlists on selected topics as well as supporting additional content and functions.

"This week in Apple Fitness + integrates trainer Jamie-Ray Hartshorne ASL and highlights all the features available to users as part of the ongoing effort to make fitness more accessible for all. Features include audio cues that provide additional short, descriptive verbal cues to assist blind or low-vision users [...]", it is also said about Apple's training offer. All further information, additional accessibility functions and more events can be found in the linked Apple press release.

Did you like the article and did the instructions on the blog help you? Then I would be happy if you supported the blog via a Steady membership.
