by Daniel Parker, RDPFS Contributor:
A previous issue of this Bulletin covered how Apple’s iOS 18 and other upcoming releases will bring significantly more artificial intelligence (AI) functionality to the iPhone, iPad, Mac, and other devices. This follow-up explores three non-AI accessibility improvements in iOS 18, as tested in the public beta.

The first is the ability to use Personal Voice with the VoiceOver screen reader. This feature, originally developed for people with speech loss, allows the device to create a voice that sounds like the user. You read 150 phrases aloud, and the device processes the recordings, usually overnight. Apple estimates the recording time at 15 minutes, though it may take longer. Be sure to turn VoiceOver speech off before recording each phrase so the screen reader’s voice is not captured along with your own.

The second feature is the ability to use VoiceOver’s virtual Braille Screen Input not only to type on your phone, but also to navigate it, replacing the traditional gestures. The new feature, called Command Mode, mirrors the longstanding ability to navigate the iPhone entirely with a braille display. This is accomplished with a set of chord commands, combinations of dots typed simultaneously with the space bar; Command Mode on the screen works as if the user were holding down the space bar on a braille display. For example, typing dot 3 moves the VoiceOver focus to the previous item, and dot 6 moves it to the next, the equivalent of swiping left and right with one finger, respectively. To double tap, type dots 3-6. The full list of commands is available here.

Finally, VoiceOver on the iPhone now has a tutorial, similar to the one that has been available on the Mac for years. It covers the most common gestures, including single and double taps, the “magic tap”, rotor gestures, and the three- and four-finger gestures that control speech output, the Screen Curtain, and VoiceOver help.

More information on these features will be available once iOS 18 is officially released in September.