On May 17, Apple announced a set of accessibility features aimed at helping users with disabilities. Apple says the new features, coming to iPhone, Apple Watch and Mac later this year, will combine hardware, software and machine learning advances to assist people with vision, hearing, physical and motor disabilities. The features include Door Detection for iPhone and iPad, Apple Watch Mirroring and Live Captions. Apple has also announced updates to VoiceOver with more than 20 additional languages and locales.
Door Detection is one of the most useful accessibility features in the latest updates, helping blind and low-vision users locate a door when they arrive at a destination. The company said the feature uses a combination of the LiDAR sensor on the latest iPhone and iPad models, the camera and on-device machine learning to help users understand how far away a door is and to describe the door's attributes.
The feature can tell users whether a door is open or closed and whether it can be opened by pushing, turning a knob or pulling a handle. Apple claims it can also read signs and symbols around the door, such as room numbers, and identify the presence of an accessible entrance sign.
Door Detection works with iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max, iPad Pro 11-inch (2020 and 2021) and iPad Pro 12.9-inch (2020 and 2021), and will be available through the pre-installed Magnifier app.
The Magnifier app will gain a new Detection Mode to provide access to Door Detection. It will also house People Detection and Image Descriptions, two existing features that can work alone or simultaneously with Door Detection to help users who are blind or have low vision.
Along with the updates inside Magnifier, VoiceOver users navigating with Apple Maps will receive sound and haptic feedback to help identify the starting point of walking directions, the company announced.
The Apple Watch is also gaining Apple Watch Mirroring, which lets users remotely control the smartwatch from their paired iPhone. The feature will allow users to control the Apple Watch using the iPhone's assistive technologies, including Voice Control and Switch Control, with inputs such as voice commands, sound actions, head tracking and external Made for iPhone switches. All of this is aimed at helping people with physical and motor disabilities.
Apple says Apple Watch Mirroring uses hardware and software integration, including enhancements built on AirPlay, to give these users access to features including Blood Oxygen, heart rate tracking and the Mindfulness app. The mirroring feature will work on Apple Watch Series 6 and later models.
Apple Watch users will also get double-pinch gesture support, which lets them answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause or resume a workout — all with a double pinch. The gesture works with AssistiveTouch on the Apple Watch.
Apple has also announced Live Captions for users who are deaf or hard of hearing on iPhone, iPad and Mac. The feature will be available in beta in English later this year in the US and Canada, on iPhone 11 and later, iPad models with A12 Bionic and later, and Macs with Apple silicon.
Live Captions will work with any audio content, including phone and FaceTime calls, video conferencing and social media apps, and streaming media, and even with in-person conversations with someone nearby, the company said.
Users can adjust the font size for ease of reading. In FaceTime, the auto-transcribed dialogue is attributed to individual call participants, making group video calls more convenient for deaf and hard-of-hearing users.
On the Mac, Live Captions will come with the option to type a response and have it spoken aloud in real time to others in the conversation, Apple said. The company also says captions will be generated on-device, preserving user privacy and security.
Apple also introduced additional accessibility features this week to celebrate Global Accessibility Awareness Day. These include Siri Pause Time, which lets users adjust how long the voice assistant waits before responding to a request; Buddy Controller, which lets users ask a care provider or friend to help them play a game; and Sound Recognition customization, which Apple says can be tuned to recognize sounds specific to a person's environment, such as their home's unique doorbell or appliances.
Apple also offers SignTime, which connects Apple Store and Apple Support customers with on-demand sign language interpreters. SignTime is already available to customers in the US using American Sign Language (ASL), in the UK using British Sign Language (BSL) and in France using French Sign Language (LSF). In addition, Apple Store locations around the world are offering live sessions throughout the week to help customers discover the accessibility features of the iPhone, and Apple's social media channels will spotlight related content, the company said.