Apple today announced a set of new accessibility features coming to its latest software across devices. Although aimed mainly at people with disabilities, they could improve everyday use for everyone. These include live captions that work even when the device isn't connected to the internet, the ability to mirror your Apple Watch screen to an iPhone, and detection of nearby doors. All of these are expected to arrive with the upcoming iOS 16.
iOS 16 Accessibility Features
Apple is frequently praised by both critics and users for its range of accessibility features. Expanding this set, Apple today announced several new features that should make everyday tasks easier. One of them is offline Live Captions support.
With this feature, Apple enables live captions for ongoing calls, videos, and most apps, letting deaf and hard-of-hearing users follow the content by reading live transcriptions. The feature is polished enough to attribute captions to the respective speakers in a group FaceTime call.
Moreover, Mac users can have the device read aloud what they just typed, letting users with low vision check their text before sending it. However, this feature is only available on iPhone 11 or newer, iPads with the A12 Bionic chip or later, and Macs with the M1 chip.
This is because live transcription in offline mode requires a powerful processor, and Apple's in-house chips are well suited to the task.
Aside from this, AirPlay support is coming to Apple Watch Series 6 and newer models to stream the Watch screen to a paired iPhone, letting users view and control Watch features from the phone. There is also a Door Detection feature, which uses the LiDAR sensor in supported iPhones and iPads to gauge how far away a door is, tell whether it's open or closed, and read any text written on it.
Though Apple didn't mention a specific launch date for these features, we can expect them to arrive with the upcoming iOS 16 update.