Apple Will Be Your Eyes, Ears, and Voice With New Accessibility Features – Review Geek


Hannah Stryker / Review Geek

Apple users with disabilities are getting some good news this week. In a press release, the company announced a slew of upcoming accessibility features that can serve as users' eyes, ears, and voice, empowering them to navigate the world with greater ease and independence.

The forthcoming software covers various accessibility needs, including cognitive, speech, vision, hearing, and mobility. In the press release, Apple stated it worked in deep collaboration with community groups representing a range of disabilities to ensure the features adequately address the challenges those users face and make a tangible impact on their lives.

An iPhone and iPad in Assistive Access mode
Apple

One of the most remarkable new accessibility tools is Assistive Access, which serves users with cognitive disabilities. It leverages innovative design principles to streamline core iPhone and iPad experiences, lightening the cognitive load required to listen to music, make calls, send texts, and capture and view photos.

A woman using Live Speech on an iPhone
Apple

Live Speech is a feature aimed at assisting nonspeaking individuals in phone communication. During calls and video chats, users can type messages into Live Speech that are then read aloud to those on the call. The feature even lets you save commonly used phrases for quick access.


Perhaps the most impressive new tool Apple announced today is Personal Voice. It uses on-device machine learning to create a synthesized voice that sounds like the user. This will be especially valuable for people diagnosed with degenerative diseases such as ALS, allowing them to retain their own voice even after they lose the ability to speak.

Also worth mentioning is Detection Mode in Magnifier, which offers a form of sight to users who are blind or have low vision. Combining the Camera app, the LiDAR Scanner, and on-device machine learning, Magnifier can identify text on physical objects like household appliances and read it aloud, providing a more independent and immersive experience.

All of these features and more will be available on iPhone and iPad later this year.

Source: Apple





