Apple on Tuesday unveiled an expansive slate of new accessibility features coming later this year across its devices, including iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro. The announcement, released ahead of Global Accessibility Awareness Day on May 15 and just weeks before the company's WWDC 2025 event, previews tools expected to ship in iOS 19, macOS 16, and watchOS 12 that are designed to aid users with physical, visual, hearing, and cognitive disabilities.

One of the most significant advancements is Apple's support for brain-computer interfaces (BCIs). A new protocol will allow users with severe mobility disabilities to control their devices using neural signals captured by a brain implant. The protocol is built into Apple's Switch Control framework and comes out of a collaboration with Synchron, a medical device company focused on BCI technology.

Also notable is the debut of Accessibility Nutrition Labels on App Store product pages. The labels will list the accessibility features an app supports, such as VoiceOver, Voice Control, and Larger Text, letting users know whether an app will be accessible to them before they download it. "We're providing more transparency and elevating the work our CVS Health teams do to create great experiences for all consumers," said Eliel Johnson, Vice President of User Experience and Design at CVS Health.
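For developers, the features a label reports correspond to support apps already implement through Apple's accessibility APIs. As a rough illustration only (not Apple's label format or any newly announced API), the SwiftUI sketch below shows the kind of support such a label might surface: a VoiceOver label and hint, plus Dynamic Type so text follows the Larger Text setting. The view name OrderButton and the placeOrder action are hypothetical.

```swift
import SwiftUI

// Minimal sketch of support an Accessibility Nutrition Label might report:
// a VoiceOver label and hint, plus Dynamic Type scaling for Larger Text.
// "OrderButton" and "placeOrder" are hypothetical names for illustration.
struct OrderButton: View {
    var body: some View {
        Button(action: placeOrder) {
            Label("Refill prescription", systemImage: "pills")
        }
        // VoiceOver announces this description instead of the raw symbol name.
        .accessibilityLabel("Refill prescription")
        .accessibilityHint("Submits a refill request to your pharmacy")
        // Let the text scale up through the accessibility type sizes.
        .dynamicTypeSize(.xSmall ... .accessibility5)
    }

    private func placeOrder() {
        // Hypothetical action; a real app would call its ordering service here.
    }
}
```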

Apple is also launching a Magnifier app for Mac, which connects to an iPhone via Continuity Camera or to a USB camera so users can zoom in on their physical surroundings, such as a whiteboard or document, and view them on the Mac's screen. The app supports customized views with adjustments for contrast, brightness, and color, along with text recognition.

The new Accessibility Reader is a systemwide reading mode aimed at users with dyslexia, low vision, or cognitive conditions. Available across iOS, iPadOS, macOS, and visionOS, the reader allows extensive font, spacing, and contrast adjustments, and can be invoked from any app.

Braille Access, another flagship feature, turns iPhone, iPad, Mac, and Apple Vision Pro into full-featured braille note takers. It includes a built-in app launcher and supports math notation via Nemeth Braille. Users can open Braille Ready Format (BRF) files directly, and an integrated form of Apple's Live Captions lets them transcribe conversations in real time on connected braille displays.

Live Captions also comes to Apple Watch for the first time. When a paired iPhone is capturing audio with Live Listen, the watch shows a real-time transcript of the conversation and doubles as a remote control, letting users start, stop, or rewind a session from their wrist.

Apple Vision Pro users will benefit from expanded vision accessibility through a new Enhanced View mode that magnifies the wearer's surroundings using the headset's camera system, while updates to Zoom and VoiceOver add text reading, object detection, and live descriptions of the environment.

The Personal Voice feature, introduced in 2023, has been overhauled in iOS 19. Instead of reading 150 phrases and waiting for the voice to process overnight, users can now generate a personalized voice from just 10 recorded phrases in under a minute. Apple says the result is "more natural-sounding," and the feature will also gain support for Spanish (Mexico).

Additional updates include:

  • Eye Tracking improvements for easier keyboard interaction on iPhone, iPad, and Vision Pro.
  • Background Sounds customization with new EQ settings, a timer to stop playback automatically, and new Shortcuts automation actions.
  • Vehicle Motion Cues coming to Mac, enhancing comfort for motion-sensitive users.
  • Sound Recognition enhancements, including Name Recognition and integration into CarPlay for alerts like sirens or a baby crying.
  • Assistive Access for Apple TV, with a new simplified media player and a developer API for building tailored interfaces for users with intellectual and developmental disabilities.
  • Music Haptics with adjustable intensity and vocal-specific feedback.
  • Voice Control now includes a programming mode in Xcode and adds language support for Korean, Arabic, Russian, and others.
  • Live Captions expands to new regions and languages, including Mandarin, Cantonese, French, German, and Japanese.
  • A new Share Accessibility Settings feature that lets users temporarily transfer their custom settings to another iPhone or iPad, such as when borrowing a friend's device or using a public kiosk.

"At Apple, accessibility is part of our DNA. Making technology for everyone is a priority for all of us," Apple CEO Tim Cook said in a statement.