Ahead of Global Accessibility Awareness Day, Apple previewed a collection of new features and updates that make life easier for people with disabilities, and often help everyone else as well. These come from a combination of advances in hardware, software and machine learning, and will work on iPhone and iPad.
One of the most interesting of these features is Door Detection, aimed at helping users who are blind or have low vision. Navigating the last few feet to a destination is often a challenge; with Door Detection, users will be able to point their device at an entrance and get guidance on it.
Door Detection will not only indicate that there is a door, but also read out what may be written on it, such as timings, tell users whether it is open or closed, and, when it is closed, whether it can be opened by pushing, turning a knob, or pulling a handle. It can also read signs and symbols around the door, like the room number at an office or the presence of an accessible entrance symbol, and tell users how far they are from the door. The feature combines the power of LiDAR, the camera, and on-device machine learning, and will be available on iPhone and iPad models with the LiDAR Scanner.
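Apple has not published an API for Door Detection itself, but the building blocks it describes are available to third-party developers: ARKit's LiDAR scene depth for distance, and the Vision framework for on-device text recognition. The sketch below, with a hypothetical DoorwayScanner class, shows how an app might approximate the distance readout and sign reading; it is illustrative only, not Apple's implementation.

```swift
import ARKit
import Vision

// Illustrative sketch: LiDAR depth for "how far am I from what's ahead",
// plus on-device text recognition for signs around a door.
final class DoorwayScanner: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Scene depth requires the LiDAR Scanner, matching Apple's device requirement.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR Scanner not available on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Distance in metres to whatever is at the centre of the view,
        // read from the LiDAR depth map.
        if let depthMap = frame.sceneDepth?.depthMap {
            print(String(format: "Roughly %.1f m ahead", centreDepth(of: depthMap)))
        }

        // On-device text recognition for signs near the door (room numbers, timings).
        // Running this on every frame is wasteful; a real app would throttle it.
        let request = VNRecognizeTextRequest { request, _ in
            let texts = (request.results as? [VNRecognizedTextObservation])?
                .compactMap { $0.topCandidates(1).first?.string } ?? []
            if !texts.isEmpty { print("Sign text: \(texts.joined(separator: ", "))") }
        }
        request.recognitionLevel = .fast
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, orientation: .right)
        try? handler.perform([request])
    }

    private func centreDepth(of depthMap: CVPixelBuffer) -> Float {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let base = CVPixelBufferGetBaseAddress(depthMap)!
        // The scene depth map stores 32-bit float metres per pixel.
        let row = base.advanced(by: (height / 2) * rowBytes).assumingMemoryBound(to: Float32.self)
        return row[width / 2]
    }
}
```

Identifying the door itself, and whether it opens by push, knob or handle, is where Apple's own on-device models come in; the public frameworks above only cover the depth and text-reading pieces.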
Door Detection will be available in a new Detection Mode within Magnifier, Apple’s built-in app supporting blind and low-vision users. Door Detection, People Detection, and Image Descriptions can each be used alone or simultaneously in Detection Mode, offering users with vision disabilities a go-to place with customisable tools to help navigate and access rich descriptions of their surroundings. In addition to the navigation tools within Magnifier, Apple Maps will offer sound and haptic feedback for VoiceOver users to identify the starting point for walking directions.
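The Maps behaviour is Apple's own, but the same pattern of pairing a sound with a haptic tap when VoiceOver is running can be built with public APIs. A minimal sketch, assuming a hypothetical signalWalkingStart() helper and an arbitrary system sound ID:

```swift
import UIKit
import AudioToolbox

// Hypothetical cue marking the starting point of walking directions for VoiceOver users:
// a haptic tap plus a short sound, echoing the kind of feedback Apple Maps describes.
func signalWalkingStart() {
    guard UIAccessibility.isVoiceOverRunning else { return }

    // Haptic tap.
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.prepare()
    generator.impactOccurred()

    // Short audible cue; 1057 is one of the standard system sound IDs.
    AudioServicesPlaySystemSound(1057)

    // Let VoiceOver announce the event as well.
    UIAccessibility.post(notification: .announcement,
                         argument: "Starting point for walking directions")
}
```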