When Technology Gets in the Way: Accessibility Challenges Hidden in Plain Sight




Accessibility in technology has progressed significantly, yet many of the most frustrating obstacles are the subtle, everyday interactions that sighted product designers rarely consider. These challenges are persistent, often avoidable, and frequently caused by underlying assumptions that simply don’t reflect the lived experience of blind and vision-impaired users.

One of the most prominent examples is the on-screen keyboard on smartphones, particularly on devices like the iPhone. While these keyboards have matured visually and functionally, they still impose considerable friction when used non-visually.

When Predictive Typing Stops Predicting Your Intent

Modern smartphone keyboards assume the user can instantly glance at the display, interpret visual suggestions, and react. When using a screen reader, that entire loop breaks down.

Predictive text, auto-correction, and suggestion bars often behave in ways that the user does not expect—silently modifying what was typed, overriding words, or inserting something completely unintended.

A very real example: a harmless text message is silently corrected to "I love you." No prompt. No warning. No intent behind it. Just a user relying on a screen reader, unaware that the system made an intrusive decision and changed the meaning entirely.

These incidents are not merely uncomfortable. They can cause embarrassment, confusion, and miscommunication in both personal and professional contexts.

The Missing Tactile Feedback Problem

There is an even more fundamental issue beneath auto-corrections: on-screen keyboards are inherently challenging for blind and vision-impaired users because they offer no tactile cues.

A physical keyboard provides tactile boundaries, key shapes, and mechanical feedback. A touchscreen offers none of this—just a flat surface with invisible targets. Without visual reference, it is easy to miss a key by a few millimetres, and the user only learns this after the character is spoken aloud by the screen reader.

This is where the frustration grows. These devices already understand touch patterns, contact size, and temporal sequences of taps. They know which keys are adjacent and how users tend to drift on the keyboard. It is entirely reasonable to expect that the software could infer intent more intelligently: if I type something that vaguely resembles “finger,” the system should not confidently decide I meant “singer” or “dinger.” The context and pattern of touches simply don’t support that conclusion.

The technology is clearly capable of smarter interpretation—it just hasn’t been designed with this scenario in mind.
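To make the "finger" versus "singer" argument concrete, here is a minimal sketch of adjacency-aware candidate ranking. The key coordinates and scoring are illustrative assumptions, not any real keyboard's algorithm: each candidate is scored by how far the user's taps would have had to drift for that word to be what they meant.

```python
# Hypothetical sketch: rank correction candidates by how plausible each
# one is given where the taps actually landed on a QWERTY layout.
# Key positions and the scoring function are illustrative assumptions.

QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
ROW_OFFSET = [0.0, 0.25, 0.75]  # approximate horizontal stagger per row

KEY_POS = {
    ch: (col + ROW_OFFSET[row], float(row))
    for row, keys in enumerate(QWERTY_ROWS)
    for col, ch in enumerate(keys)
}

def key_distance(a: str, b: str) -> float:
    """Euclidean distance between two keys on the layout."""
    (ax, ay), (bx, by) = KEY_POS[a], KEY_POS[b]
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def tap_plausibility(typed: str, candidate: str) -> float:
    """Total spatial drift needed to turn the taps into the candidate.
    Lower means the candidate is a more believable interpretation."""
    if len(typed) != len(candidate):
        return float("inf")  # sketch only compares same-length words
    return sum(key_distance(t, c) for t, c in zip(typed, candidate))

typed = "finger"
for word in ["finger", "singer", "dinger"]:
    print(word, round(tap_plausibility(typed, word), 2))
# finger 0.0, singer 2.0, dinger 1.0
```

Even this toy model shows that "singer" requires an implausible two-key jump from F to S, exactly the kind of evidence a correction engine could weigh before overriding the user.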

Why These Issues Persist

Several structural design decisions contribute to the ongoing problems:

1. Suggestion Controls Aren’t Reliably Exposed

Screen readers do their best, but the UI elements for predictive suggestions are not consistently labelled, grouped, or discoverable. Gestures work sporadically, and navigating the suggestions is neither predictable nor efficient.

2. Correction Algorithms Assume Visual Confirmation

Auto-correction is built on the premise that users will instantly see and fix mistakes. For blind users, “fixing” requires extra navigation, gesture actions, and awareness of what the system did—often long after the fact.

3. Over-Aggressive Correction Logic

Mobile keyboards frequently modify text before the user finishes typing. In many cases, these changes are not announced at all, leaving the user unaware of what the system has replaced.

4. Defaults That Don’t Work for Non-Visual Typing

Auto-correction and predictive suggestions are enabled by default. For blind users, disabling them can dramatically improve reliability, but most never encounter guidance that this is even an option.

The Human Impact

These challenges are not theoretical. They affect real communication—in emails, messages, work chats, and personal conversations. Unintended corrections can alter tone, distort meaning, or communicate something wildly inaccurate.

For blind and vision-impaired users who depend on precision and clarity, these silent changes undermine confidence and create friction that should not exist. Accessibility is not only about adding features; it is equally about removing unnecessary barriers.

What Needs to Change

Expose Suggestion UI Controls Properly

Assistive technologies must be able to reliably read, navigate, select, and dismiss suggestions with deterministic gestures.
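As a rough illustration of what "reliably exposed" could mean, here is a sketch of a suggestion bar modelled as labelled, navigable accessibility elements with deterministic actions. This is not a real platform API; the class, gesture bindings, and label wording are all assumptions.

```python
# Illustrative sketch (not a real platform API): a suggestion bar whose
# items are individually labelled and navigable with deterministic actions.

from dataclasses import dataclass

@dataclass
class SuggestionBar:
    suggestions: list
    index: int = 0

    def accessibility_label(self) -> str:
        """What a screen reader should announce for the focused item."""
        if not self.suggestions:
            return "No suggestions"
        return (f"Suggestion {self.index + 1} of {len(self.suggestions)}: "
                f"{self.suggestions[self.index]}")

    def next(self) -> str:        # e.g. bound to a swipe-right gesture
        self.index = min(self.index + 1, len(self.suggestions) - 1)
        return self.accessibility_label()

    def previous(self) -> str:    # e.g. bound to a swipe-left gesture
        self.index = max(self.index - 1, 0)
        return self.accessibility_label()

    def select(self) -> str:      # e.g. a double tap on the focused item
        return self.suggestions[self.index]

    def dismiss(self) -> None:    # e.g. a two-finger scrub
        self.suggestions = []
        self.index = 0

bar = SuggestionBar(["finger", "singer", "dinger"])
print(bar.accessibility_label())  # Suggestion 1 of 3: finger
print(bar.next())                 # Suggestion 2 of 3: singer
```

The point is not the specific gestures but the contract: every suggestion has a position, a label, and a predictable way to reach, accept, or dismiss it.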

Improve Intent Recognition for Non-Visual Typing

Typing patterns, context, and spatial proximity should be considered when interpreting input. If the touch pattern points clearly to “finger,” the system should not wander into “singer” territory.

Make Auto-Correction Announcements Mandatory

Any change made by the keyboard should be announced immediately and consistently.
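A minimal sketch of that requirement, with an assumed announce() hook standing in for a real screen-reader notification API: the replacement itself and the spoken announcement happen in the same step, so no change can ever be silent.

```python
# Minimal sketch: whenever the keyboard replaces text, emit a spoken
# announcement describing exactly what changed. announce() is a
# stand-in for a real screen-reader notification API.

def announce(message: str) -> None:
    print(f"[screen reader] {message}")

def apply_correction(text: str, original: str, replacement: str) -> str:
    """Replace the first occurrence of `original`, announcing the change."""
    if original not in text:
        return text  # nothing to correct, nothing to announce
    announce(f"Autocorrected '{original}' to '{replacement}'")
    return text.replace(original, replacement, 1)

message = apply_correction("I live you", "live", "love")
print(message)  # I love you
```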

Offer a Screen-Reader-Optimized Typing Mode

A dedicated input mode could provide predictable behaviour, reduced automatic corrections, and tailored feedback loops.
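One way to picture such a mode is as a bundle of defaults that differ from the standard ones. Every field name below is a hypothetical assumption, sketched only to show how few switches the mode actually needs.

```python
# Hypothetical sketch of what a screen-reader-optimized typing mode
# might configure; all field names here are assumptions.

from dataclasses import dataclass

@dataclass(frozen=True)
class TypingMode:
    auto_correct: bool
    predictive_bar: bool
    announce_replacements: bool
    confirm_before_replace: bool

DEFAULT_MODE = TypingMode(
    auto_correct=True, predictive_bar=True,
    announce_replacements=False, confirm_before_replace=False)

SCREEN_READER_MODE = TypingMode(
    auto_correct=False,           # never rewrite text silently
    predictive_bar=True,          # keep suggestions, but strictly opt-in
    announce_replacements=True,   # speak every change the keyboard makes
    confirm_before_replace=True)  # require an explicit accept gesture
```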

Listen to Real Users

Blind and vision-impaired communities have raised these concerns for years. Formal, ongoing engagement between platform developers and accessibility experts would help prevent regressions and guide practical improvements.

Conclusion

Accessibility is not an afterthought; it is a core measure of whether technology serves all users effectively. Touchscreens, predictive typing, and intelligent correction tools are now essential parts of the mobile experience, but they must be designed with a deeper understanding of non-visual use.

The improvements required are not complex. They simply need attention, prioritization, and the willingness to view accessibility as an integral part of design—not a checkbox on a feature list.

The technology already exists. The need is clear. What remains is the will to make typing on a touchscreen less of an obstacle and more of the seamless tool it was meant to be.