bifocal

an accessibility case study
October 2022

The full 17-minute presentation for bifocal is below, complete with narration.

If you would prefer a PDF, with or without presenter's notes, see below the video.

The idea for bifocal came to me during one of the many times I forgot to bring my reading glasses out with me. Suddenly I wondered: why doesn't my phone sense that I'm not wearing my glasses and adjust the display so that I can read the screen more easily?

There are plenty of clues, after all. No glasses on my face is one big clue. Squinting at the phone and holding it far away to try to read it is another. The TrueDepth camera can sense all of these things.

Exploring the bifocal idea led to two other projects:
  • Improve discoverability of Text Size and Display Zoom options.
  • Improve customization of the iOS Control Center.
These are also covered in the case study above. All three of these projects seem worthwhile to me, and I enjoyed exploring them. The only problem is that I now forget that these refinements aren't actually present on my phone when I go to use it!

The benefits of bifocal are clear when seen up close.

I’d love any feedback or advice you have to give.