An accessibility-first camera app created by a low-vision user
Black Lens is an accessibility-first camera app designed to make outdoor photography possible for people with low vision. Built by Steve Johnson, a severely sight-impaired iPhone user, Black Lens solves a simple but deeply frustrating problem: you can’t take a selfie in bright sunlight if you can’t see the screen.
Using a combination of haptic feedback, spoken cues, and a high-contrast “Mask Mode” inspired by Top Gun’s HUD lock-on system, Black Lens enables hands-free, glare-proof photography without relying on visual input.
Originally created as a personal tool to help Steve capture photos independently while kayaking and spending time outdoors, Black Lens has evolved into a powerful solution for the global low-vision community—and for anyone who has ever struggled to take photos in bright conditions.
Black Lens is available now on the App Store.
Black Lens was created by Steve Johnson, a severely sight-impaired iPhone user and full-time cane user who refused to accept that taking a simple outdoor photo should be impossible.
For most people, taking a selfie is effortless. But for Steve, bright sunlight completely washes out the screen, making it impossible to see the camera preview or frame a shot. As someone who loves kayaking, paddleboarding and exploring the outdoors, he found himself unable to capture moments independently. Either the photo was guesswork, or someone else had to take it for him.
No existing camera app solved this. None were designed for people who can’t rely on vision to use a camera. Mainstream developers weren’t thinking about the low-vision experience — because they don’t live it.
So Steve made the decision to build the solution himself.
Rather than writing every line of code in a traditional way, he turned to modern app-building and AI-assisted development tools and pushed them as far as they would go. What those tools couldn’t do for him was the most important part: defining how the app should behave for a blind or low-vision user in the real world.
Steve designed the interaction model, the accessibility logic and the flow of the experience: how the app should use haptics, when it should speak, how Mask Mode should look and feel, and how the interface should adapt when VoiceOver is on or off. He iterated based on his own daily use, refining the experience until it worked reliably outside, on the water and in bright sun.
The breakthrough came when Steve realised he didn’t need to see the screen at all. Using a combination of haptics, audio feedback, an adaptive interface and a high-contrast “lock-on” system inspired by Top Gun’s targeting HUD, he created a camera experience that works even when vision doesn’t.
What began as a personal accessibility project quickly became something bigger — a way to empower blind and low-vision users globally to capture their own moments with confidence, independence and dignity.
Steve built Black Lens because he needed it. Now, he’s sharing it so others can experience that same independence.


• **AutoCapture (Hands-Free Photography)**
Black Lens detects a face, waits for stillness and automatically takes the photo without any need to tap the screen. A five-second preview appears between shots, and the app continues taking photos until the user stops it.
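Black Lens's internal implementation isn't published, but the behaviour described above maps naturally onto a small state machine. Below is a minimal Swift sketch of that loop; `AutoCaptureController`, its states and the per-frame `update` inputs are illustrative assumptions, not the app's actual code.

```swift
import Foundation

// Minimal sketch of the AutoCapture loop described above.
// All names here are illustrative, not Black Lens's actual code.
enum AutoCaptureState {
    case searching            // no face in frame yet
    case waitingForStillness  // face found; waiting for subject and phone to settle
    case previewing           // five-second preview between shots
}

final class AutoCaptureController {
    private(set) var state: AutoCaptureState = .searching
    var takePhoto: () -> Void = {}  // hook into the capture pipeline

    // Call once per camera frame with the latest detection results.
    func update(faceDetected: Bool, isStill: Bool) {
        switch state {
        case .searching:
            if faceDetected { state = .waitingForStillness }
        case .waitingForStillness:
            if !faceDetected {
                state = .searching
            } else if isStill {
                takePhoto()
                state = .previewing
                // Show the preview for five seconds, then resume the loop.
                DispatchQueue.main.asyncAfter(deadline: .now() + 5) { [weak self] in
                    self?.state = .searching
                }
            }
        case .previewing:
            break  // camera idles until the preview expires
        }
    }
}
```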
• **Mask Mode (High-Contrast Sunlight Mode)**
A full-black display removes glare completely, even in the brightest sunlight. A bold green circle highlights any detected face; when the app is ready to take the photo, the circle turns red—an intuitive “lock-on” signal inspired by Top Gun’s HUD targeting system.
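As a rough illustration of the visual side, the lock-on indicator could be drawn as a SwiftUI overlay like the sketch below. The `faceRect` and `isLockedOn` inputs are assumed to come from a face-detection pipeline; both names are hypothetical.

```swift
import SwiftUI

// Illustrative SwiftUI overlay for Mask Mode's lock-on indicator.
// `faceRect` and `isLockedOn` are assumed inputs from a detection pipeline.
struct MaskModeOverlay: View {
    var faceRect: CGRect   // detected face bounds, in view coordinates
    var isLockedOn: Bool   // true once the app is ready to take the photo

    var body: some View {
        ZStack {
            Color.black.ignoresSafeArea()  // full-black display: no glare
            Circle()
                .stroke(isLockedOn ? Color.red : Color.green, lineWidth: 8)
                .frame(width: faceRect.width, height: faceRect.width)
                .position(x: faceRect.midX, y: faceRect.midY)
        }
    }
}
```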
• **Haptic Framing Guidance**
Gentle vibrations indicate when a face is near the boundary of the frame. Stronger haptics warn when the face moves out of view. This allows users to frame a photo without seeing the screen at all.
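On iOS, cues like these are typically produced with UIKit's standard feedback generators. A minimal sketch, assuming a hypothetical `FramingHaptics` helper and an illustrative edge-distance threshold:

```swift
import UIKit

// Illustrative haptic cues using UIKit's standard feedback generators.
// The helper type and the edge threshold are assumptions, not the app's code.
final class FramingHaptics {
    private let gentle = UIImpactFeedbackGenerator(style: .light)
    private let strong = UINotificationFeedbackGenerator()

    /// Call as face-tracking results arrive.
    /// - Parameters:
    ///   - faceVisible: whether a face is currently in the frame
    ///   - distanceToEdge: normalised 0...1 distance from the face to the nearest frame edge
    func update(faceVisible: Bool, distanceToEdge: CGFloat) {
        if !faceVisible {
            strong.notificationOccurred(.warning)  // stronger pulse: face lost
        } else if distanceToEdge < 0.1 {
            gentle.impactOccurred()                // gentle tap: near the boundary
        }
    }
}
```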
• **Eyes-Free Gesture Controls**
Tap anywhere to take a photo. Swipe up/down to switch cameras. Swipe left/right to toggle Mask Mode. Large, simple, reliable interactions.
• **Spoken Cues**
For users who prefer not to rely on VoiceOver, the app provides its own clear spoken announcements: “face detected,” “ready,” “lock-on,” and “taking photo.” Photography becomes audio-guided rather than visually guided.
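Announcements that work independently of VoiceOver are typically produced with AVSpeechSynthesizer. A minimal sketch, with the wrapper type and usage wired up purely as an illustration:

```swift
import AVFoundation

// Illustrative spoken cues via AVSpeechSynthesizer, which speaks
// regardless of whether VoiceOver is running.
final class SpokenCues {
    private let synthesizer = AVSpeechSynthesizer()

    func announce(_ phrase: String) {
        let utterance = AVSpeechUtterance(string: phrase)
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Usage mirroring the cues described above:
// cues.announce("face detected")
// cues.announce("lock-on")
// cues.announce("taking photo")
```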
• **Adaptive VoiceOver Interface**
When VoiceOver is on, Black Lens reveals additional on-screen controls with clear labels and rotor-friendly navigation. When VoiceOver is off, those elements disappear for a cleaner low-vision layout. No multi-finger gestures required.
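Detecting VoiceOver and reacting to changes is standard UIKit API; the sketch below shows one plausible way to toggle an extra control layer. The `extraControls` container is a hypothetical stand-in for the app's labelled buttons.

```swift
import UIKit

// Illustrative VoiceOver adaptation. The `extraControls` container is a
// hypothetical stand-in for the app's labelled buttons; the UIAccessibility
// calls are standard UIKit API.
final class CameraViewController: UIViewController {
    let extraControls = UIStackView()  // labelled controls shown only to VoiceOver users

    override func viewDidLoad() {
        super.viewDidLoad()
        extraControls.isHidden = !UIAccessibility.isVoiceOverRunning

        // Re-check whenever the user toggles VoiceOver.
        NotificationCenter.default.addObserver(
            forName: UIAccessibility.voiceOverStatusDidChangeNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            self?.extraControls.isHidden = !UIAccessibility.isVoiceOverRunning
        }
    }
}
```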
• **Rear-Camera Selfies (Sharper Images)**
Black Lens uses the iPhone’s superior rear camera so users can capture high-quality selfies and group shots even if they cannot see the preview.
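Selecting the rear camera is standard AVFoundation; here is a minimal sketch of building a capture session around the back wide-angle lens (error handling elided, names illustrative):

```swift
import AVFoundation

// Illustrative AVFoundation setup for rear-camera capture (error handling elided).
func makeRearCameraSession() -> AVCaptureSession? {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device) else { return nil }

    let session = AVCaptureSession()
    let output = AVCapturePhotoOutput()
    guard session.canAddInput(input), session.canAddOutput(output) else { return nil }
    session.addInput(input)
    session.addOutput(output)
    return session
}
```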
• **High-Contrast, Low-Vision-First Design**
Bold UI elements, high-contrast colour choices and uncluttered screens make Black Lens easy to use for a wide range of low-vision users.
• **Private by Design**
One-time purchase. No login. No tracking. No ads. Photography should be private, simple and focused on the user—not their data.
Taking a simple photo outdoors is something most people never think twice about. But for millions of low-vision and blind users, it can be one of the most frustrating tasks on the iPhone.
Bright sunlight washes out the screen. Glare makes the camera preview disappear. Even with accessibility features enabled, a user can’t tell where the camera is pointing, whether they’re in frame, or if the shot has even been taken. Mainstream camera apps are built on the assumption that the user can see the display—and can tap small, precise on-screen buttons to control it.
For people living with sight loss, this often means avoiding photos altogether, relying on others to take them, or settling for blurry, misaligned shots taken through guesswork.
Black Lens was created to fix this exact problem.
Unlike traditional camera apps, Black Lens has **no tiny on-screen buttons**. Instead, it gives the user full control through simple, eyes-free gestures anywhere on the screen:
- **Swipe up or down** to switch between the front and rear camera
- **Swipe left or right** to toggle Mask Mode
- **Tap anywhere** to take a manual photo
- Or do nothing and let **AutoCapture** detect a face and take the photo automatically
This means users never have to hunt for a button, locate an icon, or try to line up a tap on a washed-out display.
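For illustration, a full-screen, eyes-free gesture scheme like this can be wired up with standard UIKit recognizers; the handler names below are hypothetical stubs, not the app's actual code.

```swift
import UIKit

// Illustrative full-screen gesture wiring with standard UIKit recognizers.
// The handler bodies are stubs; names are hypothetical.
final class GestureLayerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Tap anywhere: take a manual photo.
        view.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(takePhoto)))

        // Swipe up or down: switch between front and rear camera.
        for direction in [UISwipeGestureRecognizer.Direction.up, .down] {
            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(switchCamera))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }

        // Swipe left or right: toggle Mask Mode.
        for direction in [UISwipeGestureRecognizer.Direction.left, .right] {
            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(toggleMaskMode))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }
    }

    @objc private func takePhoto() { /* fire the shutter */ }
    @objc private func switchCamera() { /* flip front/rear */ }
    @objc private func toggleMaskMode() { /* toggle the high-contrast mode */ }
}
```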
Instead of depending on visual information, Black Lens is built around haptics, audio and high-contrast feedback. It transforms the iPhone camera into a tool that works even when the screen is unreadable. Users can frame a shot through vibration, know what’s happening through speech, and capture photos hands-free.
Black Lens removes the need to see the screen entirely. It replaces guesswork with clarity, frustration with independence, and barriers with confidence—allowing blind and low-vision users to capture moments outdoors, on the water, on the slopes, or anywhere bright light makes traditional camera apps unusable.

Meet Black Lens and Steve Johnson in this 60-second overview.

When the face is still and the phone is steady, AutoCapture is triggered: the circle turns red and the countdown is announced.

The gestures are simple, and the user’s face is shown in high contrast.

When a photo is taken, a preview is automatically displayed for five seconds. If the user does nothing, the preview disappears and the camera continues to auto-capture.