AR's hardest problem isn't object recognition; it's making its most compelling features usable by everyone. In his InfoQ-transcribed talk, Google engineer Ohan Oda walks through the surprising technical work behind turning Google Maps' Lens view into an audio- and haptic-friendly experience for visually impaired users.
Beyond rewriting rendering pipelines, the team had to overhaul planning, QA, and even cross-department priorities to bake accessibility into every sprint. The takeaway: treat AR as a full-sensory playground, lean on real-world user feedback, and build inclusivity in from day one.
Watch on YouTube