> Windows might not be trendy among developers, but it’s where accessibility works best. I don’t have to worry about whether I can get audio working reliably. The NVDA screen reader works on Windows and is free and open source, actively maintained, and designed by people who are screen reader users themselves.
> That said, I’m not actually developing on Windows in the traditional sense. WSL2 gives me a full Linux environment where I can run Docker containers, use familiar command-line tools, and run the same scripts and tools that my colleagues use. Windows is just the accessibility layer on top of my real development environment.
> I use VS Code. Microsoft has made accessibility a core engineering priority, treating accessibility bugs with the same urgency as bugs affecting visual rendering. The VS Code team regularly engages with screen reader users, and it shows in the experience.
His comments on VS Code remind me of the quote "Good accessibility design is mostly just good design":
> Consistent keyboard shortcuts across all features, and the ability to jump to any part of the interface by keyboard
This is something I notice and appreciate about VS Code as a fully sighted person. Just like I appreciate sloped sidewalk curb cuts when I'm walking with luggage.
A11y is a big commitment and cost, and of course not all a11y features benefit everyone equally, but it has a larger impact than most people realize.
Yup, it does. I was an early adopter of VS Code. It has been extremely satisfying seeing the progress they've made with accessibility. I provide feedback on a semi-frequent basis; nowadays it's only to flag regressions.
I've been working on polishing accessibility features for a hobby web app, mostly out of curiosity to see what a deep dive would look like.
Some of it is definitely UX polish. For example, if a button toggles a sidebar, then ESC should dismiss the sidebar and return focus to the button that toggled it. And when the sidebar opens, focus should move to the top of the sidebar.
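That focus dance can be sketched in a few lines. This is a hypothetical, framework-free illustration (the `Focusable` interface and `SidebarFocusManager` name are mine, not from any particular library); a real app would call `open` from the toggle button's click handler and `close` from an Escape `keydown` listener:

```typescript
// Anything with a focus() method — a DOM element in practice,
// abstracted here so the logic is easy to follow (and test) on its own.
interface Focusable {
  focus(): void;
}

class SidebarFocusManager {
  // Remembers which button opened the sidebar so focus can be returned.
  private trigger: Focusable | null = null;

  // When the sidebar opens: remember the trigger, move focus to the
  // top of the sidebar (e.g. its heading or first interactive element).
  open(triggerButton: Focusable, sidebarTop: Focusable): void {
    this.trigger = triggerButton;
    sidebarTop.focus();
  }

  // On ESC (or a close button): restore focus to the toggling button.
  close(): void {
    this.trigger?.focus();
    this.trigger = null; // a second close is a no-op
  }
}
```

In a browser you would wire `close` to something like `document.addEventListener("keydown", e => { if (e.key === "Escape") manager.close(); })` while the sidebar is open.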
Though you can also get trapped in a fractal of polish-chasing. When it comes to screen readers and live content like a rich chat app or MUD client, I'm not sure how you would target anything broader than, say, Safari + VoiceOver on macOS and then some other combo on Windows. You quickly realize the behavior you see is an idiosyncrasy of the screen reader itself.
> Though you can also get trapped in a fractal of polish-chasing
I think this applies to anything :)
> You quickly realize the behavior you see is an idiosyncrasy in the screen reader itself.
Yeah this is definitely a pain point during development. There is standardization and efforts to reduce these differences though, so I hope this gets better over time.
MS has to make accessibility a priority because it's mandated by the government (its customer).
Smaller companies would benefit from better libraries and design systems that make it easier to incorporate accessibility. Make accessible the default.