Besides building accessibility into frontend/React component toolkits, how do we automate testing for accessibility? I've turned on the built-in screen reader and tested apps blindfolded, but that doesn't scale, and I'm not even sure it reflects how people actually use an app without sight.
After years of trying, I've still not found a reliable way to automate accessibility testing. The only really workable way to manage it currently is: bake it into your entire dev process.
When designing an application, forget the visuals at first: design the flow of information and the interactions. This turns out to be a surprisingly good analogue of mobile-first thinking, and it follows similar principles: in both cases you have a restricted amount of information you can present at once, and you have to design around that constraint.
Once you've got the information flow, step from there to visual elements, and ensure that as you build, you're baking in ARIA support and that your testers are interacting with it using VoiceOver or JAWS.
You won't end up with anything perfect, but you'll have something better than the majority of sites out there. Perfection is impossible, but if you bake inclusive thinking into your app from the get-go, it's pretty straightforward, and you usually end up with an application that is less confusing and less overloaded with information for your sighted users too.
If you leave it as something to slap on at the end, it's almost always impossible.
All good points, and agreed about automated testing. The most you can hope for in that department is linting-level checks: color contrast, valid HTML, labels properly associated with form controls, and so on.
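Color contrast is a good example of a check that automates cleanly, because WCAG defines it as a pure formula over the two colors. A minimal sketch of that linting rule (hex-color parsing and the 4.5:1 AA threshold are straight from the WCAG 2.x definitions; the function names are mine):

```typescript
// Relative luminance of a hex color like "#767676", per the WCAG 2.x formula.
function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Convert sRGB channel to linear light
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging 1:1 to 21:1.
function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG AA requires at least 4.5:1 for normal-size text.
const passesAA = (fg: string, bg: string): boolean => contrastRatio(fg, bg) >= 4.5;

console.log(contrastRatio("#767676", "#ffffff").toFixed(2)); // "4.54" -- just passes AA
console.log(passesAA("#777777", "#ffffff")); // true
```

This is the kind of rule tools like axe-core run for you; the point is that it needs no human judgment, unlike focus order or reading flow.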
The hard things, like focus control, require manual testing, ideally by a skilled user of assistive technology (AT).
I’d think regression testing is easier than with a GUI. Just interpose between the app and the screen reader, and check for expected strings in the output.
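That interposition idea can be sketched as a golden-transcript test: capture what the app would hand to the screen reader (accessible name plus role), then diff it against a recording from a known-good build. Everything here is illustrative, not a real AT API: the `Element` shape, the simplified name computation, and the transcript format are all assumptions.

```typescript
// Hypothetical shape of an element as exposed to assistive tech.
interface AccElement {
  role: string;
  ariaLabel?: string;
  text?: string;
}

// Grossly simplified accessible-name computation: aria-label wins over content.
// (The real accname algorithm has many more steps.)
function accessibleName(el: AccElement): string {
  return el.ariaLabel ?? el.text ?? "";
}

// One line of screen-reader-style output, e.g. "Sign in, button".
function announce(el: AccElement): string {
  return `${accessibleName(el)}, ${el.role}`;
}

// Diff actual output against the golden transcript; return mismatches
// so a CI job can fail with a readable report.
function checkTranscript(actual: string[], expected: string[]): string[] {
  return expected.flatMap((line, i) =>
    actual[i] === line ? [] : [`line ${i}: expected "${line}", got "${actual[i] ?? "<missing>"}"`]
  );
}

// Golden transcript for a login form, recorded once from a known-good build.
const golden = ["Email, textbox", "Password, textbox", "Sign in, button"];

const ui: AccElement[] = [
  { role: "textbox", ariaLabel: "Email" },
  { role: "textbox", ariaLabel: "Password" },
  { role: "button", text: "Sign in" },
];

console.log(checkTranscript(ui.map(announce), golden)); // []
```

The fragile part in practice is the capture step, not the diff: you need a stable way to serialize the accessibility tree or the live-region announcements, and small wording changes will churn the golden files.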