It's literally just a static page that's dumped out of a JavaScript file, Shopify iframes excluded. But if you want accessibility issues:
* The header image, saying “Ben Eater”, has no alt-text. That would be fine, except the letters are individual SVG paths.
* Likewise, the social media sharing buttons have no alt-text. A screenreader will just see four images with links at the beginning of the page.
* The YouTube videos are each constructed out of multiple links with no text, including SVG images.
* The YouTube video links open in a new window, harming usability. The "target" attribute should only really be used in sites built out of frames, and this is not one of the exceptions.
* My screenreader-type program, in "just read the proper text" mode, misses half of the headings. Outside of this mode, I have to sit through over a minute of drivel before the content, because there's no skip-nav link, and then again for the video links. (This one doesn't quite count, because my screenreader-type program sucks and ignores aria-hidden when I set it to "all". But the lack of skip-nav is an issue.)
* Can't even read the plain, boring old text without JavaScript. This in itself is an accessibility issue.
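Most of the items above have one-or-two-line fixes. A sketch, assuming the header is an inline `<svg>` and that there's a main content container I'm calling `#main` (the ids, class names and URL here are mine, not the site's):

```html
<!-- Skip-nav link: the first focusable element on the page -->
<a class="skip-nav" href="#main">Skip to content</a>

<!-- Header image: give the whole SVG an accessible name,
     instead of leaving per-letter paths to fend for themselves -->
<svg role="img" aria-label="Ben Eater">
  <title>Ben Eater</title>
  <!-- …individual letter paths… -->
</svg>

<!-- Social button: label the link, hide the decorative icon -->
<a href="https://www.youtube.com/example" aria-label="Watch on YouTube">
  <svg aria-hidden="true" focusable="false"><!-- icon paths --></svg>
</a>
```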
I got bored at this point, but I'll list some other issues I spotted while looking through the dynamically-constructed DOM:
* Identical SVG images are copied-and-pasted throughout the file, but with different CSS styles – some browsers might waste time re-rendering.
* The page makes seventeen requests even with external JavaScript blocked, eleven of which are to external domains.
* The DOM contains, no joke, eight consecutive </div>s. The removal of some of these wrapping <div>s makes no perceptible difference to the page.
* Mixing and matching semantic and non-semantic HTML tags, confusing certain "reader mode" tools.
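For that last point, the fix is mostly swapping anonymous wrapper `<div>`s for the semantic elements that "reader mode" tools key off. A minimal sketch of the structure I'd expect (contents illustrative):

```html
<header>
  <nav aria-label="Main">…</nav>
</header>
<main id="main">
  <h1>Page title</h1>
  <article>…</article>
</main>
<footer>…</footer>
```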
And it turns out that this page is actually multiple pages, with no machine-accessible links between them, bundled up into one file stretched across multiple URIs. This is not the proper way to handle caching the next page. I'm struggling to articulate how bad this is.
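Even a client-rendered site can expose its "pages" as real, machine-accessible links; the router just intercepts the clicks. A sketch, where `renderPage()` is a hypothetical stand-in for whatever draws the view:

```html
<nav>
  <!-- Real hrefs: crawlers and screenreaders see ordinary links -->
  <a href="/kits" data-route>Kits</a>
  <a href="/about" data-route>About</a>
</nav>
<script>
  // Progressive enhancement: intercept the click, keep the URL honest.
  document.querySelectorAll("a[data-route]").forEach(function (a) {
    a.addEventListener("click", function (event) {
      event.preventDefault();
      history.pushState(null, "", a.getAttribute("href"));
      renderPage(a.getAttribute("href")); // hypothetical view renderer
    });
  });
</script>
```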
When I click on the "kits" page, random loading animations partially obscure the top of some of the paragraphs as it tries, and fails, to add some kind of inline purchase widget. There's significant DOM bloat, and displaying the page takes up an entire core of my laptop.
Because of the caching failure I failed to properly articulate earlier, refreshing a page other than the one you initially clicked onto causes a cache miss for all of them, and the retransmission of the HTML page (though fortunately not the thirteen pages worth of stuff, which are stored in the ~0.2MB JavaScript file).
Loading the "kits" page directly and following all of the instructions the page provides (not my default configuration) causes my nice new shiny laptop to max out a core for three seconds while the browser requests the website, waits for the request to complete, parses the HTML, requests the JavaScript, requests the CSS, compiles the CSS, renders the DOM according to the CSS, compiles the JavaScript, runs the JavaScript, repeatedly polls something due to a setInterval call, performs 9 DOM events (notorious for being the slowest API of the web), recalculates the style for and re-renders the DOM, completely hangs for 0.3 seconds for reasons unknown, recalculates, re-renders, runs some DOM events, fires off DOMContentLoaded (prematurely, not that the browser has any reason to know that – this might have accessibility impacts), runs more JavaScript to mutate the DOM further, performs several dozen incremental style updates and multiple repaints, fires off another DOMContentLoaded event, fires off the load event, fires off yet another DOMContentLoaded event, and another, and another (in amongst repeated recalculating and repainting of the DOM), fires off two more load events, etc.… Leaving behind one and a half seconds' worth of leisurely garbage collection after the page has finished loading, meaning I can't scroll until 4½ seconds after I click onto the page.
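Most of that sequence is the cost of render-blocking, DOM-mutating script. The usual mitigation is to ship the text as plain HTML and push the script out of the critical path; a sketch (the file names are made up):

```html
<!-- Stylesheet needed for first paint -->
<link rel="stylesheet" href="/site.css">

<!-- Enhancement script: fetched early, executed only after the
     document has been parsed, so it never blocks the text -->
<script src="/purchase-widget.js" defer></script>
```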
This is the only computing device I own capable of rendering the page in less than five seconds, even with my very fast network connection. Normal websites with an order of magnitude more writing, plus images and links, take an order of magnitude less time to load than this.
This website is not good. But it's nowhere near the worst out there – in fact, this would probably be in the top 60% of pages I use regularly, if I did.
I fixed a number of the low-hanging a11y issues that you identified.
As for using modern web tooling, keep in mind there are many business and technical tradeoffs that go into any engineering decision. In this case, dynamically rendering the content makes it easier in the future to require aspiring hardware engineers to pass a test demonstrating knowledge of WCAG 2.0 standards before unlocking any educational hardware content.
It's much better now; thanks. Most of my remaining complaints are now obsolete or easily avoidable. I think the only remaining issue is the contrast on the Projects, Shop, About, FAQ, Support text, but your existing design might be pretty enough to justify that.
… Wait, those changes applied to all of the pages‽ I'm starting to finally see the appeal of those web frameworks.
I think a big part of the reason why people don't care about accessibility anymore is because you can't make a website without someone pulling out a Bible-length list of asinine complaints, and that is for a site "in the top 60% of pages".
I know that I have stopped caring for "accessibility" almost entirely beyond the most obvious things. Most of these complaints are either totally inane and unnoticeable, or will affect so few people over the site's lifetime that I can count them on one hand. I think it's good for people to try and make your website as accessible as possible, but there's a limit to how hard you can indignantly demand people bend over backwards for you.
Most of the issues I've listed aren't accessibility problems. They're caching, efficiency and usability problems. Only the first six are accessibility issues.
The list contains “inane and unnoticeable” issues because I was checking it manually, only spotting what came to mind. This is because external accessibility tools cannot check this website for accessibility issues, because they cannot access it.
I did not pick up on, for example, the issue of content flashing up on the screen in distinct parts, hitting the "three flicker limit" heuristic for seizure safety. Sure, that's only going to affect very few people, and it's fairly low contrast compared to other sites, and on a high-end gaming PC this would all occur within a single frame (making the point moot), but it's still something to consider and attempt to minimise.
Personally, I'm more concerned with the bandwidth; the fact that the page makes requests to Google's CDN for images and fonts that should be hosted on the same domain (and ideally provided on the same connection); and the ridiculous amount of processing my browser has to perform to draw the page. There are many more issues with the page than just a11y.
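Self-hosting the fonts, for what it's worth, is one stylesheet rule; the browser then fetches them over the connection it has already opened (the family name and path here are illustrative):

```html
<style>
  @font-face {
    font-family: "Site Sans";                      /* illustrative name */
    src: url("/fonts/site-sans.woff2") format("woff2");
    font-display: swap;  /* show fallback text immediately while loading */
  }
</style>
```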