Hacker News

Would this make it possible to have the whole page in <noscript> tags to make it crawlable by all bots and browsable in Links, Lynx, Dillo, w3m and offByOne?



Yes, that's a good idea. You'd just need a little wrapper around the client-side rehydrate that moves the contents out of the <noscript> tag before calling React.render.
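A minimal sketch of that unwrap step. Note that when JavaScript is enabled, browsers expose the children of a <noscript> element as plain text, so the bootstrap has to pull the inner markup out as a string before React can take over the container. The function name, element ids, and the App component in the comments are all hypothetical, not from any real codebase:

```javascript
// Strip the outer <noscript> wrapper from a markup string, keeping the
// server-rendered inner HTML verbatim. Returns the input unchanged if no
// <noscript> wrapper is found.
function unwrapNoscript(html) {
  const match = html.match(/<noscript[^>]*>([\s\S]*?)<\/noscript>/i);
  return match ? match[1] : html;
}

// Browser-only wiring, shown for context (requires a DOM; ids are illustrative):
// const root = document.getElementById('root');
// const ssr = document.getElementById('ssr-content'); // the <noscript> node
// root.innerHTML = unwrapNoscript(ssr.outerHTML);
// React.render(React.createElement(App), root); // React reuses the existing markup
```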

Then again, you don't really need <noscript> in that case. Since the page is rendered server-side, it already displays correctly; you just need to make sure all your links work.


I've been burned by trying to read the contents of noscript nodes before. Some browsers don't support that at all.


Oh wow, that's annoying.


Why? The whole content is renderable server-side and all links should work. What purpose would <noscript> serve?


Wouldn't that hide the page for JS clients? Or would the JS client remove the <noscript> tags?


<noscript> is only rendered if JavaScript is disabled.
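Right, so there's no double rendering for JS clients. A rough sketch of what such a page might look like (the ids and bundle path are illustrative, not from any real setup):

```html
<!-- Non-JS clients and crawlers see the <noscript> markup; JS clients
     ignore it, and the client bundle mounts the app into #root instead. -->
<div id="root"></div>
<noscript id="ssr-content">
  <p>Server-rendered content, visible to Lynx, w3m, and crawlers.</p>
</noscript>
<script src="/bundle.js"></script>
```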




