Ask HN: Why don't websites have 'text only' backup versions?
53 points by innovator116 on May 14, 2015 | hide | past | favorite | 59 comments
I am a long-time lurker here, though not a programmer, but a system admin, professional manager, and open source enthusiast. A trend I am seeing is that websites use the latest programming paradigms and designs to make themselves beautiful and stunning, yet they are so engrossed in javascript and 'awesome' interactive graphics and images that the site cannot be viewed with javascript and images disabled. To save bandwidth, or on slow wireless connections, it is desirable to run a browser with javascript and images disabled, but websites don't have any kind of 'text only' version for such use. In an ideal situation, every website would have a 'text only' version which can be opened in a CLI browser like Links. As an example, this startup's website, http://qfusionlabs.com/, looks like it's blowing a bubble with javascript and images disabled. Edit: I am glad that Hacker News works perfectly with javascript and images disabled! Why can't all websites be like this?


The "proper" approach to this is 'progressive enhancement', i.e. every site is already a text-only version. Then, on top of that, you can add images, stylesheets, Javascript, etc. as necessary. This used to work well, since that was also the easiest approach to building a site (unless you used Flash...)
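A minimal sketch of the idea, using a hypothetical comment form (`/comments` is made up): the plain HTML works on its own in Lynx/Links with no JS at all, and the script layer only takes over where it actually runs.

```html
<!-- Base layer: a plain form, fully functional with no JS or CSS -->
<form action="/comments" method="post">
  <textarea name="text"></textarea>
  <button type="submit">Add comment</button>
</form>

<!-- Enhancement layer: only changes behaviour where JS is available -->
<script>
  var form = document.querySelector('form');
  form.addEventListener('submit', function (e) {
    // Take over only because this code actually ran;
    // without JS the browser just does a normal POST.
    e.preventDefault();
    fetch(form.action, { method: 'POST', body: new FormData(form) })
      .then(function () { /* update the page in place */ });
  });
</script>
```

The point is the ordering: the no-JS version isn't a separate "backup", it's the starting point.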

I think frameworks have inverted things: the easiest approach these days is to load the default config of some framework/CMS, which will make heavy use of Javascript/images/CSS/etc. in order to entice developers to use it (otherwise, why use a framework at all ;) ). In this world, turning off a feature takes more effort than leaving it on, and we end up with ideas like special "text only" alternatives.

I think there's definitely a burden on the developers of frameworks to make them degrade as gracefully as possible. Of course, this isn't always possible (especially those designed to be completely in client-side JS), but in those instances where it is possible, it can have a large impact. For example, if the developer of some popular Wordpress theme spent a little extra effort on graceful fallbacks, it would improve the situation for all sites using that theme.

Disclaimer: I used to develop a CMS with crazy-strict adherence to, among other things, accessibility standards ;)


What CMS do you develop? This sounds like something I would use.

I always browse with cookies and Javascript disabled. There are very few sites I'll enable one or both to use. Most of the time, the first time I browse a site that requires Javascript to show anything useful I just leave. If you can't make your pitch with text, I'm pretty sure I'm not going to be interested.

I know, I know...I'm not the typical market for those kinds of sites.


I was a developer of http://ocportal.com a few years ago. I can't take much credit for it though, since my contribution was a drop in the ocean compared to that of Chris Graham.


Well done. Seems to work reasonably well without JS, which is more than far too many websites can say.

Well, except for the SWF animation at the top, and the dropdown menus, but meh.


It depends on your business model.

Model 1: ecommerce. The website exists to sell your product. Having an alternate version is a simple cost/benefit decision: will people buy your thing from a low-overhead site? This may be combined with a mobile-friendly rendition.

Model 2: advertising. The website exists to catch attention long enough to show ads. You need seven tracking systems and eleven ad networks; all of them need JS and graphics and won't make money for you otherwise.

Model 3: public service. The focus is on providing information, not on making a sale or showing ads. The benefit of a low-overhead version is clear, but you need to keep the costs low, so you can't spend much extra time or money on it.

Model 4: SAAS. The website is the service, so user satisfaction is the top concern. Understand how your users want to use your service, and provide that for them.


Because it isn't worth it, basically. It's a relatively simple economic decision - making a text only backup version takes time, programming, adds an extra possibility for failure, etc. etc. - and the people who want a text only version are an extremely small minority.

I know that sounds a little heartless and "not caring about the web", but it's the reality.


It depends on the site, but it's really not that much effort for any site whose content a viewer would actually WANT in a text-only mode.

Sites with lots of text content invariably use a CMS to manage that content. It's not hard to build a text-only template, and in fact many sites actually do this - if you browse on mobile Safari, it's called "Reader Mode". I don't know if Android has an equivalent, but I would assume so.


I think you've acquired a mistaken picture of Reader Mode. Reader Mode (like the Readable and Readability bookmarklets) presents an alternative rendering of the same set of files from which the standard view is rendered: it does not cause the server to serve an alternative set of files.

Reader Mode and the Readable and Readability bookmarklets fail often even on "text-centric" pages such as blog posts.


Because the business case isn't there, in terms of the cost/benefit. It's a lot of work (maybe not initially, but maintaining that functionality on an ongoing basis slows development), to support a tiny number of users, who you can't make any money advertising to (if you're ad-supported) and who are probably not the kind of users who buy much (if you sell directly).


The sad bit about this attitude is that some of us consider web design/development to be a craft, rather than purely an exercise in profit-maximisation. We should be creating the very best work that we can, not just aiming for what's good enough, then stopping. That's just an opinion; I realise it's unpopular with some.


It is not unpopular. It is unrealistic.


Overengineering is the biggest problem with the craft today. True craftsmanship is about giving the customer something appropriate to their needs, not making a magnum opus every time.


As someone who lives without constant broadband and as someone who works with people who enjoy accessibility settings in software, I think the current web design trend you describe is disastrous.


It is an awful experience. Mobile data is often sketchy, but these bloated sites don't render a single thing until all the twenty tons of front end crap is downloaded, and all to view a tiny static chunk of text. But hey, they got to use react, a metric shit ton of the latest CSS wankery, and it is connected to every social meejia platform!


Silently downloading updates in the background is another frustrating trend. I've had data caps get blown not due to my actions, but due to updates.

I think good design should include designing for those with impairments, or those who don't live in a big city with uncapped broadband.


I think the main reason is that the number of users with JS disabled these days is just too small to care about.

Another thing: even if the effort required is small, the people responsible for the website are often not even aware of the problem.

As an experiment, I would drop a note to a few websites that break with JS disabled, explaining the problem and the reasoning for why it should be fixed, and see how many respond.


I agree. The only people who disable JS are informed, and who wants those kinds of people visiting their website?! Ha! Good riddance, I say!


When are we going to get over the idea of turning off javascript? No one asks "why doesn't my Java app run properly when I turn off the JVM?" I would argue that the open web is getting to that point with javascript.

JS is an open standard, highly performant, and implemented well by almost every single browser out there. The only people I know who regularly browse without JS are old-timers who got into the habit 10 years ago, before Spidermonkey, V8, Nitro, JavascriptCore etc. revolutionized performance, and before AJAX revolutionized architectures.

Let me propose a different way of looking at things. The concerns you actually expressed are performance, download size, and CLI browser compatibility.

Performance - I don't know of a reason that javascript apps cannot be performant enough, even on low-bandwidth connections. AJAX was invented to improve performance over static websites, by reducing network traffic to just the bits that change with each user action.

Download size - Javascript obviously has nothing to do with how many images or videos a site embeds.

CLI browser - Why can't the CLI browsers implement a javascript engine, and then render the results, just like any other browser? There is nothing about JS that requires a GUI.

edit- speling


Running through:

W.r.t. "old-timers who got into the habit 10 years ago". A decade ago I was 11. Not exactly an old-timer.

W.r.t. performance, in an ideal world, you'd be absolutely right. But this is not an ideal world. There are far too many websites that are substantially less performant with JS. (Case in point: Gmail.) There are far too many websites that kill my battery life on my laptop via sloppy coding. There are far too many websites that don't stop downloading in the background to refresh things that I don't need refreshed, and in the process kill my bandwidth cap on my home connection. There are far too many websites that take multiple seconds before they even begin to render, because they are waiting on some library to load before they parse things client-side. Looking at you, client-side markdown. There are far too many websites that are less responsive on the "responsive" version with JS than with the fallback without JS.

W.r.t. download size, again, in an ideal world, you'd be absolutely right. But this is, again, not an ideal world. There are far too many websites that pull down multiple MB of libraries before they even start to load. Sometimes they are cached, but far too often they aren't. There are far too many websites, again, that refresh things that I don't need refreshed and as such continue to use bandwidth.

And you're missing two other things: the two concerns that I consider most important. One is security. The majority of browser exploits require JS, especially the nastier ones. As you say, JS engines focus heavily on performance. And trying to get high performance out of a JITted language goes intrinsically against security. For a certain amount of dev time, you can get decent security and decent performance, great performance and weak security, or great security and weak performance. It's an intrinsic trade-off. (The other one is user tracking, but I'm not going to get into that one here.)

And one other thing: with JS disabled, most of the time if a webpage is loaded it's actually loaded. I can keep it up and sit in a bus or something and read it, without an internet connection. With JS enabled, far too many websites end up breaking nastily some time down the line when they go to update something and fail. Again: not an intrinsic, but something far too many websites do anyways.


From a dead user, Diti, whose sentiments match my own:

> To me, any website implementing good accessibility (http://www.w3.org/WAI/) is likely to be perfectly viewable and browsable per what you said. But most webmasters don't know about WAI.

I'd add "or don't care".


It's cheaper to make a fancy box around a pinecone than to deliver a Fabergé egg. Content that stands on its own is harder than a flock of spinning gewgaws. That's why Facebook fills your feed with Candy Crush.


This reminds me of the rarely-used print stylesheet[0]. In fact, the only place I've ever seen it in use is on websites run by the Dutch government (perhaps other governments do so too, but I've never needed to check them out).

EDIT: A bit of online searching mainly shows people *complaining* about print sheets, so perhaps I'm completely out of the loop and simply haven't printed anything in a long time.

It's not much - just stripping out menu bars and other stuff that makes no sense in print, plus a few typographical changes, such as a change in font from sans serif (screen) to serif (paper).
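Something like the following is roughly all it takes; the selectors (`nav`, `.sidebar`) are hypothetical placeholders for whatever chrome a given site has:

```html
<!-- In the page <head>: rules applied only when printing -->
<style media="print">
  /* strip chrome that makes no sense on paper */
  nav, .sidebar, video { display: none; }

  /* screen gets sans serif, paper gets serif, plain black on white */
  body { font-family: Georgia, serif; color: #000; background: #fff; }

  /* make link targets visible on paper */
  a[href]::after { content: " (" attr(href) ")"; }
</style>
```

The same rules can also live in a separate file via `<link rel="stylesheet" media="print" href="...">`.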

Similar to the other comment, there often isn't a clear business case for it, but whenever there's a website with a "print version" link to its articles I don't see why they wouldn't do this instead. Well, aside from the fact that nobody would expect this functionality to be there, which isn't a minor thing I guess.

[0] http://alistapart.com/article/goingtoprint


Among my small circle, that used to be the habit when sharing links: share the print link. News sites used to have it a lot, but I don't see it so much anymore.


print links generally don't use the print media css. Using print media css means the site prints something different than what is seen on the screen.


print stylesheets can be a real pain in the ass to test properly.

at least before chrome web-inspector got the handy checkbox.


The lazy answer is that >95% of users have images and JavaScript enabled, and of that 95%, those who have a slow connection are getting the same experience across a number of websites, so they probably won't be turned off by a slow site.

The better answer is that unless you think about it from the start, providing a no-JS fallback can be hard to do well and may require a fair bit of re-architecting of your website or web app - something that you probably won't bother doing for a single-digit percentage of your visitors (text-only support probably has a lower priority than supporting <IE9 and Opera Mini). Finding out that images won't work is also pretty hard - there's no way that I know of to detect whether images are enabled without using JavaScript, which is probably disabled as well.

I've actually been creating a reasonably complex web-based application recently, and for fun every few weeks I test it using Lynx[0]. Technically, using forms and very basic CSS results in a fairly usable service. I managed to get a working CSS3-only (no JS) fallback for tabs, some toggle buttons and some other UI goodies. The only problem is that I end up duplicating a lot of stuff on the backend of the app (because I have to deal with both form submissions AND ajax), and there are a lot of things that you can only really do with JavaScript and images (e.g. games, most interactive stuff without reloads, anything to do with images obviously).
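One common way to get CSS-only tabs is the radio-button trick - roughly like this (the IDs are made up for the example). In a CSS-less browser like Lynx it degrades gracefully: both panels simply render one after another as plain text.

```html
<!-- Which panel shows is driven entirely by which radio is :checked -->
<input type="radio" name="tabs" id="tab1" checked>
<label for="tab1">First</label>
<input type="radio" name="tabs" id="tab2">
<label for="tab2">Second</label>

<div id="panel1">Content of the first tab</div>
<div id="panel2">Content of the second tab</div>

<style>
  /* hide the radios themselves; the labels act as tab handles */
  input[name="tabs"] { display: none; }
  #panel1, #panel2 { display: none; }

  /* reveal the panel that follows the checked radio among its siblings */
  #tab1:checked ~ #panel1,
  #tab2:checked ~ #panel2 { display: block; }
</style>
```

Note the inputs and panels must be siblings for the `~` combinator to work.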

EDIT: regarding advertising, rarely do (effective) advertisers online want the largest audience specifically, they want the most engaged audience. Is this no-JS and no-images user interested in downloading an app or a pop album? Probably not, so it's not worth putting in the effort to make them see your ads.

[0] http://lynx.isc.org/


To me, it seems almost as if the web dev is saying "if you don't have the latest, most up-to-date, most powerful device/whatever, then we don't want you looking at our ads". Seriously? I thought the idea of advertising was to get as many eyeballs as possible. That way of thinking seems so counter-productive to me, plus I can't imagine the advertiser being very happy with that approach. Thinking back to my days working in recording studios, we always mixed stuff in a way that it would sound good on both a giant audio system or a small, cheesy boom-box, with the idea that we wanted everyone who listens to have a good experience no matter what system they were using. I'm not quite sure why this methodology can't be applied here.


That's not the case when supporting IE < 10.


Some do, it's called an RSS Feed.


Far too often lately I've been seeing "RSS feeds" that are nothing but the title and a link to the webpage.


If you are a manager, then you will surely understand.

It's mostly because the website owner/person responsible believes that the cost of producing a "text only" version would be bigger than the benefits achieved. Whether the decision is right or not, I wouldn't know - and, frankly, neither do most of the people who make the decision (albeit being "confident"). I seriously doubt that any thorough analysis is done on the subject; people just believe that almost everyone uses JS and images. And they might be right, or not.

Sometimes it might also be a matter of ignorance. It might happen that the person responsible is not aware of the issue (and is not made aware of it by the technical people).


Yes, this very much annoys me.

It's not text-only versions we need, but sensible use of progressive enhancement. I'm totally fine with 'web apps' requiring Javascript and CSS (but please, at least give me a message to that effect, and don't leave me with an infinite spinner!) but simple web pages are increasingly broken without scripting.
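The cheapest way to give that message is a `<noscript>` block in place of the app container - something like this sketch (the `/basic` fallback link is hypothetical):

```html
<div id="app">
  <!-- Shown only in browsers with scripting disabled,
       instead of an empty div and an infinite spinner -->
  <noscript>
    <p>This application requires JavaScript. Please enable it,
       or try the <a href="/basic">basic HTML version</a>.</p>
  </noscript>
</div>
```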

It's frustrating, because it's not like progressive enhancement is hard, either.


It basically amounts to two alternatives:

1) they make it intentional for whatever purpose - stopping scrapers, preventing search engine indexing, pulling in ads, etc.

2) they make it out of incompetence - probably somebody with insufficient web development skills just used some popular JavaScript framework to quickly cook up a site

Now the majority of the customers won't care, but if you're targeting a professional audience, this will not score you any points. If you were a software development company hiring and I were to reply to your invitation, I would first go to your website. If I saw a blank screen in my browser, my first question to you would be why a serious company has a broken site. Depending on what I heard, I might skip you altogether.

Personally, I don't think there are any difficulties in doing a "text-only" version as you call it. That's how I always approach things - do a classic version then add some gradual enhancements. I can't imagine doing it the other way around actually.

And yes, you can call me out of touch with the times, but text pages rank higher and look more credible in my eyes than all of the JavaScript toys I see around.


>>> every website should have 'text only' version which can be opened in CLI browser like Links.

Every website has a target audience, and it simply doesn't make sense to spend time and effort on this support. It's just not worth it.


(Playing devil's advocate marketing guy) Why would I allow you to see my product - be it a website, a web-based service, or simply my advertising - without the full experience? If I offer an option to see a stripped-down version, then I am creating a consumer who will talk about the stripped-down version, and that will taint my brand. If you go to my site and people later ask you about it, I want you to be able to tell them that it was your own actions that gave you a non-optimal experience. (Non-optimal being my opinion of what I think you as a customer should get, not what you as a reasonable person would want.)


SEO is my first thought. Have you ever tried to write a web spider from scratch? The various ways that websites are built can make scraping meaningful data VERY difficult. I recall one particular site that didn't even store the price for an item in the same parent container, so there was nothing (from a markup standpoint) that linked the product and its price... you HAD to view it with JS enabled.

I think down deep, the author is looking for the web to go back to the http://www.csszengarden.com days. I really wish web designers would put forth the effort.


To answer that, you have to ask yourself why somebody would not want to create an abstraction between content and presentation for their website. It's a basic aspect of modern web design, and even when it gets implemented, it's only ever used to present in one way. Why would someone do that? I'm sure there are a multitude of answers, the most basic of which is "they didn't feel like it." They probably didn't feel like it because they didn't think their efforts would be rewarded, or it wasn't personally fulfilling to them.


It used to be an SEO technique to make sure that the important parts of a page rendered without javascript. I think Google is now running javascript in its spider, so that is less of a factor now.


Well, that Google can index non-gracefully-degrading sites is ok. How Google wants to rank these sites is another interesting question.


Page load performance is actually one of Google's top signals for ranking nowadays.


What you're describing sounds like an edge case and goes against the design & development methodologies used in today's websites. Rarely are we building things with static content, and we're leveraging the power of AJAX to minimize the number of times a browser must refresh the entire window to load new content.

Current trends aside, from a pragmatic perspective, I would anticipate something like this increasing development costs (labor & money) between 10-50%, and would likely have a very low ROI.


Completely disagree with this.

- Most sites contain static content and are not interactive. Things like simple forms don't really count, in my opinion.

- Sites which load static content are in my experience slower as a result of using AJAX, not faster.

In fact, I'd argue that there is essentially no additional cost in building a site that uses progressive enhancement for loading content. If anything, my experience is it encourages a much more sensible architecture.

Of course, the value proposition changes when we're looking at interactive web apps, and I agree it's not clear that there's value there.


> Why can't all websites be like this?

Because most people never turn javascript off OR use Links to browse the web.

Of course, javascript-less websites have many advantages: easier end-to-end testing, speed of execution, loading speed, working on a wider range of browsers without hacks...


I guess that it is too much work, and perhaps because of ads. However, I believe there should be a project for a server that does a url screengrab and runs it through some scanning and OCR and perhaps some AI, that accomplishes what you are asking.

(Any takers?)


Readability already parses web sites to provide a simple reader view:

https://readability.com/


It doesn't do "just visit our page, enter the URL, and you can read the other site without having javascript on, even if the original style sheets are wrong. Oh, and we'll serve you some of our non-interactive ads."

Thinking about it, I would be ready to use such a site for the "unreadable" sites.


Not the answer to the question, but that particular site is somewhat readable (a little of the text and one or two images) without JavaScript if you turn the styles off.

I've just compared it with the JS version; actually it's only a very small part of the whole content.


But then you have sites like Google Groups and some Blogger sites (though GMail used to work in lynx, though I haven't tried it recently).


Yes, Blogger and Google Groups are really annoying.

GMail still has a special "basic HTML" version. It behaves much better over bandwidth-limited or very remote connections. Even at my home, with otherwise high-speed internet, when I had some transient transmission problems the "full" version wasn't usable, but the HTML one worked.

At least GMail has the real "html-backup version" the OP asked for.


That's not the way to help people with low-bandwidth or very-high-latency connections. They need more javascript, not less.

SPA designs can actually make CRUD apps significantly more useable on slow or unreliable connections.


[Citation needed]

I've run across far too many "mobile-friendly" websites that start by loading many MBs of JS libraries. And just flat-out fail to load if any of said libraries fail to load.


I'm talking about bad connections, not necessarily mobile. This applies especially to situations that companies like G Adventures face. In South America they have to deal with perfectly fine desktops, but very-high-latency internet connections. Submitting a form can take 20 minutes or more.

A text-only version like the one described would be far worse than a JS-heavy one. JS-free forms are naturally synchronous, so each edit would kill the UI for a full refresh cycle.

On the other hand, a SPA-type page can send data in the background while you're doing other work in the app.

Sure there are crappy mobile friendly sites, but that doesn't invalidate the idea.

On a side note I'm not really sold on the whole CDN idea, it just looks like another HTTP request and another point of failure. Concatenate / uglify seems like the better solution.


The idea would be that for a web app, posting or fetching little bits of JSON every time an action is taken would be less intense than getting a whole page every time. Of course, as you have experienced, that's not helpful if the initial page load is too heavy. Also it's not too helpful for sites that have content to read, rather than actions to take.

ps. "citation needed" is kind of obnoxious when it's something you can think through for yourself.


Older Opera browser versions (and Firefox/Chrome with plugins) allow you to deactivate CSS. Then you see the website as you want it.


That's still a built-in Firefox feature: View->Page Style->No Style. The same menu will also let you choose between different "alternate style sheets" for the same site. (I think it may even be sticky, per-page or per-site.) Just for example, you could try it out here: http://www.slimy.com/~steuard/teaching/tutorials/Lagrange.ht...


Sites need to show ads to get some more money, not all of them but...


> I am ... not a programmer

which is obvious from your question ;)


So, the end user of the site has no say? They should be thankful they are even allowed to cast their eyes on such a technical masterpiece? Developer or not, people can spot a worrying trend of website "developers" who cannot develop - they just bolt bits and frameworks together without a consideration for the end users.


Because the modern "developers" don't know how to do so. They barely manage to bolt together a backend that scaffolded everything magically for them, plus React, LESS, and other "magic" they followed Pavlov-style from blog postings.



