Hacker News

On a completely unrelated note, I'm so surprised that static sites like this, which are basically just simple advertising pages with text and some internal links, so often fail completely when they get a little attention. It's just serving HTML, CSS and some jQuery (I looked at the source in Google's cache). If they set up even the simplest caching solution with nginx or some other reverse proxy or hosting tool, it wouldn't fail so spectacularly. Unless it's doing backend requests, there are extremely simple solutions to keep sites like these up.



> I'm so surprised that static sites like this

The person you replied to linked the GitHub account, and if you checked it you would see that it is not a static site.

I am not surprised that you decided to write a comment instead though.


This is also obvious from looking at a cached version of the site: there is a list of online users and recent posts in the sidebar.


I think the website is built with Django; maybe it has some community features on it, which require server-side processing...

Git repo: https://github.com/widelands/widelands-website


It's probably bandwidth. You can serve static HTML from L1 cache and still get DoSed.


In this case, nginx is serving a 502 status. It's possible the stack nginx was proxying to was killed, possibly by RAM or CPU limits. I don't think it's a bandwidth issue here.


Fair enough, for me it's just timeouts.


Or they can just go down and people don't have to freak out.

Stop recommending people join the CDN cartel


I guess people want their site to be available when it goes viral?


[flagged]


Being against censorship of even very stupid shit is absolutely in line with the old school hacker ethos.

"The internet treats censorship as damage and routes around it" was absolutely a thing.

[[shakes crutch at the kids on his lawn]]


I am generally against censorship but had to revise that stance after I realized we've given a global platform to manipulative idiots truly unworthy of global self-expression. If they want to espouse their crap they can stand on a real soapbox and shout at crowds, but nothing more


Neat, and who decides whose "crap" is unworthy of freedom of speech?

Like, my main objection to censorship isn't "I think that this particular idea is worthy of defending", it's that any mechanism that lets people censor bad opinions could also be used to censor good opinions, and that mechanism is way too powerful and prone to abuse for my comfort.


I find reading the way much of society reacted to the original suffragettes very useful for calibration purposes.

"Actually, women should be able to vote" would absolutely have been censored entirely had the establishment then had the sort of tools people are advocating to build now.


The fundamental problem is that we need some way of separating good ideas from bad, as a society. Sure, it was a big leap forward when we switched from "whatever the king says" to "let everyone shout as loud as they can and whoever shouts the loudest wins", but the truth we have to face up to is that neither method works particularly well, especially in the age of the internet.

So, is there a better system? A way we can bubble "good" ideas up and suppress "bad" ideas, ideally without some corruptible central arbiter? And, will such a system inevitably be viewed as "censorship" by some, purely on the principle that the current shouting-based system favors them more than a meritocratic one would?


Yeah yeah yeah. I know. And I stopped caring. The ones that deserve to be muzzled (or worse, but that's not appropriate here...) are the manipulators of all kinds and colors. If the message has any kind of CTA, particularly political, it should be suppressed.

Left, right, commercial, political, it doesn't matter. Suppress them all.


> If the message has any kind of CTA, particularly political, it should be suppressed.

This message also fits.


It ain't real life without at least a little bit of contradiction

Can't practice tolerance without being intolerant to intolerance, for example...


You've come out against idiots spreading misinformation and calls to action _in general_ in this thread. You don't want tolerance, you want orthodoxy. Intolerance of all things that are bad, tolerance of things that are good. If only the cowards at Cloudflare agreed with you!


Orthodoxy has its own issues. What I want is nothing, for people to reach their own conclusions on the merits of whatever it is they are considering, and to be left alone in the quiet of my own thoughts without having to withdraw from society completely


CDNs aren't making you look at anything. The service they offer makes websites quickly accessible to people who request them. You believe that CDNs should not offer that service to those you consider propagandists.

How does the availability of a website that you dislike affect your ability to be left alone in the quiet of your own thoughts?


The scope of this thread expanded beyond CDNs several comments back.


Just so I'm clear, is "CTA" here "call-to-action"? And you're against anything with a call-to-action in it?

Like, call me out if this is the wrong acronym, or if I'm misinterpreting, but wouldn't all these be banned then?

> Get out and vote!

> Get vaccinated!

> Sign your kids up for school by August 15th!

> Read to your kids!

> File your taxes by April 15th!

Like, is that actually your position? Not trying to rebut (yet), but that seems like a very very different standard than what exists today if so.


Pretty much. People are being pulled in a thousand different directions by people with their calls-to-action.

Mainly targeting advertisements and political messaging. More prosaic stuff like filing your taxes on time can be conveyed effectively without commanding people to do things. "Taxes are due by April 15th" would work fine, for example.


> People are being pulled in a thousand different directions by people with their calls-to-action.

Is that a bad thing?

> "Taxes are due by April 15th" would work fine, for example.

"Climate action is due now"


> "Climate action is due now"

Couple that with some of the evidence as to why and you have a pretty good message to get out.

So very many people don't like being told what to do, and all we do is yell at them with commands and imperatives. Personally it has given me a burning contempt for the kind of folks that promulgate this stuff. If someone wants to compel me into action then they need to convince me on its merits, and those merits alone.


So you are not calling to suppress all political calls. You are calling for reasoning. I agree here.


Life is political. So you would have to suppress all communication about life. Show me content and I will show you politics. A video of children playing in a pool: are they dressed appropriately? Are they playing the right games? Are their surroundings a capitalistic dream without meaning? What about a B-actor soap series? A wikiHow on how to make beer? A podcast on how to make vegan yoghurt? An article about curing meat? There are no unpolitical things. Life is political.


No, politics and social mores insert themselves into these situations. I can guarantee you that those kids are thinking about none of those things.

And what exactly makes politics mandatory in an article about brewing beer? Curing meat? If people stick to the essence of what they are trying to convey without getting lost in political rhetoric people can, in fact, communicate clearly. But you have to stick to the matter at hand.


>"The internet treats censorship as damage and routes around it" was absolutely a thing.

It was actually never a thing. The internet has always been censored, everywhere, all the time.

People just think it wasn't because sometimes it let them be edgy racist shitlords and they confused that with unfettered freedom of speech. Still do, unfortunately.


Which CDN is against censorship? And don't say Cloudflare.


I'm not sure what site you're referring to, and that proves my following point....

There are multiple CDN providers. Unless they're all doing this, OP doesn't have to go with the specific one you're calling out here and can still reap the benefits.


They didn't, though. They recommended a few things this site could do in how it's set up on its own server. (I mean, a CDN would also help, but...)


CDN cartel? Why such a negative take on CDNs? Genuinely curious (I know little about CDNs).


CDNs are in a unique position (alongside Google Analytics, the Facebook pixel, webchat services, etc.) to track you across the internet and across different domains.


Further, they are proprietary and non-open, they insert their branding into your site, and they pass the blame for errors to their customers when they fail. They abuse an oligopoly position and obfuscate the addressing details of sites under their protection.


The default option is to just front it with Cloudflare.


Not only is it a "default option" that is making the web less decentralized, it is also way overkill for something as simple as this. Make sure nginx serves cached content (with the right headers) instead of reading it from disk on every request, and maybe, just maybe, throw a static cache like Varnish in front.

For the times my websites hit the front page of HN, a simple nginx instance (without any cache like Varnish in front) on a $5/month DigitalOcean server was enough to handle things.
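A minimal sketch of that nginx setup (the document root, file extensions and cache lifetimes here are illustrative, not anyone's actual config):

```nginx
server {
    listen 80;
    root /var/www/mysite;

    # Static assets: let browsers and any intermediate cache
    # hold them for a week instead of re-requesting each visit
    location ~* \.(css|js|png|jpg|svg|woff2)$ {
        expires 7d;
        add_header Cache-Control "public";
    }

    # HTML: shorter lifetime so content updates still propagate
    location / {
        expires 10m;
        add_header Cache-Control "public";
    }
}
```

With headers like these, nginx serves files straight from the OS page cache and repeat visitors don't hit the server at all.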


You don't even need nginx; I've got a single Node process on a Heroku "hobby" dyno serving my site, and it's weathered several front-page visits peaking at 15 requests per second without problems. The important thing is the static rendering (assuming it applies to your site's content).


Having a blog post of mine get posted to HN (many years ago) was how I discovered I'd accidentally introduced a bug into my web server config that meant the app server behind it was serving the static assets as well as the dynamic stuff. It entered a state of wedgitude quite rapidly when traffic ramped up.


>that is making the web less decentralized

Sadly I don't see any alternative. The need for CDNs is a direct result of the structure of the internet and modern HTTP. We could imagine alternative infrastructures - decentralized transparent network-layer caching so that the network itself caches data and responds to requests with cached results - but the end-to-end structure of TCP and HTTPS make that impossible, for better or for worse. So we have to use CDNs.



> making the web less decentralized

In this case, this is not a virtue. Using a CDN is optional and reversible, and it is likely to spread your content out to be much less centralised than anything hand-rolled.


Except when 20% of the web does that, the web as a whole becomes less decentralized, even if your files end up spread further apart in the world.

What I was thinking about was the web as a whole, not my specific files.


Of course it's decentralised. Decentralisation is about technologies, not about which provider you pick to run your decentralised technologies for you.


The best way is to use GitHub Pages + Cloudflare. That way you can also use a site generator like Hugo, run by GitHub Actions, and you can even deploy some code to Cloudflare Workers.

Here's how I set this up for the OSS game engine I work on:

https://github.com/vcmi/VCMI.eu/
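A sketch of such a build-and-deploy workflow, assuming the widely used community actions for Hugo and Pages deployment (the action versions, branch name and directory layout are illustrative):

```yaml
# .github/workflows/deploy.yml
name: Deploy site
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          submodules: true   # fetch the Hugo theme if it's a submodule

      - uses: peaceiris/actions-hugo@v3
        with:
          hugo-version: 'latest'

      # Build the static site into ./public
      - run: hugo --minify

      # Push ./public to the branch GitHub Pages serves
      - uses: peaceiris/actions-gh-pages@v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./public
```

Every push to main then rebuilds and republishes the site with no server of your own involved.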


GitHub Pages already has a CDN in front of it, but I like your workflow with Hugo.

The widelands site seems to go directly to netcup in Germany. It is likely just a DDoS detector mistaking sudden interest for an attack.


I believe Cloudflare in this case isn't acting as a CDN (though it still is one) but as a way to use his own domain (vcmi.eu).


While I mostly use Cloudflare as a DNS manager, it's absolutely not required in order to use your own domain on GitHub Pages.

https://docs.github.com/en/pages/configuring-a-custom-domain...
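For reference, wiring a custom subdomain to Pages is just a file plus one DNS record at your registrar (the domain names here are illustrative):

```
# File: CNAME, at the root of the branch GitHub Pages publishes
www.example.org
```

Then add a DNS CNAME record pointing `www.example.org` at `<username>.github.io`. Apex domains instead use A records pointing at GitHub's published Pages IP addresses.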


Yeah I do that. I used Pelican and it works well.


I can't bring myself to believe that an external caching layer is absolutely required to handle the load from something like a hacker news front page. WordPress with no caching I can see being taken down by 10 to 20 requests per second, but any kind of local caching should be able to handle the load easily. Does HN produce more traffic than that?


I've been on the frontpage before and my logs indicate that the peak load was somewhere north of 350 requests/sec though the non-peak (while still on the frontpage) was typically 10-20 requests/second. That is for a very simple page consisting of an html document, css, an image, a favicon and no javascript (eg https://calpaterson.com/printers.html). If you can serve from a reverse proxy cache locally that will be fine but if some PHP/Python/NodeJS is running on each request you can run into trouble with the small servers people typically use for their side-project's website.


The one time I was involved in something at the top, I remember about 400 concurrent users according to Google Analytics.

Our cheap server had no problem serving the static site.


Cloudflare is awful, I get tons of captchas while browsing with tor.


You were wrong technically, but you have a point. These sites should leverage existing services like Reddit or AWS.

Personally, I'd put a static information site on AWS and a community on Reddit if I were the creator of something like this game.

Just as SMBs today don't really need much infrastructure, open source projects shouldn't either.


That's a horrible idea. Yes, open source should definitely get off closed and abusive platforms like Reddit or AWS. Really, the open source community should start dogfooding more. It seems like only yesterday all the communities were on IRC, and it was easy to find everyone as long as you were connected to Freenode.

Now everyone is using Slack, Discord or other closed-sourced systems that are not even built for the use case of "open by default".

It's really sad to see, especially when a lot of developers in the ecosystem are cheering on projects moving to platforms like reddit or AWS.

Running your own infrastructure is not as hard as you think, and it's a lot cheaper too.


I completely get where you're coming from, I'm a staunch self hoster myself.

But why re-invent the wheel before you have to? That's just a waste of time and resources that could be used to promote and build your project.

Like in this very case, the website is now down and unable to promote the project because it was made in a traditional way.

Imho it's enough to retain ownership of domains to really own a project. Like the widelands.org domain.

So whatever happens with AWS, or reddit, that domain can always move and point visitors to new communities.

In my personal selfhosting environment I have my own domain pointed to protonmail. So even if protonmail goes belly up I can always move my domain somewhere else.

And even my domain is picked to give me maximum control. I use my own country's ccTLD because they have a good conflict resolution department that will benefit me as a citizen.


Even running a site that front-pages HN doesn’t actually require that much infrastructure. Peak traffic is a few hundred RPS. Or at least it was when I was still using google analytics. Might be more now. But in any case, if you use fast web software you can serve dynamic content to an HN ddos off a $5/mo VPS. The problem is a lot of people will be like “well my website doesn’t need to be fast because it’s not popular, so I’ll write it in python/node/ruby and make it dynamic” and then their server falls over at 50 RPS on commodity hardware, when if they’d written it with a popular haskell/rust/go/whatever framework, it could handle 100x that much traffic.

If you hope that your writing ever reaches a large audience, you need to be able to serve a large audience! It doesn’t take that much extra work to be ready for it.


I think you're getting downvoted because of the python/node/ruby vs. haskell/rust/go comment, and perhaps rightly so. For some things this will certainly matter, but for public content like this, especially the landing page, the real problem is that a flood of visitors should not all be hitting a dynamic backend in the first place: they are all going to see exactly the same page anyway. That means either making it completely static (which would mean removing the sidebar in this case) or having a modest (micro-)caching setup so that most requests don't hit the backend. With nginx (which is what widelands.org seems to be using) this is only a couple of lines of configuration, with no need to change the dynamic backend at all.
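Micro-caching in nginx really is just a handful of directives; a sketch (the upstream address, cache path and timings are illustrative):

```nginx
proxy_cache_path /var/cache/nginx levels=1:2
                 keys_zone=microcache:10m max_size=100m;

server {
    listen 80;

    location / {
        proxy_cache microcache;
        # Cache each page for one second: enough to collapse a
        # flood of identical requests into a single backend hit
        proxy_cache_valid 200 1s;
        # Serve slightly stale content while one request refreshes it
        proxy_cache_use_stale updating;
        # Only let one request per URL populate the cache at a time
        proxy_cache_lock on;
        proxy_pass http://127.0.0.1:8000;
    }
}
```

Even with a one-second lifetime, a page getting 350 requests/second reaches the backend roughly once per second, which is well within what any Django app can handle.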


Needing to use a reverse proxy like nginx in the first place is honestly insane, and the fact that many web devs tolerate it is a sign of totally whack priorities. You shouldn’t need a reverse proxy to serve your website, and you should be able to serve dynamic content to hundreds or thousands of people per second. Literally all you have to do for 90% of websites is not use some slow uncompiled backend that shits the bed at 50 RPS. Pointing this out pisses people off because they hate the idea that bogosort.js isn’t actually a good web backend and they might have to learn something else if they want to build a good website instead of trying to paper over perf issues with a massive reverse proxy with 35,000 config options and just as many CVEs.


People really seem to underestimate just how much a CPU can handle when it’s not bogged down by the abstraction of high level languages. You can easily easily serve over 1000 static pages per second per core with nginx on commodity hardware over TLS, and over 10000 pages/second/core if you’re using a server-grade CPU like a Xeon or Epyc.


I don’t think “high level” is even the problem. Haskell is extremely high-level and yet it’s very fast, on account of having an extremely OP compiler. In this case it seems to just be the frameworks on top of non-compiled languages that are slow.


The forum's a more complicated question but for the static information site for an open source project that's already hosting its code on github, configuring a custom domain for the github pages site would seem even simpler than sticking it on AWS.


Generally yes, but their site is also a forum (and some other things). It looks like it's made with Django.


They could still make the landing page static or at least enable some (micro)caching - there is no reason for every request to hit the backend. Of course, this is obvious only in hindsight.



