Over the last few years, I've found that most sites clients want can be built with static site generators and JavaScript. PHP is also great and easily hosted! But most times when there's a sprinkling of dynamism needed, it's OK if it happens at build/run time rather than when the page is rendered on a server. This leads to faster page loads and less to worry about security-wise. No shade! I've just been finding this has led to good outcomes for me.
You mean to say some basic company site, blog, or photo gallery that only gets updated once or twice a month, with zero dynamic content otherwise, doesn't need a whole LAMP stack?
Honestly, though, with GH/CF Pages-type hosting and how simple static sites can be, it's a direction I'm ever thankful things have been moving. It just seems so much less painful for those who aren't here to be security experts and just want a bloody site that 'just works'.
Your static site generator can generate PHP instead of HTML and have some server-side dynamism sprinkled into your mostly static site, the same way that generating JS can sprinkle in some client-side dynamism.
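As a sketch of that idea (hypothetical template and file names; a real SSG like Hugo or Eleventy would do the actual templating), a build step can bake everything static and leave a single PHP tag in place for the one dynamic bit:

```python
# Sketch only: a "build step" that renders a mostly static page but
# writes it out as .php, leaving one server-side dynamic snippet inline.
# Template, paths, and the count.txt file are illustrative assumptions.
from pathlib import Path

TEMPLATE = """<!DOCTYPE html>
<html>
  <body>
    <h1>{title}</h1>
    <!-- everything above this comment is baked in at build time -->
    <p>Visitor count: <?php echo file_get_contents('count.txt'); ?></p>
  </body>
</html>
"""

def build(title: str, out_dir: str = "public") -> Path:
    """Render the template and write it out as index.php instead of index.html."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    page = out / "index.php"
    page.write_text(TEMPLATE.format(title=title))
    return page

if __name__ == "__main__":
    print(build("My mostly static site"))
```

The server still only runs PHP for that one echo; everything else is plain markup emitted at build time.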
No clue how relevant they are today, but server-side includes (SSI) solve the problem of wanting a _mostly_ static page with a little bit of dynamic content in it.
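For anyone who hasn't seen it, the classic Apache-style SSI syntax looks like this (the fragment path is made up; the server needs includes enabled, and files are typically served as .shtml):

```html
<html>
  <body>
    <h1>My page</h1>
    <!-- the server splices in this fragment on each request -->
    <!--#include virtual="/fragments/latest-posts.html" -->
    <p>Page served on <!--#echo var="DATE_LOCAL" -->.</p>
  </body>
</html>
```

Everything else on the page is static; only the include and echo directives are processed server-side.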
I wrote up notes here https://news.ycombinator.com/item?id=45180555 but this wasn't coordinated. As I mentioned in the first paragraph of the post, I was following a thread on Bluesky where Adrian and Jeff were chatting. I posted the article there. I didn't know Jeff had posted on Hacker News about this already, and I didn't know he would post my piece. I definitely would have done another editing pass if I had known!
Author here. I woke up to a surprising amount of traffic! Some notes based on the discussion.
This wasn't coordinated between Jeff Geerling and myself. However, I did mention the post in the Bluesky thread that Jeff was included in. [0]
I concluded the piece with “[t]his space is ripe for disruption”. That was a really poor choice of words. I've since updated the piece to better match what I was trying to say. Diffs are available. [1]
On YouTube: as I mention in the piece, I think the service is excellent as a consumer, and I pay for Premium.
This piece was mostly written because I've been frustrated that YouTube is effectively the only place for user-submitted video on the internet. I wasn't going to write anything until I saw the video from RedLetterMedia that I mentioned in the post. They have a huge following and were blaming something that might be related? Or might not? It's really hard to tell! I'm not a YouTube creator, but I assume having the metrics that determine your livelihood shift out from under you must feel awful.
> On YouTube: as I mention in the piece, I think the service is excellent as a consumer, and I pay for Premium
Why? Because the tools that allow them to take almost 50% of the revenue (they say you earn) have low friction?
I would say the opposite. There is no customer service. There are endless legal pit traps that allow larger channels and companies to prey on smaller ones, alongside the AI channels, which lead to the same end. The entire point of the platform is to push as much advertising as possible while mutating a user's search habits. Ironically, this leads to videos becoming borderline useless for many use cases, without taking them off YouTube. This is not a good platform.
I'm sure I feel this way because I don't have a bunch of content I'm afraid of getting yanked from the platform. Another "benefit" of having a big YouTube presence is that I would be forever worried about implied retaliation.
I read it as: they're able to feel that way--and express it publicly--because their digital life and livelihood are not held hostage to the capricious monopoly.
I did imply that YouTube has monopolized the market, allowing a lower bar of service to become the norm. This latest move seems to make every aspect of YouTube's value proposition worse.
They said an LTT store message directed them to the Brodie Robertson video https://www.youtube.com/watch?v=1hVwUjcsl6s so they did their own investigation, which confirmed similar things.
It looks like YouTube might be measuring views differently, perhaps getting rid of unmonetizable views, which doesn't impact the number of likes or revenue. I think the annoyance is over the lack of transparency and the power YouTube holds over content creators, rather than any immediate concern over loss of income etc.
> rather than any immediate concern over loss of income etc.
I don't know if that's necessarily true. From these creators' reports, there's apparently not a significant loss of revenue _from YouTube_. But some sponsor deals might be structured around CPM, so a suddenly decreased view count could have a direct revenue impact through those sponsorship deals.
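With made-up numbers (nothing here reflects real deals or rates), the exposure is easy to see, since a CPM-priced deal scales revenue linearly with counted views:

```python
# Illustrative only: a hypothetical CPM-priced sponsorship.
# CPM = cost per mille, i.e. payment per 1,000 counted views.
def sponsor_revenue(views: int, cpm_usd: float) -> float:
    """Revenue owed under a flat CPM deal."""
    return views / 1000 * cpm_usd

before = sponsor_revenue(500_000, 25.0)  # 500k views at a $25 CPM
after = sponsor_revenue(350_000, 25.0)   # same deal after a 30% drop in counted views
print(before, after)  # 12500.0 8750.0
```

So even if YouTube's own payouts are unchanged, a 30% drop in reported views cuts this hypothetical sponsorship income by the same 30%.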
They probably would prefer zero third party sponsors, because adding sponsored content dilutes the value of the on-platform ads. Features like “commonly skipped section” and the timeline view intensity histogram reduce the value of sponsorships.
But if they eliminated sponsors, creator revenue would drop significantly and so would content production.
The nuclear option would be to require all sponsored segments to register with YouTube. That would give YouTube way more control and dramatically reduce creators’ business flexibility (how do you tax a donated 3d printer?).
The WAN Show is very long and waffley, and strictly for fans. LTT clips segments of the show, but the relevant segment is still nearly 40 minutes long: https://www.youtube.com/watch?v=9JJ8dur6unc
I host videos on my own server and there's Vimeo and Mux. I guess you're saying it's the free-as-in-beer service that has a social network and recommendation network attached to uploaded videos.
Mux is new to me. Looks like a video-first headless CMS with some neat AI integrations.
Vimeo does have monetization tools [1] but they’re focused on direct sales.
YouTube is just way ahead… even if you ignore the ads platform, a YouTube premium subscription gives you WAY more ad free content than a Vimeo purchase or Floatplane/Nebula subscription.
> This piece was mostly written because I've been frustrated that YouTube is effectively the only place for user submitted video on the internet.
I realized this back in 2009 and tried really hard to start using other platforms, but wound up just not watching YouTube as often instead. I hope this changes. The only true competitors are places like TikTok and Instagram, but they don't feel like a true replacement to those of us who don't want to be tied to "social media". Still, YouTube Shorts are evidence that they do compete with YouTube directly.
I think YouTube even tried to have "IG Stories" at one point iirc.
One of the things that is notable about YouTube is that there was once competition (Vimeo and Dailymotion), but YouTube effectively outdistanced them. A bit like Amazon and eBay. There are also related things semi-competing, like Twitch.tv, of course.
I suspect that the situation with the earlier video providers is that they were "bleeding cash" for many years until the process finally reversed, if they turned out to be the winner (again, like Amazon).
I think this long capital-investment process is why no one wants to, or expects to, step into the ring with a large, successful player. It took that player a long time to learn to be successful; that player will fight you to keep their relative monopoly, and you will have to risk a lot of money.
YouTube content creators are effectively YouTube's suppliers. YouTube is squeezing them, and it's "normal": squeezing suppliers is part of the monopolist's playbook. It's unfortunately convenient for YouTube that people have been willing to make good-quality video for nearly nothing since the tools to do so became cheaply available.
That's why there is "no competition" for Nvidia, Amazon, YouTube, etc. Not that I like the situation, but it's not an "unnatural" one.
Structurally, there are only a few ways disruption can happen to a platform that has centralized hosting of both metadata and data: the disruptor also centralizes both, decentralizes just the data, or decentralizes both.
The second isn't viable in most real-world cases until something changes the huge expense of decentralized CDN fetching. My gut says the third would be on the losing side of almost every network effect.
> This piece was mostly written because I've been frustrated that YouTube is effectively the only place for user submitted video on the internet.
Well, technically there's lots of user-submitted video posted to p*rn sites... Apparently people have even started posting educational videos there, like math and neural networks and stuff.
I ditched Google Analytics on my blog because it more than doubled the size of any page load (the “Universal” version loaded a lot of additional JS). This was fine for a while, as I wasn’t posting much. Later, I wanted to write more. This might be shallow, but I found having numbers associated with my writing helped me keep at it. I also found it fun and useful to see what got traction. I tried Plausible for a while, but it was overkill for my needs, and I grew out of the first tier quickly. I then moved to Tinylytics [0], and have been enjoying that.
I'm guilty of having fallen off the RSS bandwagon as Twitter grew and Google Reader was killed off. But over the past few years, I've gotten back into the groove of curating a list of small blogs in an RSS feed. Catching up on my feed is such a nice way to start the morning. It's always great to see a new piece show up from a writer you like, but who posts infrequently. The fracturing of social media made me want to post more on my own site, and I'm seeing more posts from others who feel the same. I'm hoping this trend continues.
14kB is a stretch goal, though trying to stick to the first 10 packets is a cool idea. A project I like that focuses on page size is 512kb.club [1] which is like a golf score for your site’s page size. My site [2] came in just over 71k when I measured before getting added (for all assets). This project also introduced me to Cloudflare Radar [3] which includes a great tool for site analysis/page sizing, but is mainly a general dashboard for the internet.
Text, yes. Graphics? SVGs are not as small as people think, especially if they're any more complex than basic shapes, and there are plenty of things that simply cannot be represented as vector graphics anyway.
It's fair to prefer text-only pages, but the "and graphics" is quite unrealistic in my opinion.
How much is gained by using SVG (as opposed to a raster graphics format) varies a lot depending on the content. For some files (even ones with complex shape paths, depending on a couple of details) it can be an enormous gain, and for some files it can indeed be disappointing.
That being said, raw SVG suffers in that respect from the verbosity of the format (being XML-based and designed to be humanly readable and editable as text). But it would be unfair to compare, for the purpose of HTTP transmission, the size of a heavily compressed raster image with that of an uncompressed SVG file, as one would for desktop use. SVG tends to lend itself very well to compressed transmission, even with high-performance compression algorithms like brotli (which is supported by all relevant browsers and lots of HTTP servers), and you can serve pre-compressed files (e.g. for nginx with the ngx_brotli module) so that the server doesn't have to handle compression ad hoc.
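A quick stdlib-only sketch of the effect (using gzip as a stand-in for brotli, which needs a third-party package; the SVG itself is a made-up repetitive example, as generated vector art often is):

```python
# Rough demo that XML-heavy SVG compresses very well for HTTP transfer.
# gzip stands in for brotli here, which typically compresses even better.
import gzip

# A deliberately repetitive SVG: 100 nearly identical circle elements.
shapes = "".join(
    f'<circle cx="{x}" cy="50" r="4" fill="#336699"/>' for x in range(0, 1000, 10)
)
svg = f'<svg xmlns="http://www.w3.org/2000/svg" width="1000" height="100">{shapes}</svg>'

raw = svg.encode()
compressed = gzip.compress(raw, compresslevel=9)
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

On this kind of markup the compressed size comes out at a small fraction of the raw size, which is the form the browser actually downloads.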
If you want a fancy syntax highlighter for code blocks covering multiple languages on your website, that alone is about that size, e.g. the regex rules plus the regex engine.
Why do I care about fonts? Honestly, if my browser had an option to not load fonts and use my default to save load time, I'd choose that 19 times out of 20.
512kb is pretty achievable for personal websites. My next target is to stay within 99kb (100kb as the ceiling). Should be pretty trivial over a few weekends. My website is in the Orange on 512kb.club.
{
  "error": true,
  "name": "Error",
  "message": "css should not be empty",
  "stack": "Error: css should not be empty
    at module.exports (/usr/src/app/node_modules/penthouse/lib/index.js:206:14)
    at file:///usr/src/app/server.js:153:31
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)"
}
It changed with the introduction of HTML5. It was one of those ideas that's great for spec nerds, but flew in the face of previous standards compliance. It's something I used on a lot of sites in the 2010s that I no longer have access to change. I'm betting this will make some old sites built by standards fiends look weird in spots, but not break things too badly.
If you use GitHub Pages, you can then use the built-in local VS Code (“github.dev editor”, which is separate from Codespaces) to create and edit posts. I wrote about it here: https://anderegg.ca/2024/10/16/using-github-as-a-cms
I ended up using this on my iPad for the posts I wrote over the Christmas break.