Hacker News | palata's comments

If OnlyFans is not prostitution (and in my understanding it is not), then I don't see why it is surprising?

I genuinely wonder what "starting to become usable" means there. Is it the first time that anyone can load a simple static page? Or is it good enough to browse most of the web, but with some features missing, like... I don't know, WebRTC or passkeys?

A simple static page is not equal to 'starting to become usable'; that would be 'passes the first experimental test'. By 'starting to become usable' I mean Nextcloud and all related apps I've tested work as intended, and I'm posting this message from within servoshell. I've kept track of its progress just like I did back in the time of its spiritual predecessor - Mozilla, then Phoenix which became Firebird which became Firefox - and was until recently not able to get it to show much more than that simple static page you mentioned. While it may have passed a bunch of verification suites, it - servoshell if not the Servo rendering engine itself - was not usable for much more than showing that static page. Now, it is. Its usability resembles that of early, pre-1.0 versions of Firefox. I used those just like many others did, warts and all, and reported and where possible fixed bugs.

Why post this now for Servo and servoshell? Because it seems to be close to the same state Firefox was in when it became somewhat usable; at that point it became possible for others to start using it and reporting any bugs and hiccups. The more people do this, the faster the progress will be. A usable alternative browser engine, and with that usable alternative browsers, is a big, BIG thing and worth reporting on as well as supporting in its progress.


This is one of those companies whose website you can visit, and after 10 minutes still have absolutely no clue what they are doing. Other than making the news with a toxic CEO.

I am convinced it's actually what gets them to those dominant positions. They are selected for that, either because those who select them are the same kind of assholes, or because somehow we as a society believe that those are the "strong leaders" we need.

I would like to know which fraction of CEOs are not toxic assholes. Honestly it seems like a requirement for the job is: "you're the kind of person who needs power to have friends".

When you lose 80% of your staff, you failed as a CEO. And when you do it like this... you deserve to be absolutely disrespected by your staff (old and new).

What would have happened had they not fired them? Maybe some loss of profitability (and importantly, only maybe; probably not), but probably not to the point where 80% of the staff would have had to be fired. Also, LLM tooling has evolved a lot between 2023 and now, and employees who were skeptical back then may have started adopting LLMs where it made sense.


I think it does both: you can have an LLM automate bad coding (that's the vibe coding part), and you can have an LLM speed up good coding.

Many times, bad code is sufficient. Actually too many times: IMHO that is the reason why the software industry produces lower-quality software every year. Bad products are often more profitable than good products. But it's not always about making bad products: sometimes it's totally fine to vibe code a proof of concept or prototype, I would say.

Other times, we really need stable and maintainable code. I don't think we can or want to vibe code that.

LLMs make low-quality coding more accessible, but I don't think they remove the need for high-quality coding. Before LLMs, the fraction of low-quality code was growing already, just because it was already profitable.

An analogy could be buildings: everybody can build a bench that "does the job". Maybe that bench will be broken in 2 months, but right now it works; people can sit on it. But not everybody can build a dam. And if you risk going to jail if your dam collapses, that's a good incentive for not vibe coding it.


I personally choose not to depend on more wrappers. If I need to clone a git repo, I `git clone` it. Then I build and run the project using the build system of the project.

If the project properly uses a build system I am familiar with, then I don't really need to think. If the project does something exotic, then chances are that I will just give up on that project. But I don't think that your tool would help here: if it is exotic, then your tool probably won't know how to automate it.


That’s a totally fair take and I agree with you more than it might sound.

I’m not trying to replace cloning or proper build systems, and I don’t expect this to handle exotic setups. If a repo has a custom toolchain and good docs, I’ll still clone it locally. The problem I keep running into is before that point: when I’m skimming 10–20 repos to decide which ones are even worth the effort. A surprising number either don’t run anymore, depend on unstated versions, or silently assume a local setup that isn’t obvious from the README.

For me, even a fast failure with a clear reason (“missing env var”, “custom toolchain”, “expects GPU”, etc.) is useful: it tells me whether to invest time or move on, without polluting my machine or context-switching into setup mode.

So I think of this less as a wrapper around build systems and more as a disposable “is this repo alive?” check — something you use before you decide it’s worth cloning.
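
To make that concrete, the core of it is roughly the following. This is a minimal sketch written for this comment, not the actual tool: the function name, the marker-file table, and the short failure strings are all made up for illustration.

```python
# Hypothetical sketch of the disposable "is this repo alive?" check.
# is_repo_alive, the marker-file table, and the verdict strings are
# invented for this example; they are not the real tool's API.
import subprocess
import tempfile
from pathlib import Path

def is_repo_alive(repo_url: str, timeout: int = 300) -> str:
    """Shallow-clone into a throwaway directory, try the obvious build
    entry point, and return a one-line verdict instead of a build log."""
    with tempfile.TemporaryDirectory() as workdir:
        clone = subprocess.run(
            ["git", "clone", "--depth", "1", repo_url, workdir],
            capture_output=True, text=True, timeout=timeout,
        )
        if clone.returncode != 0:
            lines = clone.stderr.strip().splitlines() or ["unknown error"]
            return "clone failed: " + lines[-1]
        # Marker files and the build command each one implies.
        candidates = {
            "Makefile": ["make"],
            "CMakeLists.txt": ["cmake", "-S", ".", "-B", "build"],
            "Cargo.toml": ["cargo", "build"],
        }
        for marker, cmd in candidates.items():
            if not (Path(workdir) / marker).exists():
                continue
            result = subprocess.run(
                cmd, cwd=workdir, capture_output=True, text=True, timeout=timeout,
            )
            if result.returncode == 0:
                return f"builds with {cmd[0]}"
            lines = result.stderr.strip().splitlines() or ["unknown error"]
            return f"{cmd[0]} failed: " + lines[-1]
        return "no recognized build system"
```

The whole point is that the caller only ever sees that one short string: that's the "invest time or move on" signal, and the temporary directory means nothing sticks around on your machine afterwards.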

That said, I’m genuinely curious: when you give up on an exotic repo today, is it because the setup is unclear, or because you’ve already decided it’s not worth the effort? That distinction is what I’m trying to understand better.


> when you give up on an exotic repo today, is it because the setup is unclear, or because you’ve already decided it’s not worth the effort

Depending on a third party adds risk. And when I depend on a third party, I need to convince myself that I could either easily replace it or fork it and take over its maintenance. Because I am responsible for the software that I ship, and third parties are part of it.

If it has an exotic setup for no apparent reason, that's already a red flag. If they need to wrap a CMakeLists into a Makefile into a Python script into a shell script, maybe that is not the kind of software quality I want to depend on.

In other words, if I can't easily build the project, it's probably not good enough for me to depend on it.


I am a Linux fanboy and I totally agree that I am almost always near a plug and don't need that kind of battery life.

But when I can go for days on my work Macbook without charging (and I am a developer, so I do compile stuff), I kinda wish I could have that on Linux, too.

And again, I don't need it. Just like I don't need a fast Internet connection, but well... :-).


I am a Linux fanboy. But man, when you try the battery life on the latest Macbooks... it can last for days of work without charging.

> What I honestly find more baffling is that they thought the Vision Pro would sell well.

Those monopolies seem so scared to "miss the next smartphone" that they invest heavily in whatever their competitors do. Everybody was running after VR/AR headsets, now everybody is running after AI.

They see the others run somewhere, they run in that direction. Just in case.


> Those monopolies seem so scared to "miss the next smartphone" that they invest heavily in whatever their competitors do.

Monopolies so scared of the competition?


Yes, because of the innovator's dilemma.

That is how Kodak lost digital photography, how Microsoft lost tablets and phones even though it had them for a decade before the competition, and so on.

Monopolies double down on what they know that prints money, and are averse to taking any risks.

