The philosophy that was not being understood was "move fast and break things." "Slow is smooth, and smooth is fast" was mentioned as an opposing point of view.
To then explain "slow is smooth, and smooth is fast" back in a reply is to not comprehend the comment at all. And then it ends with a link to their own blog.
The top-level comment is fine. The lame guy's comment was a promotional, ChatGPT-generated, useless tl;dr that added zero information and linked to his own blog post.
This whole thread is a trainwreck. Your initial comment is three simple sentences with very little room for misunderstanding, yet here we are. Then there is a comment on that comment which is self-promotion of LLM trash published as a blog post. One would think it should be an easy downvote, but it is not. Then a dude who pointed out this lame self-promotion is downvoted into oblivion, because what? A bunch of people cannot think for three seconds and use their eyes to try to understand what's lame about that?
"this" doesn't indicate which one it's referring to. Obviously they understand the effects of "move fast and break things", so it makes sense it would refer to the other one. Doubly so they quoted one but not the other, which is often done in contexts like this to indicate you're repeating it verbatim because you don't understand it well enough to paraphrase.
In that case, scrolling down, the other replies don't get it quite right either. An alternate way of phrasing that one would be "innovation over stability/perfectionism". It came from Facebook, where users can tolerate some minor breakage, in an era when they were cranking out all sorts of features and overtaking MySpace. I think the idea is generally understood to be a good thing in the startup stage, where the goal is to disrupt existing competition: if you take too long to get to market, whatever you're doing might not matter anymore.
Well, I mean (a) it is unwise to accumulate a big pile of paper money (or the equivalent in a bank), like a common Disney duck plutocrat; if you do this, you will in the long run lose money in real terms. By design. Money (or at least modern money; things were somewhat different in persistently-deflationary pre-industrial economies) is not designed for investment. Money should not be the comparison; it’s a tool, not an investment vehicle.
But (b) money is given some real grounding by states; you can pay tax with it, in particular. It’s not much, and it’s not bulletproof, but by comparison to cryptocurrency, well, it’s _something_.
But really, it makes no sense as an argument that hoarding cryptocurrency is a good idea, because hoarding _money_ is, notoriously, a _bad_ idea.
Modern money has operated as you describe for less than one human lifetime. It might not work long-term, and indeed there are significant reasons for worry that it won't. Not that it _can't_ in principle, but that it _won't_ due to mismanagement and misaligned political incentives. A future alternate system might well have deflationary characteristics.
If you mean Bretton Woods, that is in many ways an implementation detail. Ignoring the Great Depression (which probably _should_ be treated as a special case) and brief (~1 year) shocks after WW1 and WW2, the last time the US, to take an example, saw _persistent_ deflation was the late 19th century.
> Gold has its uses. It’s a pretty metal and has practical uses in engineering and medicine.
Would you say gold’s value is determined mostly by its practical use? Because it seems demand is largely driven by speculation as an alternative currency.
I don't see who is refusing to sell you a phone with various attachments that make a workable computer (especially given third-party providers). I just bought such a setup with an iPad. You could do something along the same lines with an iPhone, I suppose, but I'd probably find it unsatisfactory. I'm still on the fence about whether the iPad is a fully satisfactory companion device for traveling in general.
In the Apple ecosystem, the one "refusing to sell you…" is Apple: it not only withholds the required peripherals but also doesn't make the software work that way. Because if it did, then you wouldn't need to buy a Mac.
Macs are not especially relevant to Apple's revenue at this point. An iPad plus a magnetic keyboard is getting pretty close. I'm still not sure that better multitasking would push me over the edge to giving up a MacBook for serious day-to-day multitasking work.
I do think they're trending in that direction but I actually like that they're not pushing people faster than feels comfortable. I expect to see a convergence of iPads and at least MacBook Airs but we're not quite there yet.
I think it's the right call, since there isn't much competition in the GPU industry anyway. Sure, Intel is far behind. But they need to start somewhere in order to break ground.
Speaking strictly strategically, my intuition is that they will learn from this, course-correct, and then start making progress.
The idea of another competitive GPU manufacturer is nice, but it is hard to bring into existence. Intel is not in a position to invest lots of money and sustained effort into products whose market is captured and controlled by a much bigger and more competent company at the top of its game. Not even AMD can gain more market share, and they are much more competent in GPU technology. Unless NVIDIA and AMD make serious mistakes, Intel GPUs will remain a third-rate product.
> "They need to start somewhere in order to break ground"
Intel has big problems, and it's not clear they should occupy themselves with this. They should stabilize, and the most plausible way to do that is to cut the weak parts and get back to what they were good at - performant, secure x86_64 CPUs, maybe some new innovative low-power CPUs, maybe memory/solid-state drives.
That's a very low-margin and cyclical market, since memory/SSDs are basically commodities. I don't think Intel would have any chance of surviving in such a market; they just have way too much bloat/R&D spending. Which is not a bad thing, as long as you can produce better products than the competition.
Yeah, but they can saturate fabs and provide income, which Intel needs. Intel can't produce better CPU/GPU products than their competition now. Their design and manufacturing of CPUs has had serious problems for years. The big money in GPUs is already captured by NVIDIA, and it's hard to see how Intel can challenge that - people want NVIDIA and CUDA. So Intel should cut down and focus the remaining bloat and R&D spending on the areas where it's plausible they can get competitive in a reasonable time. That is CPUs, and maybe memory and SSDs - they have X-Point, which was great, just marketed and priced wrong.
If you happen to restart one of the instances with a thread hanging in an infinite loop, you can wait a very long time until the Java container actually decides to kill itself, because it did not finish its graceful shutdown within the allotted timeout period. Some Java containers have a default of 300s for this.
In this circumstance kill -9 is faster by a lot ;)
Also, we had circumstances where the affected Java container did not stop even when the timeout was reached, because the misbehaving thread consumed the whole CPU and none was left for the supervisor thread. Then you can only kill the host process of the JVM.
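The mechanism is the same in any cooperative shutdown scheme, so here is a minimal sketch in C rather than Java (the handler and loop are purely illustrative): a SIGTERM handler sets a flag, but the busy worker never checks it, so a graceful stop never completes and only SIGKILL works.

    #include <signal.h>

    static volatile sig_atomic_t stop_requested = 0;

    static void on_sigterm(int sig)
    {
        (void)sig;
        stop_requested = 1;   /* cooperative: only helps if the loop checks it */
    }

    int main(void)
    {
        struct sigaction sa;
        sigemptyset(&sa.sa_mask);
        sa.sa_flags = 0;
        sa.sa_handler = on_sigterm;
        sigaction(SIGTERM, &sa, NULL);

        for (;;) {
            /* misbehaving worker: pegs the CPU and never checks stop_requested,
               so `kill <pid>` (SIGTERM) has no visible effect; only
               `kill -9 <pid>` (SIGKILL, which cannot be caught) stops it */
        }
    }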
I don't agree. K&R 2nd edition only goes up to C89, while C99 added tons of language changes which make C a much friendlier (and dare I say: safer) language. The two biggies are designated initializers and compound literals, but there are also tons of smaller things, like declaring variables anywhere, "for (int...", a proper bool type, fixed-width integer types (uint8_t, ...), etc. (see the sketch below). I would even go as far as to say that C99 is the most important C version and almost feels like a new language compared to C89 (after C99 the changes were mostly incremental, though).
The 90s were the decade when C saw its biggest improvements, and K&R 2nd edition stops just short of that.
(the book is still a good read as an interesting historical artifact - it contains a lot of wisdom that goes beyond language details - but as a language reference or learning resource it is hopelessly outdated)
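For anyone who hasn't used them, a quick sketch of those C99 features (the color_t struct and print_color helper are made up for the demo):

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    typedef struct { uint8_t r, g, b; } color_t;

    static void print_color(color_t c)   /* hypothetical helper */
    {
        printf("#%02x%02x%02x\n", (unsigned)c.r, (unsigned)c.g, (unsigned)c.b);
    }

    int main(void)
    {
        /* designated initializers: set members by name, the rest are zeroed */
        color_t red = { .r = 255 };
        print_color(red);

        /* compound literal: an unnamed object built in place */
        print_color((color_t){ .r = 255, .g = 165 });

        /* declarations anywhere, "for (int ...", a real bool type,
           and fixed-width integers (uint8_t above, int32_t here) */
        for (int32_t i = 0; i < 4; i++) {
            bool even = (i % 2 == 0);
            printf("%d is %s\n", (int)i, even ? "even" : "odd");
        }
        return 0;
    }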
C17 is the best version to target. It is a bugfix release of C11, so the feature set is older and more widely available than the date suggests. C11 added atomics, which are necessary for multicore programming (quick sketch below).
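Roughly what that looks like - a minimal, illustrative C11 sketch where two threads bump a shared counter without a data race (note that <threads.h> is an optional part of C11; glibc has it since 2.28, and on platforms without it you'd use pthreads instead):

    #include <stdatomic.h>
    #include <stdio.h>
    #include <threads.h>   /* optional in C11; fall back to pthreads if missing */

    static atomic_int counter;   /* static storage: zero-initialized */

    static int worker(void *arg)
    {
        (void)arg;
        for (int i = 0; i < 100000; i++)
            atomic_fetch_add(&counter, 1);   /* atomic read-modify-write */
        return 0;
    }

    int main(void)
    {
        thrd_t t1, t2;
        thrd_create(&t1, worker, NULL);
        thrd_create(&t2, worker, NULL);
        thrd_join(t1, NULL);
        thrd_join(t2, NULL);
        printf("%d\n", atomic_load(&counter));   /* always 200000 */
        return 0;
    }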
It's another way of doing things and not necessarily incompetence.