I was a very early customer of Roomba and loved them when they came out. I had pets at the time, and the machine would consistently fail in about 14 months. I finally figured out that I needed to buy them from Costco, so that I could get them replaced.
Rather than taking their lead and improving the product, they just sat there with the exact same product for like 10+ years. It was outrageous.
I guess Rodney Brooks got busy with other interests, and whoever ran things didn't realize that Tim Ferriss is full of shit.
It was extremely frustrating to watch these assholes destroy the company right from the outset. All they needed to do was slowly walk forward and iterate with improvements.
The only surprise in this news is that it took SO LONG for them to dismantle the company.
I do not think it's appropriate that an organization holding this much deeply personal data can be sold to any foreign entity.
> Rather than taking their lead and improving the product, they just sat there with the exact same product for like 10+ years. It was outrageous.
I think that this is actually the only viable strategy for a hardware product company in the current world.
As soon as your product is successful, it will be cloned by dozens of Chinese companies and dumped on the market everywhere. Any update you make from there on out will immediately be folded into all those products selling for 10% what you do. In a couple years, they'll all be better than yours, and still way cheaper.
So you have to do the Roomba thing or the GoPro thing, where you iterate behind the scenes until your thing is amazing, release it with a big Hollywood launch, get it turned into the noun and verb for your product category and the action that it does.
But then you have to do what those companies didn't do: Fire everybody and rake in as much cash as possible before the inevitable flood of clones drowns you.
I have a few really good hardware ideas, but I don't believe I could ever market them fast enough and far enough to make it worth spending the R&D to make them happen.
> I have a few really good hardware ideas, but I don't believe I could ever market them fast enough and far enough to make it worth spending the R&D to make them happen.
Yeah, almost a decade ago I had a dream of creating a drone startup with some very specific tech that would have required several years of R&D to create. The end product would have been relatively cheap to manufacture, being basically a PCB with a large FPGA plus a bunch of relatively cheap sensors.
I actually got about 6 months into the project, and then realised that, if it worked well, I'd be able to make units for about 25% of a viable RRP and recoup all my time doing R&D without an income with maybe 5k units in direct sales. And then it slowly dawned on me that if I could build it for 25% of a viable RRP, then the Chinese cloners could do it even cheaper, and all they'd have to do was reverse engineer the protection on the FPGA bitstream to clone it, and clone a pretty simple PCB. At the time, the drone market was full of cloned components sold for a fraction of the original price, or clones of open source projects sold for half the price of the official boards that supported the project.
In such a situation, the only way to really survive is to innovate faster than the cloners can copy you, but that's kind of predicated on shipping a product that you know from the start isn't what you want the final product to be, so that you can drip-feed improvements into the market every time the previous version gets cloned. That would also have the side effect of alienating the early adopters, as well as making new customers wonder how long it'd be before the new product was obsolete. Ultimately, I decided that realistically it wasn't viable to continue doing R&D for another couple of years, unsure if I'd actually be able to pay myself going forward.
> If society no longer values these qualities, then we don't deserve better.
Isn't it more like "if society has time to think about and can afford those qualities"?
If most folks out there have limited finances (CoL-relative, of course) and are just scraping by, they'll buy the cheapest thing out there that just does the job (vacuums) and tend to ignore any extra luxuries, even if those would be more economically advantageous long-term (repairs/maintenance as part of the TCO). That's simply because of the focus - it's more on the account balance, due bills, and next paycheck than on the implications for a more distant future. Crazy volatility and all the global rollercoasters like pandemics, wars, and all the crazy politicians around the world don't help regular folks' sensible decision-making at all, of course. The more stressed one is, the less rational they act.
People don't buy cheap junk because they don't value quality. They buy it primarily because of affordability reasons, or because their focus is forced to be elsewhere.
>People don't buy cheap junk because they don't value quality. They buy it primarily because of affordability reasons, or because their focus is forced to be elsewhere.
The focus, thanks to years of advertising, has shifted towards features. New features sell, quality doesn't, so to keep the price point and "innovate", manufacturers need to lower the quality, knowing that a new version will replace the device soon. Most consumers see this as normal, so when a poorly designed and cheaply made thing (for which there are no replacement parts, no repair info, no software/firmware) fails, it's just an excuse to purchase the new shiny iteration with all the bells and whistles (and AI! copilot toaster!), which is going to last less than the previous one but now needs an "app", an activation, and a subscription for the premium features.
Having done the hardware game, it's not so much the clones that get you, it's the VCs/shareholders.
You need a lot of money to make hardware, so you get vc money and eventually shareholder money. But if you're not selling new hardware all the time, the company isn't making money. So they dictate that you need to make new hardware, yearly.
Making new hardware yearly is enough of an undertaking that you no longer have time to iterate on the software that could enable new features. And often hardware iterations aren't going to change that much, it's hard to "invent" new hardware. It's better to make a hardware platform that enables new exciting features, and iterate on the software. But that isn't going to sell yearly.
So unless you have a software subscription model that people love, every hardware company tends to stagnate because they are too busy making hardware yearly to make "better" products.
You see this very clearly in cameras vs phones. The camera companies are still making cameras yearly, but none of them incorporate the software features that have led phones to outpace them. A lot of phones with so-so cameras take better pictures (to the average eye) than actual cameras because the software features enhance the photos.
I worked on firmware for such a "noun and verb" product that IPO'd a decade ago, and lived the struggle in real time.
This isn't unique to China, it's just the nature of modern manufacturing. The only reason China stands out is because we offshored our manufacturing there, so it's where we see it happen.
I feel like people forget that the entire purpose of factories/automation/modern manufacturing was to divorce human skill from product worth (so that companies wouldn't have to pay workers based on skill). That also means that in the realm of physical goods, "moats" are not maintainable unless you have a manufacturing technique or technology that others don't. Since companies rarely create their own production line machinery, anyone else who can afford the same machines can produce the same products.
The actual "viable strategy for hardware companies" has to be about market penetration; make products that aren't on Amazon, for example, and Amazon can't be used to out-maneuver you. Firearms are a great example of where manufacturing capability does not equal competitiveness; China can absolutely produce any firearm that you can buy in the US, but they don't because other factors (mostly related to regulatory controls) created a moat for manufacturers. Vehicles are another good example. Good luck buying an Avatr car in the US.
But yes, if you plan to make a vacuum, which is just you iterating on what others have done as well, you should probably expect that people are going to trivially iterate on your variant too.
While those patents are not enforceable in China (unless equivalents were also filed in China -- unsure if they would be worth much), they would be enforceable against infringing products imported into the US. This is one of the reasons the ITC exists, and it played a prominent role during the smartphone patent wars. So at least the US market would be protected from knock-offs.
The smartphone wars were fought among tech giants, not capital-intensive hardware startups. The problem with patents is that you need to already be financially successful enough to file them, be able to pay to protect them in court, and be able to float your company's operating costs long enough to see them enforced and rewarded, which may take years.
Yes and no -- filing patents is quite affordable (probably outdated info, but I recall average costs for drafting and filing were ~$10K per patent, with most of the cost related to the drafting rather than the filing.) Compared to all the other capital investments required for hardware startups, these costs are negligible.
But you're totally right that enforcing them is extremely expensive, slow and risky.
That said, Roomba isn't exactly a startup but wasn't a tech giant either, and did enforce their patents often.
And especially against imported infringing products, the ITC provides a much cheaper, faster mechanism to get protection via injunctions.
That's why the ITC is so relevant here: it is relatively speedy compared to regular patent trials, and it has the power to issue injunctions against imports (which is partly why it was relied on a lot during the smartphone patent wars.) So you may not collect damages from Chinese companies, but you can completely block their infringing imports into the US and deny them US revenue.
> If you're really the first, you should be able to get about a 20 year head start.
That's an opinion, and not one I agree with.
If you and your competitor are racing to develop a thing, whoever wins by a couple months shouldn't get a monopoly for decades.
Most of the time when things get patented, it's strictly worse for innovation in that space until the patents expire. 3d printing is a great example.
It's asinine to think you can outsource manufacturing of whatever object to some other company in another country, but that no one on the planet can make the same thing because "the idea is yours".
> Most of the time when things get patented, it's strictly worse for innovation in that space until the patents expire.
What happens at expiration is an important and intended feature of patents. They trade a legally guaranteed head start for the requirement that you publish your methods for your competitors to learn from.
Coasting on their patents is exactly why iRobot went bankrupt. If they had a proper incentive to continue innovating, they might be around today. Instead, the patent system incentivized them to erect a tollgate and snooze away in the booth next to it.
Zhaoxin makes x86 chips, and countless companies make ARM and RISC-V chips, with SMIC as a foundry.
> The Chinese Microsoft?
Baidu, Tencent, Alibaba, ByteDance.
>The Chinese Boeing?
Comac makes passenger jets, and Chengdu makes fighter jets.
>The Chinese NVIDIA?
Huawei makes AI GPUs.
>People forget that the US is still the #2 manufacturer in the world
Considering the US never had its industry blown up in any war and could reap the benefits of 150+ years worth of stability, higher education, skilled immigration, compounding wealth, and taking over the vacuum and brains of Europe's post-war industrial powers, that's not really something THAT impressive.
>and that's (apparently) without halfway trying.
If it isn't halfway trying, why does it feel the need to sanction or ban Chinese competitors?
>> US companies can’t beat Chinese companies completely subsidized by their national government.
> Except our companies do just that, all the time. Who is the Chinese Intel? The Chinese Microsoft? The Chinese Boeing? The Chinese NVIDIA?
Where are the new ones?
Also Intel is not doing well, and the Chinese (after a fashion) Intel is TSMC, who also does NVIDIA's manufacturing.
> People forget that the US is still the #2 manufacturer in the world, and that's (apparently) without halfway trying.
So? That fact sounds like pablum. I think the real story of US manufacturing has been one of erosion of capabilities and long-term loss of strength. The US may still have a high ranking, but I'd bet: 1) much of that is low-volume and legacy, 2) second place is still only 60% of what China does.
> People forget that the US is still the #2 manufacturer in the world
Manufacturer of what, exactly, though?
What do you export? What do you sell?
Food? Nope, illegal in most of the world.
Cars? Nope, uncompetitive in most of the world. "High end" American cars lack even basic features fitted to poverty-spec cars in the EU, like heated windscreens.
Computers? I'm typing this on a computer assembled in Scotland onto a Latvian-made chassis using a Chinese-made motherboard populated with Korean memory chips and an Israeli microprocessor.
What does the US actually make and sell, any more?
>I have a few really good hardware ideas, but I don't believe I could ever market them fast enough and far enough to make it worth spending the R&D to make them happen.
Then make a nice blog post, translate it to Chinese (hell, I'll pay a professional translator for you) and post it on the internet so that someone in Shenzhen can try it.
Just post your ideas to crowdsourcing websites and wait for the AliExpress clone to appear: zero R&D costs, zero dev and manufacturing/QA! That said, Taobao and Ali are so full of bizarre products (transparent rubber domes to be able to type with 5cm long nail extensions) that it will be a challenge to stand out.
I wonder why nobody has tried to beat the Chinese companies at their own game. The whole schtick is: take a product that people like, vertically integrate and drive down costs. This is like the purest form of capitalism.
Tesla arguably did try exactly that. It:
* built a lithium refinery
* produces its own battery cells
* makes its own motors and drivetrains
* makes its own car seats
* owns and operates a fast-charging network
* sells direct, bypassing dealerships
* offers insurance integrated with vehicle data
* develops its own autopilot AI
Great point, and to drive it home -- TSLA is the only competitive non-Chinese company in the EV space. You could make the argument that it's one of the very few successful U.S. manufacturing companies winning on purely technical/capitalist terms, considering the whole U.S.-Taiwan stranglehold on chip mfg.
> You could make the argument that it's one of the very few successful U.S. manufacturing companies winning on purely technical/capitalist terms
Except it's not winning on that at all. It's "winning" because Chinese EV brands are barred from selling in the US. You can't buy an Avatr if you want. It's in fact protectionist regulations that allowed Tesla to retain EV dominance in the US, in the face of Chinese competition.
Tesla was very popular in the Chinese market and globally, including in markets where Chinese EVs aren't banned, until literally this year, which I'd argue is due in part to the trade war.
The real shtick is to run the economy in a closed cycle to keep the currency weak. Or the good old 1930s trade bloc economy. They're not just good at optimizing costs, they charge appropriately in CNY and inappropriately in USD. Workers don't care about obscene undervaluation in USD so long as they have bacon on the table after a few hours of work.
It's not that rare that Chinese products are sold below the cumulative costs of Western equivalent products and services, let alone their prices. The Chinese economy (substitute whichever East Asian nation is appropriate, past and future) just isn't coupled with the rest of the world well enough for USD-converted cost calculations to work. In economic theories this is sometimes explained as exports of starvation and/or overproduction, but IMO that makes less sense when they've been doing it at scale for multiple decades.
The craziest example of these is Chinese PCB prototyping services: as cheap as $2 per 5 pieces, with $5 extra for complete assembly and $15 shipping. $5 each would be darn cheap in the rest of the world; even $50 each for the board and $150 per assembly would not be so absurd. There's just no competing with that.
> I wonder why nobody has tried to beat the Chinese companies at their own game. The whole schtick is: take a product that people like, vertically integrate and drive down costs. This is like the purest form of capitalism.
I think there are a lot of different reasons:
1. A lot of those Chinese competitors are involved in extremely intense cut-throat competition, which drives a lot of innovation that benefits a lot of stakeholders except investors (IIRC the term is "involution"). In the US, the investors are almost literal kings and their returns are paramount, and they'll even throw their own country under the bus if it means their returns are higher.
2. The US (in-general) has been letting its manufacturing capabilities wither for decades, while China has been building them up. Even if you wanted to beat the Chinese companies at their own game, the skills, suppliers, and scale to do that aren't available in the US anymore.
3. Working conditions in China are atrocious and pay is lower, which really helps if you're trying to undercut on cost.
The promise didn't pan out for us. You have to prepare and cordon off the floor, and the unit gets stuck half the time. Somehow it's exactly the right height to get wedged under furniture.
My wife got one to try and automate away the vacuuming. We went through the same thing, and it still needs to be babysat anyway. Given the sheer amount of time and frustration involved with basically all robot vacuums, it has been easier to just get a nice Kenmore upright bagged vacuum and do it yourself (and the results are basically always way nicer).
We've had the Chinese Dreame with Valetudo (so fully disconnected from the cloud), and it works honestly very well. It doesn't get stuck, avoids my son's toys rather effectively, and just works. I run it when we leave home.
There's honestly no reason why Roomba couldn't have made something equivalent to the Dreame; they could have competed with the Chinese manufacturers on features and by allowing users to easily disconnect from the cloud. They didn't because the company was completely mismanaged and their products barely evolved.
I'm pretty optimistic about biped robots for this. Either you buy or lease one and a cleaning service teleoperates it, or subscribe to a cleaning service that drops it off and teleoperates for a few hours a week. Suppose it could even walk itself to the next customer if close enough.
If a wheeled low-profile vacuum that stays in a room is too hard to deliver, surely we can fix it by making it walk around and grasp a vacuum cleaner and walk between houses.
They could improve the design and get people to replace their machines with the improved ones, repeat and repeat.
Or they could sell the broken design and people would just buy more as they broke. They didn't care that Costco was eating the cost with its in-house warranty.
The fundamental problem though is the same with all "household gadget" products. They look cool, and appear to solve a problem, but that is actually all a perception based on novelty. They actually don't work very well, they are not built very well, and they don't last very long. There's no point in improving them because the concept is fundamentally something people don't need in the first place.
Just buy a good canister vacuum and you're set for a decade or more. It will cost more than the latest gadget from Shark or Dyson or iRobot but it won't frustrate you and it will just reliably do what it is supposed to do without uploading anything to an IP address.
> Just buy a good canister vacuum and you're set for a decade or more. It will cost more than the latest gadget from Shark or Dyson or iRobot but it won't frustrate you and it will just reliably do what it is supposed to do without uploading anything to an IP address.
Cords suck. So I bought a cordless vacuum, and was able to vacuum more. But I also needed a mop because vacuums don't do well enough on my laminate; stuff still gets stuck on. So I bought a cordless mop, so I could mop more. This worked great for a while but...
But it turns out if I did my vacuuming and mopping every night, I could keep my floor in better condition. I don't have time for that, but a robot from Eufy does and doesn't cost much compared to how much I would benefit from it.
Luddism on HN is a bit weird, but I get it, some people don't see the point of automating these tasks because their lives aren't complicated enough yet (e.g. they don't have kids, or have lots of free time and energy to spend on house work).
Cordless is nice until the batteries won't charge anymore. Or the charger stops working. Or you forgot to charge it and now want to use it. Or the charging connector gets worn and unreliable. Then you have an expensive battery replacement or other repair or (more likely) you just replace the whole device because it was made to be unrepairable, and now you have several pounds of plastic and e-waste to dispose of.
Dealing with plugging a cord into an outlet is no more burdensome than picking up the socks or shoes before the Roomba wakes up and tries to ingest them.
If the batteries don't work anymore, I buy a new vacuum. My Dyson was last updated in 2020, it is 2025 now, so I think it is working out? The charging dock works great for not forgetting to replace it.
I guess this is how people felt when they moved from wired phones to wireless phones?
> Dealing with plugging a cord into an outlet is no more burdensome than picking up the socks or shoes before the Roomba wakes up and tries to ingest them.
And dragging the cord around, and having to unplug and re-plug the cord because you want to do a different part of the room.
I'm also a Miele canister vacuum owner, and everywhere in my house where I vacuum is within range of a wall outlet. When I'm done, the cord retracts into the vacuum so I don't need to wind it or stow it myself. I guess, for me, that takes care of the issue to a great enough extent that I just never saw an advantage that justified the expense?
If you are ok with it, I think that's fine. Cordless to me is a huge productivity boost since I can just pick it up and vacuum whenever. I think most people see it as a huge win, but I haven't conducted a formal poll or anything.
Having a robot do everything is just another step in the convenience direction. It is great if you have expensive floors that you want to maintain on a daily or bi-daily basis.
> They could improve the design and get people to replace their machines with the improved ones, repeat and repeat.
> Or they could sell the broken design and people would just buy more as they broke. They don't care if Costco was eating the cost with their in-house warranty.
This strategy has limits, and I think iRobot hit those, and they didn't lower themselves to the second strategy of selling cheap unreliable garbage (at least not before 2019, which was the last time I bought a Roomba).
> The fundamental problem though is the same with all "household gadget" products. They look cool, and appear to solve a problem, but that is actually all a perception based on novelty. They actually don't work very well, they are not built very well, and they don't last very long. There's no point in improving them because the concept is fundamentally something people don't need in the first place.
I'd dispute this in this case: Roombas may not have solved the vacuuming problem for everyone, but they solved it for me (at least), and they were built pretty well (reliable, modular & reparable design, etc.).
> Just buy a good canister vacuum and you're set for a decade or more. It will cost more than the latest gadget from Shark or Dyson or iRobot but it won't frustrate you and it will just reliably do what it is supposed to do without uploading anything to an IP address.
1. I've got both, and the Roomba works a lot better than not vacuuming with the canister vacuum at all. It doesn't frustrate me, and it took far less time to Roomba-proof my home than vacuuming it every week for a year.
2. I agree with the IP address thing, but I think that only got added when they attempted to "get people to replace their machines with the improved ones." I have a couple of the older models that have no network connection (and had no plans to buy more due to the unnecessary network requirement).
I remember the "upgrade for pets" option, which... didn't work. After buying the maxed out version I realized that the product simply had a long, long way to go - but iRobot did nothing with it other than launch new segments like "upgrade for vision based mapping" etc.
They were saying that whoever was running things at Roomba must have been duped by the 4 hour work week bs because nothing was getting done. Specifically whoever took over operations, planning, and product improvements from Brooks.
The product itself still exists as Adobe Animate, I think (or one of the Adobe CC tools). It's just as good or better than it ever was, with the same workflow. But instead of exporting to SWF now people just export to video and share it on video platforms. Lots of great stuff still being done with it on Youtube.
Flash was a poorly written piece of software. It had numerous bad memory leaks and was a CPU hog. It was never allowed on the iPhone probably because it would have drained the battery really quickly. On top of that, HTML5 was starting to catch on and could eventually do everything Flash could, and do it better, without the memory leaks and poor CPU usage. I have the very unfortunate claim to the title of being an engineer on the world's biggest Flash/Flex app. The memory leaks were so bad that Adobe advised us to just restart the app periodically -- despite Adobe marketing Flex as enterprise ready. We found compiler bugs for Adobe. Adobe and Jobs didn't set out to destroy it. Macromedia wrote bad code that performed poorly, and it wasn't worth the effort for Adobe to fix it once HTML5 won.
None of that matters for the kind of creative work the grandparent likely had in mind.
Perhaps there was a memory leak in Unidentified Flying Assholes or the endless line of punch-a-celeb games or the thousands of stick fight productions and so on, but no one cared, and people enjoyed them immensely anyway. You could do something cool without ever learning about things like memory leaks or vulnerabilities in the underlying platform.
> None of that matters for the kind of creative work the grandparent likely had in mind.
Some of that did, at least for how that creative work was almost exclusively delivered to the world. Those bugs were not just excessive resource usage and instability, they were incredibly often exploitable security flaws that were regularly weaponized against a huge swath of internet users. The ubiquity of the Flash browser plugin was simultaneously one of the greatest strengths of Flash as a creative platform and one of the greatest risks to the average person browsing the web in the 2000s.
The plugin needed to die. Unfortunately the Flash community was so firmly built around the web plugin as their distribution method of choice (presumably because many of us were browsing animations and playing games at work/school where we couldn't necessarily download and run arbitrary .exes) that the plugin was more or less a diseased conjoined twin, and when it died the community didn't have long left.
Compare this to Java where the death of the browser plugin caused a number of badly designed banking sites to have to be redesigned in a less stupid (but quite often still very stupid) way but the community as a whole continued on without huge disruption. The browser plugin was just one of many places Java existed, it wasn't the dominant focus of the community.
Yeah, it's kinda crazy people are brushing over the security issues. The nostalgia is huge, I get it, but Flash was terrible for browsing the internet at the time.
I think they’re referring to the flash plugin itself. It enabled a vast amount of creative work and it enabled vast exploitation of users’ browsers. I worked as a tech at a consumer-focused computer store from about 1999-2005. It was a wild wild world back then. The vast majority of our time was spent removing viruses, browser toolbars, Bonzi Buddy and friends, and helping people understand how their online banking passwords got stolen by the shady porn site they like so much.
It was not a CPU hog - this is a myth that needs to die. The Flash runtime was pretty modest.
Now, the code people wrote was a CPU hog, because lots of non-coders were writing code and they would do anything to make it work. The Flash runtime was not causing the Punch the Monkey ad to peg your CPU; it was pegged because the Punch the Monkey ad was fucking awful code.
All those Flash programmers went on to write the first wave of HTML5 stuff which, shock horror, was vastly CPU inefficient.
Same here, I somehow acquired a pirated copy of Flash when I was 10 or 11. I went through the included offline manual and within a few days somehow knew I'd probably end up doing this programming thing for the rest of my life :D
It's sad what happened to Flash, sure we have plugin free interactive content using JS but I'm not sure if anything has replicated the IDE. Though I guess the decline can also be attributed to the users moving onto other platforms. The kids making games moved on to making Android/iOS games and the animators moved to Youtube.
We could have kept that creative environment (which seems to have just disappeared without any alternative to this day) while leaving videos to evolve as they did.
People here complain like they have issues with long-term memory, but the reality was that there was no real web video before. That Apple had more issues than others was a problem that should have been contained to Apple's walled garden alone. The world was, is, and will be much larger than that.
The creative environment could have been built with HTML/JS as well. I feel what killed it was more that mobile gaming took over casual games, and modern game engines enabled a single person, who would have been making dinky little Flash games, to make what used to take an entire studio.
Maybe, but playing videos was 99% of the use case for Flash by the time it was killed by Apple. Adobe could have kept maintaining it for the 1% Flash games, ads and terrible websites, but you can see why they gave up...
It can be both but it definitely had a security problem
> Mitre lists more than 1,000 Adobe Flash vulnerabilities.
>Flash ranks 14th on the list of products ranked by the number of vulnerabilities – one of only two applications in the top 25 that aren’t operating systems or browsers.
I completely agree with the assertion and the benefits that ensue, but my attention is always snagged by the nomenclature.
I know there are alternate names available to us, but even in the context of this very conversation (and headline), the thing is being called a "variable."
What is a "variable" if not something that varies?
In the cases we're interested in here the variable does vary, what it doesn't do is mutate.
Suppose I have a function which sums up all the prices of products in a cart, the total so far will frequently mutate, that's fine. In Rust we need to mark this variable "mut" because it will be mutated as each product's price is added.
After calculating this total, we also add $10 shipping charge. That's a constant, we're (for this piece of code) always saying $10. That's not a variable it's a constant. In Rust we'd use `const` for this but in C you need to use the C pre-processor language instead to make constants, which is kinda wild.
However for each time this function runs we do also need to get the customer ID. The customer ID will vary each time this function runs, as different customers check out their purchases, but it does not mutate during function execution like that total earlier, in Rust these variables don't need an annotation, this is the default. In C you'd ideally want to label these "const" which is the confusing name C gives to immutable variables.
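A minimal Rust sketch of those three cases, just to make the distinction concrete (the names and prices here are made up for illustration):

    const SHIPPING_CENTS: u64 = 1_000; // a true constant: the $10 shipping charge, fixed at compile time

    // `customer_id` varies between calls but never mutates within a call;
    // `total` is marked `mut` because it is mutated as each price is added.
    fn checkout_total(customer_id: u64, prices_cents: &[u64]) -> u64 {
        let _ = customer_id; // not otherwise used in this toy example
        let mut total: u64 = 0;
        for price in prices_cents {
            total += price;
        }
        total + SHIPPING_CENTS
    }

    fn main() {
        assert_eq!(checkout_total(42, &[250, 750]), 2_000);
    }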
> In the cases we're interested in here the variable does vary, what it doesn't do is mutate.
Those are synonyms, and this amounts to a retcon. The computer science term "variable" comes directly from standard mathematical function notation, where a variable reflects a quantity being related by the function to other variables. It absolutely is expected to "change", if not across "time" then across the domain of the function being expressed. Computers are discrete devices, and a variable that "varies" across its domain inherently implies that it's going to be computed more than once. The sense Carmack is using, where it is not recomputed and just amounts to a shorthand for a longer expression, is a poor fit.
I do think this is sort of a wart in terminology, and the upthread post is basically right that we've been using this wrong for years.
If I ever decide to inflict a static language on the masses, the declaration keywords will be "def" (to define a constant expression) and "var" (to define a mutable/variable quantity). Maybe there's value in distinguishing a "var" declaration from a "mut" reference and so maybe those should have separate syntaxes.
> Those are synonyms, and this amounts to a retcon.
The point is that it varies between calls to a function, rather than within a call. Consider, for example, a name for a value which is a pure function (in the mathematical sense) of the function's (in the CS sense) inputs.
Or between iterations of the loop scope in which it's defined, const/immutable definitions absolutely change during the execution of a function. I understand the nitpicky argument, I just think it's kinda dumb. It's a transparent attempt to justify jargon that we all know is needlessly confusing.
Ah! Actually this idea that the immutable variables in a loop "change during execution" is a serious misunderstanding and some languages have tripped themselves up and had to fix it later when they baked this mistake into the language.
What's happening is that each iteration of the loop these are new variables but they have the same name, they're not the same variables with a different value. When a language designer assumes that's the same thing the result is confusing for programmers and so it usually ends up requiring a language level fix.
e.g. "In C# 5, the loop variable of a foreach will be logically inside the loop"
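For what it's worth, a small Rust sketch of that "fresh binding each iteration" idea (purely illustrative, not from the thread): each pass through the loop introduces a new `i`, so closures capture distinct values rather than one shared mutable slot.

    fn main() {
        let mut closures: Vec<Box<dyn Fn() -> i32>> = Vec::new();
        for i in 0..3 {
            // `i` is a brand-new immutable binding on every iteration;
            // `move` captures that iteration's value, not a shared variable.
            closures.push(Box::new(move || i));
        }
        let captured: Vec<i32> = closures.iter().map(|c| c()).collect();
        assert_eq!(captured, vec![0, 1, 2]); // not [2, 2, 2]
    }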
There are surely more than three! References can support mutation or not, "constants" may be runtime or compile time.
The point is that the word "variable" inherently reflects change. And choosing it (a la your malapropism-that-we-all-agree-not-to-notice, "immutable variables") to mean something that doesn't change (1) is confusing and (2) tends to force us into worse choices[1][2] elsewhere.
A "variable" should reflect the idea of something that can be assigned.
[1] In rust, the idea of something that can change looks like a misspelled dog, and is pronounced so as to imply that it can't speak!
[2] In C++, they threw English out the window and talk about "lvalues" for this idea.
The term variable comes from math and is (probably) hundreds of years old. Variables in pure functional languages are used exactly the same way they're used in math. The idea of mutating and non-mutating variables is pretty old too and used in math as well. Neither is going to change.
Well maybe global constants shouldn't be called "variables", but I don't see how your definition excludes local immutable variables from being called "variables". E.g.
fn sin(x: f64) -> f64 {
let x2 = x / PI;
...
Is x2 not a variable? Its value varies depending on how I assign x.
Anyway this is kind of pointless arguing. We use the word "variable". It's fine.
Even if the term 'variable' has roots in math, where it is acceptable that it might not mutate, I think for clarity the naming should be different. It's awkward to think about something that can vary but not mutate. Clearer names can be found.
That's not a constant, that's an immutable variable which is why your diagnostic said it was read-only.
const int x = 2;
int *p = &x;
*p = 3; // Now x is 3
And since I paid for the place where I'm writing this with cash earned writing C a decade or so ago, I think we can rule out "unfamiliar with C" as a symptom.
Now x is 3 but you also get a compiler warning telling you not to do that.
In my opinion it's a bit disingenuous to argue that it isn't a const just because you can ignore the compiler and shoot yourself in the foot. If you listen to the compiler, it is reflected in the assembly that it is a constant value the same as #define x 2.
Is Rust better at enforcing guarantees? Of course. Is `const` in C `const` if you don't ignore compiler warnings and errors? Also of course.
> And since I paid for the place where I'm writing this with cash earned writing C a decade or so ago
Perhaps they're conflating how you can't use "const" as a compile-time constant (e.g., you can't declare the size of an array with a "const" variable). If so, C23 solves this by finally getting the constexpr keyword from C++.
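As a rough Rust analogue of the same distinction (a hedged sketch, not something from the thread): a `const` is a compile-time constant you can use as an array length, while an immutable `let` binding is merely a read-only run-time value.

    const LEN: usize = 4; // compile-time constant

    fn main() {
        let a = [0u8; LEN];   // fine: LEN is known at compile time
        let n: usize = 4;     // immutable, but only a run-time value
        // let b = [0u8; n];  // would not compile: n is not a constant
        let _ = (a, n);
    }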
> What is a "variable" if not something that varies?
If I define `function f(x) { ... }`, even if I don't reassign x within the function, the function can get called with different argument values. So from the function's perspective, x takes on different values across different calls/invocations/instances.
Variables are called variables because their values can vary between one execution of the code and the next. This is no different for immutable variables. A non-variable, aka a constant, would be something that has the same value in all executions.
Example:
function circumference(radius)
return 2 * PI * radius
Here PI is a constant, while radius is a variable. This is independent of whether radius is immutable or not.
It doesn’t have to be a function parameter. If you read external input into a variable, or assign to it the result of calling a non-pure function, or of calling even a pure function but passing non-constant expressions as arguments to it, then the resulting value will in general also vary between executions of that code.
Note how the term “variable” is used for placeholders in mathematical formulas, despite no mutability going on there. Computer science adopted that term from math.
It's a variable simply because it doesn't refer to a specific object, but to any object assigned to it, either as a function argument or as the result of a computation.
It's in fact us programmers who are the odd ones out compared to how the word variable has been used by mathematicians and logicians for a long time.
Right, yeah, it’s a funny piece of terminology! The sense in which a ‘variable’ ‘varies’ isn’t that its value changes in time, but that its value is context-dependent. This is the same sense of the word as used in math!
Some languages like Kotlin have var and val, introducing the distinction between variables (which are expected to get reassigned, to vary over time) and values, which are just that: a value that has been given a name. I like these small improvements.
(unfortunately, Kotlin then goes on and introduces "val get()" in interfaces, overloading the val term with the semantics of "read only, but may very well change between reads, perhaps you could even change it yourself through some channel other than simple assignment which is a definite no")
You could always interpret a variable from the perspective of its memory address. It is clearly variable in the sense that it can and will change between allocations of that address; however, an immutable variable is intended to remain constant as long as the current allocation of it remains.
It leads to a further ambiguity, because "value" is something that is assigned to a variable (or whatever we call it). For example some Rust code:
let v1: Vec<i32> = Vec::new();
let v2 = v1;
The first line creates a value of type Vec and places it into the variable v1. The second line moves the value from variable v1 to variable v2. If we rename "variable" to "value", then my description of the code becomes unreadable.
If I were as pedantic as the OP, I'd use "lexical binding" instead of "variable". But I'm not sure how that would work with C and C++, because their semantics assumes that a variable has memory associated with it that can hold a value of a given type. Modern compilers are smarter than that, but they still try hard to preserve the original semantics. The variable in C/C++ is not just a name that ceases to exist after the compiler has done its work. That creates the possibility that, by calling C/C++ variables "lexical bindings", we'll get more pedants accusing us of improper use of words, even if we never change the values of those variables.
I can't speak to the particular browser application. I haven't installed it and probably never will, but the language around text interfaces makes the OP sound... uninformed.
Graphical applications can be more beautiful and discoverable, but they limit the user to only actions the authors have implemented and deployed.
Text applications are far more composable and expressive, but they can be extremely difficult to discover and learn.
We didn't abandon the shell or text interfaces. Many of us happily live in text all day every day.
There are many tasks that suffer little by being limited and benefit enormously by being discoverable. These are mostly graphical now.
There are many tasks that do not benefit much by spatial orientation and are a nightmare when needlessly constrained. These tasks benefit enormously by being more expressive and composable. These are still often implemented in text.
The dream is to find a new balance between these two modes and our recent advances open up new territory for exploring where they converge and diverge.
Am I the only one who interpreted OP as not being opposed to CLIs, TUIs, or GUIs at all? The topic wasn't "textual interface VS graphical interface", but "undocumented/natural language VS documented/query language" for navigating the internet.
In addition to the analogy of the textual interface used in Zork, we could say that it'd be like interacting with any REST API without knowledge of its specification - guessing endpoints, methods, and parameters while assuming best practices (of the "natural-ness" kind). Do we really want to explore an API like that, through naive hacking? Does a natural language wrapper make this hacking any better? It can make it more fun as it breaks patterns, sure, but is that really what we want?
I'm not focused on this particular browser or the idea of using LLMs as a locus of control.
I haven't used it and have no intention of using it.
I'm reacting to the OP articulating clearly dismissive and incorrect claims about text-based applications in general.
As one example, a section is titled with:
> We left command-line interfaces behind 40 years ago for a reason
This is immediately followed by an anecdote that is probably true for OP, but doesn't match my recollection at all. I recall being immersed and mesmerized by Zork. I played it endlessly on my TRS-80 and remember the system supporting reasonable variation in the input commands.
At any rate, it's strange to hold up text based entertainment applications while ignoring the thousands of text based tools that continue to be used daily.
They go on with hyperbolic language like:
> ...but people were thrilled to leave command-line interfaces behind back in the 1990s
It's 2025. I create and use GUI applications, but I live in the terminal all day long, every day. Many of us have not left the command line behind and would be horrified if we had to.
It's not either/or, but the OP makes incorrect claims that text based interfaces are archaic and have long been universally abandoned.
They have not, and at least some of us believe we're headed toward a new golden age of mixed mode (Text & GUI) applications in the not-so-distant future.
CLIs are still powerful and enjoyable because their language patterns settled over the years. I wouldn't enjoy using one of these undiscoverable CLIs that use --wtf instead of --help, or be in a terminal session without autocomplete and zero history. I build scripts around various CLIs and like it, but I also love to install TUI tools on my servers for quick insights.
All of that doesn't change the fact that computer usage moved on to GUIs for the general audience. I'd also use a GUI for cutting videos, editing images, or navigating websites. The author was being a bit tongue-in-cheek, but in general I'd agree with them, and I'd also agree with you.
Tbh, I also think the author would agree with you, as all they did was make the point that
s/Take/Pick up/
is not that far off, in terms of annoying its users, from
s/atlas design/search web history for a doc about atlas core design/
is. And that's for a product that wants to rethink the web browser interface, mind you.
We are more rapidly heading towards (or already in) a future where the average household doesn't regularly use or even have a "computer" in the traditional sense, and a CLI is not just unused but entirely non-existent.
Anil is an old-school technologist. He helped build Movable Type, which was arguably the first blogging CMS. He also worked at Fastly and Glitch. I'm sure he knows how CLIs work and what they're good for. And surely no one here is suggesting that 99% of normie users would be comfortable with a CLI just because it's good at piping one command's output to another. Even those of us who are proficient use GUIs more often.
No, it is not. You may think my contract manufacturers lack skill. (No comment from me.) But I have to live with them and make things work.
I try to get my CMs involved in design early. I think it is telling that whenever I give them choices, they reject Tag-Connect and pick one of the other options. Every. Single. Time.
The connector that was specifically flat-out hard rejected was the TC-2070 14-pin version. The number of pins was part of the problem. Apparently (this was a while back now so I may be misremembering) they had trouble with the density at 0.050": 6 pins gives you a lot more room on the sides to squeeze stuff in than 14 does. So they have to do it with a special premade block that comes in and hits the pads, and that block was nearly-but-not-quite unobtainium for the 14-pin version. The CM hated the Tag-Connect in general and wanted it gone, so we didn't trust them too much, but then we tried to build the fixture in house and prove them wrong... after that experience I have joined them in their hatred.
The fact of the matter is that there are many, many other good ways to do it, so it's not Tag-Connect or nothing. Castellations are right out in HVM because of the cost hit, so that rules out Edge-Connect and friends. Würth has WE-SKEDD which looks like the same general thing as Tag-Connect but I've not had cause to try it.
My favorite thing to do, if space allows, is to just put down the unshrouded surface-mount header. Cortex and ESP parts nominally use a standard 0.050" header and you can just place it down. Then don't populate it, and you've got an array of pads that are long enough to stagger test probes onto in a bed-of-nails, or for bench use it is very easy to hand-solder the header on. Plus it's surface-mount, so the space below the header is available for use (often things like pull resistors or ESD diodes go nicely here). The biggest wrinkle here is the solder stencil. You do not want to have paste put on these pads if you're not soldering the header, because you badly want your test pins to hit clean ENIG finish and not flux-covered no-clean solder (doubly nasty to probe; even clean solder is bad enough). So it's harder to do a small run of 100 bench-debug boards with headers and then the rest as production. You usually end up just soldering the header by hand (or having the CM do it), which is OK.
Otherwise it's traditional pogo pads all the way. This is pretty much required anyway whenever the board is too small for other methods (did I mention Tag-Connect is huge? Tag-Connect is HUGE.) and it works great as long as you were already planning on fixturing.
With the information you're giving, my decision would probably be to take the non-clip TC2030 or TC2050 (I've never needed 14 pins) footprint and overlay the footprint for a regular 1.27mm SMD connector on top of it. Cortex debug connector should be a good fit but I haven't checked.
That seems to be the "have your cake and eat it too" option (though it does mean you're spending the space and drilling the holes for TC.) But still -
> They're fine for one guy using them on the bench
If you're insistent on Tag-Connect, that's a pretty good way to go. Those legs are a big part of the problem and so the No-Leg version helps a lot. But then it also falls out of the board....
Seriously, I tried to like Tag-Connect. I did like it before supporting a CM and a hardware team trying to use it (and lose the cables...). Now I just plain don't think it adds value over the alternatives. The header is three cents. Three. Cents. The cable is $39 (with legs, $34 without). That buys you over 1,000 headers and then you can use the free cables that come in the box with all the debug probes and live in the pile over there in the shop.
Bending the little pins also works (they will tend to bend themselves after a little while of use anyway), but at the cost of making insertion a little harder as well. I found that to be the best compromise for me, but YMMV.
> If you're dealing with truly high volume runs, get the ICs programmed before they're on the boards.
That is absolutely 100% the way to go.
However, this week I got tremendous pushback on this for a current project with a build volume in the mid six figures... exactly where you'd want to be using pre-programmed parts. It took me a while to figure out that what was really happening was that the firmware lead (who thinks himself an EE) was spewing out nonsense to cover for the fact that he'll never be ready to deliver in time to preprogram anything. I wish he'd have just said that and saved us all the nasty argument....
Trying to write Rust as a noob feels like being in a blackout swamp, waist deep in muck, fighting through thick air and trying, but failing to run from danger. Being utterly constrained in every direction.
Writing Zig as a noob feels like having the clouds part to reveal a bright, sunny Spring clearing and suddenly being able to cover land in a cheerful group with clarity, purpose and focus.
[Edit] Of course, this is purely subjective, obviously a skill issue and YMMV
This news is making me much sadder than maybe it should.
Arduino is what pulled me into electronics. I have such fond memories of those old chonkers blinking LEDs. It felt like magic.
Unless they've had a major staffing and leadership shakeup, there is a zero percent chance Qualcomm is going to suddenly become some kind of open, sharing culture. The company DNA is patent troll.
The recent joint ventures are a perfect example. I got so excited by those newish super powerful penta-whatever Qualcomm chips from Arduino a few years ago.
Then learned the chips were unobtainable outside the Arduino modules.
No one is arguing for using ISP-hosted accounts as an alternative.
The core problem isn't even rooted in identity per se, it's about platform owners actively working to limit access to essential information from platforms they cannot profit from.
Even granting the most cherubic motives, this ongoing behavior is atrocious on its face and should be prevented by any means, including competition, rule making, and legislation.