This is so true, and even today with new designs you end up with overkill. A Cortex M0 (32-bit ARM core) with 128K of flash and 64K of RAM is 75 cents in quantity. That means the processor complex is essentially "free" relative to the cost of the other bits (sensors, actuators, communication over distance). The risk is that with all that "extra power" the programmer decides to use it creatively for something like "a built-in web server to show you the status of everything"; that "feature" requires you to connect it to the wider network, the "webserver" never gets patched, and now you have an HVAC system that becomes the exploit vector into a much bigger facility/network.
All because designing in a "limited" computer didn't make economic sense, and programmers couldn't help but use the extra CPU capacity that was available.
That is what makes IoT a challenge / bad-idea to a lot of people.
Spot on. Programmers in general are not well-versed in security. I don't mean you, the reader of this comment. But as a collective. The other people who write code through which you could drive buses full of black hats. Not you. Web applications with huge budgets get owned by common mistakes. And those web applications don't have access to anything but data. Imagine having all those same, eh, security issues, in devices that can interact with the real world. I really don't want my microwave to suddenly turn on and keep going for hours at a time while I'm out of the house.
I've noticed the opposite thing. Most of the hardware around is built to the weakest specs that still let the thing run. Those various 10-cent savings on flash and the μC tend to add up quickly when you go into mass production.
But the primary problem, which is not limited to but obviously visible in IoT, is that companies ask themselves "what sells?" instead of "what is good and useful?". All that crap that is being created, with useless "features" that introduce security holes and foster the fragmentation of the ecosystem, is pushed because someone out there figures out that people will buy it. But almost no one understands the implications of all these "features", so the buying decision they make is usually wrong and stupid.
I wish someone would cut sales people out of the design process. You should be able to get designers and engineers together, have them ask themselves what an optimal, actually useful smartwatch/smartfridge/smarttoilet/whatever would be and how to build it, and then tell sales people to figure out how to sell it. But no optimizing for better sellability.
I too have seen the intense penny pinching. Here in California the soda bottler removed one thread from the tops of plastic bottles; it probably saves a fraction of a cent in plastic, but it makes the detached retaining ring for the cap rub on your lips when drinking, which makes it uncomfortable to sip from those bottles. Such a huge price to pay in user dissatisfaction for such a small savings.
Can't go this far, though:
> I wish someone would cut sales people out of the design
> process. ... no optimizing for better sellability.
In my experience, actually doing things this way leads to less economic success for the product, and eventually it gets outsold by a competitor without those restraints. At FreeGate I told sales people "you have to sell what we have, not what we don't have" and still had them come back with complaints about how the competitor could install their box in a data center, etc. Not a productive conversation (or a fun one, for that matter).
There does seem to be a minimally required feature set for selling things these days. "High Quality" isn't the compelling feature it once was.
There's some penny pinching, for sure - I had a coworker whose brother is on the iPhone hardware team and they have a lot of trouble with samples coming back from manufacturing with the wrong resistor here or a missing capacitor there to save a few bucks, because the factory sees it as overengineering, but doesn't understand the purpose it's built for.
That said, relying on an older processor may actually not save money. Sure, there's a premium on the absolute newest processor, but in general what's cheapest is what is most mass produced Right Now(tm).
I think the choice of a Z80 in something like this was likely for reasons similar to why NASA control systems typically use the most reliable hardware they can, which means something that has been in use for many years.
For HVAC, maybe a little of each, but also the software may have been written to the Z80, and if you change that out, you have to do all the testing you'd have to do if you built a new machine.
I often think back on this old chat I had with my grandfather, where he kind of tilted his head at something I was explaining about 90s tech and said something like:
"Interesting. In my day, we programmed the software to the hardware, it kind of seems like now you all are programming the hardware to the software."
> I had a coworker whose brother is on the iPhone hardware team and they have a lot of trouble with samples coming back from manufacturing with the wrong resistor here or a missing capacitor there to save a few bucks, because the factory sees it as overengineering, but doesn't understand the purpose it's built for.
I find that story utterly implausible.
The day Foxconn makes unapproved changes to Apple designs is the day that...well, never.
I think you have an interesting point, but it ignores humanity. People want what they want for different reasons. For a marketer, it's probably easier to give people what they want than to change people's minds to accept what they need. I blame neither person in this situation; I'd only try to change the system which surrounds them.
I know that salespeople can often be the source of bad decisions, but determining market fit is still vitally important. Who wants to build (or, more importantly, fund) something that no one wants?
It seems like a service discovery system for IoT devices might be a good idea, one where the discovery system tracks what is actually allowed to run a particular service, like an HVAC system running an embedded webserver.
For example, imagine if the industry had it so that the HVAC system announced that it has the capability to serve status over an embedded webserver, but first checked whether there is a different host it should send its metrics to instead. This way you could control which host is the core host for that status site, and have all systems in the community basically ask for direction on whether to self-host or publish (a rough sketch of that check is below)...
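A minimal sketch of what that "ask before you self-host" check could look like, purely illustrative; the registry endpoint, the JSON response shape, and the device name are all invented for this example, not any real IoT standard:

```python
# Hypothetical sketch: device asks a local registry whether it should
# publish metrics elsewhere or stand up its own status page.
import json
import urllib.request

REGISTRY_URL = "http://registry.local/api/v1/placement"  # hypothetical endpoint
DEVICE_ID = "hvac-rooftop-3"                              # hypothetical device name


def resolve_status_host(device_id: str) -> dict:
    """Ask the registry where this device's status should live.

    Assumed (made-up) response shapes:
        {"mode": "publish", "target": "http://metrics.local/ingest"}
        {"mode": "self-host"}
    """
    req = urllib.request.Request(f"{REGISTRY_URL}?device={device_id}")
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    except OSError:
        # No registry reachable: default to NOT exposing a server,
        # rather than silently standing up an unpatched web UI.
        return {"mode": "disabled"}


if __name__ == "__main__":
    placement = resolve_status_host(DEVICE_ID)
    if placement["mode"] == "publish":
        print(f"push metrics to {placement['target']}; no local webserver")
    elif placement["mode"] == "self-host":
        print("registry explicitly allows a local status page")
    else:
        print("no direction from registry; keep the embedded webserver off")
```

The point of the design is that "no answer" defaults to doing nothing, so a device never grows a web server just because nobody told it not to.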
You have to not do that. Cycles you don't use cost absolutely nothing. Indeed, they may actually reduce jitter if you're careful (they should not, but that's another story).
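For what it's worth, here's a small sketch of how unused capacity shows up as slack rather than cost: a fixed-period loop that sleeps until an absolute deadline, so variation in the work is absorbed by the idle headroom instead of shifting the next iteration. The 10 ms period and the function names are illustrative only:

```python
# Illustrative fixed-rate loop: spare cycles become sleep time,
# which keeps the period steady (low jitter) as long as do_work()
# finishes before the deadline.
import time

PERIOD_S = 0.010  # 10 ms control period (illustrative)


def do_work() -> None:
    """Stand-in for the real sensor/control task; runtime varies a bit."""
    time.sleep(0.002)


def run(iterations: int = 100) -> None:
    next_deadline = time.monotonic()
    for _ in range(iterations):
        do_work()
        next_deadline += PERIOD_S
        # Sleep off the remaining slack; if do_work() ran long there is
        # nothing left to sleep and the overrun shows up immediately.
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)


if __name__ == "__main__":
    run()
```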
It seems insane to think that a programmer would just "decide" to build a feature like that. It would have to be decided by the product people, who probably think it's a good idea.
That is exactly right. A product person who wants to sell this system wants as many bells and whistles as possible because, hey, who knows what the one thing is that will push the customer over the edge into the "buy zone", right? So you end up with all sorts of stuff in there. In my experience it is rare to have a programmer who will push back on that request.