Well yes, but "embedded" is a pretty broad category.
Some major cases are:
- High-volume and cost-driven: where you're picking the cheapest MCU you can make work (often by a matter of cents). You want to squeeze every last bit of performance (or I/O, or power consumption, or pinout, or whatever the drivers are).
- Real-time: where abstracting the hardware beyond a certain point is counter-productive.
In my experience it's much less common that you don't care about the hardware than the reverse.
Unfortunately yes. Lots of embedded software development is done by folks without a software background: EEs and electronics technicians dabbling in firmware.
To be honest, I'm not sure who those dabbling in software actually are. A lot of embedded software runs in safety-critical environments, on certified compilers, and is built and tested to a much, much higher standard than most people would think. That is far closer to software engineering than what you can observe from "software developers" who pull in unchecked dependencies like there's no tomorrow.
Sure, there are also electrical engineers who don't actually know how to program, and they indeed dabble in it in a very naive fashion, but you get those people in software development as well, and they are also naive, just in a different way.
I think a big part of the reason is that embedded software payscales seem to be closer to EE pay than typical SWE pay. I've never understood why, but from what I can tell the more specialized EE knowledge isn't valued very highly. I had a colleague who moved out of embedded software to become a backend developer for this reason.
There's thankfully some room in the market for pay reasonably comparable to typical SWE payscales, but most embedded developers live in their own little worlds largely disconnected from trends in the rest of the software industry. They're often simply not aware of what they're worth to demand it and the kind of companies that need embedded developers tend to be both unable and unwilling to pay those rates.
That’s pretty much my situation, I originally did electronic engineering and firmware but have switched professionally to backend software. Partly because of the pay, but mostly because the tooling and quality of most embedded software is so painful to work with. I’d rather have an overgrown dependency tree than a debugger which crashes more often than not.
A bit of a generalisation, but I tend to find that when a traditional EE writes firmware, they treat the firmware as controlling the hardware and are more direct with their instructions: if they need to set a GPIO they will often do it directly rather than abstract it. So an EE will write gpio23 = 1, whilst someone with more of a software background will create some kind of function, say turn_amplifier_on, which then takes the appropriate action.
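To make the contrast concrete, here's a minimal C sketch. Everything in it is illustrative: the pin number, the `turn_amplifier_on` name, and the register, which is simulated as a plain variable so the snippet runs anywhere (real firmware would use a `volatile` pointer to an address from the datasheet).

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical GPIO output register. On real hardware this would be
 * memory-mapped, e.g.:
 *   #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)
 * Simulated here so the sketch compiles and runs on a host. */
static uint32_t GPIO_OUT;

#define AMP_ENABLE_PIN 23u  /* made-up pin assignment */

/* "EE style": set the pin directly at the point of use. */
static void ee_style_enable(void)
{
    GPIO_OUT |= (1u << AMP_ENABLE_PIN);   /* gpio23 = 1 */
}

/* "Software style": name the intent, hide the pin behind a function. */
static void turn_amplifier_on(void)
{
    GPIO_OUT |= (1u << AMP_ENABLE_PIN);
}

static void turn_amplifier_off(void)
{
    GPIO_OUT &= ~(1u << AMP_ENABLE_PIN);
}

static bool amplifier_is_on(void)
{
    return (GPIO_OUT >> AMP_ENABLE_PIN) & 1u;
}
```

Both compile to the same store; the difference is that the named function gives you one place to change when the amplifier moves to a different pin on the next board spin.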
For those tiny projects it doesn't matter too much. However, it seems that with demands for features and time to market we rely more on vendor-supplied SDKs, so this direct approach is becoming less common.
I do find the EEs to be great at bit-banging though, especially when timing is critical.
Historically? Definitely. It wasn't that long ago that bare assembly was still dominant in embedded, long after it had faded from general computing. Somewhat obviously, assembly programs were closely coupled to the hardware.
Nowadays C is probably still dominant, and C++ is starting to get widespread acceptance.
Step #1 - Strong-arm the business and hardware team into putting a ton of RAM on said embedded device, together with a relatively up-to-date ARM SoC core
Step #2 - Run Linux on it (preferably with in-tree drivers)
Step #3 - Application development is now Somebody Else's Problem
On a serious note, unless your power envelope is tiny or you have hard real-time constraints, you really shouldn't be pushing small microcontrollers to the limit and running custom networking stacks on those. That's a security disaster. Don't connect things to the internet if you don't have a reliable way of updating the firmware.
If you have the scope to run an embedded project with architectural pontification, you're likely both (A) paying too much for hardware, software or people; and (B) probably ignoring the path of least resistance with available code.
This comment baffles me. I don't find anything in the article to be remotely controversial. HALs, testability, and avoiding spaghetti are just straightforwardly good ideas that a significant number of embedded developers don't know about because they haven't read a programming article since they left college.
As an example: haven't you seen plenty of embedded projects that don't do anything like this?
FWIW, I actually work in the industry, coming from traditional software.
It is my experience that there is no need for architectural discussion at the micro scale, because if you are implementing effectively for a commercial project you are likely to find that either (A) full abstraction is overkill or infeasible (low end); or (B) you have a codebase to move with already, likely vendor-backed (mid-range).
If you are above micro scale and dealing with a fully fledged RTOS or GPOS, then such abstraction will certainly be baked in.
My point was that this makes the article a bit strange: for such a general observation, I wonder who the intended audience is. Perhaps my impression of the industry is unique, but I doubt it. Further, it seems the author is a consultant selling middle-management sauce to big companies.
For context, I also work in the industry and have seen embedded codebases ranging from several thousand lines up to tens of millions.
The region where A is true, for me, is roughly an 8051 whose entire codebase could be held in a single developer's mind at one time. That's an increasingly minuscule portion of the industry, for many good reasons.
For B, my experience is that your codebase often long outlives the products and hardware it's shipping on and is doing very nontrivial things like wireless networking. The article is saying you should have a way to run the code without hardware (obviously useful for testing/CI/bring up) and with a HAL (which makes porting easier), among other common sense suggestions. Vendors don't always provide this, but my info might be out of date because I haven't been able to work in this space for some years due to my salary expectations.
If you have a full RTOS (e.g. FreeRTOS, RTEMS, etc) you definitely do not get the ability to do these things for free. You need to actually have an architecture and work to keep things like raw hardware accesses isolated. I've spent literal years of my life cleaning up codebases where this wasn't done, so I have strong opinions on the matter.
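For what that isolation can look like in practice, here's a rough C sketch (not from the article; all names are made up) of the usual approach: application code talks to a small function-pointer HAL, the board build plugs in real register accesses, and the host/CI build plugs in a fake so the code can run and be tested without hardware.

```c
#include <stdint.h>

/* Illustrative HAL interface: application code only sees this. */
struct gpio_hal {
    void (*write)(void *ctx, unsigned pin, int level);
    int  (*read)(void *ctx, unsigned pin);
    void *ctx;
};

/* Application-level code: no raw hardware access, so it links and
 * runs on the host for unit tests. */
static void blink_once(const struct gpio_hal *hal, unsigned pin)
{
    hal->write(hal->ctx, pin, 1);
    hal->write(hal->ctx, pin, 0);
}

/* Host-side fake: records the last level written per pin, and
 * counts writes so a test can check the sequence happened. */
struct fake_gpio {
    int level[32];
    int writes;
};

static void fake_write(void *ctx, unsigned pin, int level)
{
    struct fake_gpio *f = ctx;
    f->level[pin] = level;
    f->writes++;
}

static int fake_read(void *ctx, unsigned pin)
{
    return ((struct fake_gpio *)ctx)->level[pin];
}
```

On the target you'd provide a second implementation of `write`/`read` that touches the real registers; `blink_once` and everything above it never changes.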
We seem to be essentially in agreement, although I would say A is technically most products (both new and ever produced); one might qualify that by saying they're likely not the sort of thing you would see volume on in a Western embedded career. I think the other comment, about scale being absent from the article, hits the nail on the head.
Having worked in this industry my whole life: this brief article doesn't address scale. Embedded runs the gamut from sub-kilobyte systems to multi-megabyte, multi-dozen heterogeneous architectures with heterogeneous networks and interconnects. The level of architecture described here is very nice to have, but incompatible with program managers and customers in most, though not all, markets.
Most coding is done with HALs, and platforms like STM32 make it easy, with numerous portable APIs from libopencm3 and so on. So the chip shortage wasn't a disaster. 5-10 years ago was scarier, but we have gleaned much from general software development: revision control, SQA, security, unit tests, integration, regression, linting, and my personal dislike, source formatting. No thanks for that one, guys! :)
Of course, I'm speaking mainly as a professional. There are still many prototypes on Arduinos shipping...
Security, though, is still a weak point, but blame short-term business needs for at least half of that.
CI for embedded, though, is still a huge opportunity. I see the wheel reinvented too often where a turnkey solution could win time to market and increase reliability.