I guess this is why I lost interest in software that does not have a physical aspect roughly 5 years into my career doing random web development and being the tech cofounder at a shitty startup (though successful enough to bootstrap me out of my parents' nest): it just felt... disconnected from the greater world.
There is something magical in watching the software you wrote sling full-size wood logs around (industrial automation), control and maintain positive pressure in a painting chamber, or take off and fly away somewhere.
And even the greater cost of failures/mistakes, or the insane hours spent debugging a single bit set incorrectly somewhere deep in an SoC's RAM controller, is not enough to offset it: you learn so much about the way the world works (or doesn't, but that's something to learn as well).
An off-by-one error crashing and snapping conveyor belts, too large a P coefficient in a pressure controller bending the doors, a missed requirement to close the shutters on an outdoor heat exchanger leading to it freezing and bursting: frustrating at the time, but they make for much better memories than debugging a PHP file deployed on a VPS with logging disabled.
I feel the same way. I've been doing web-related stuff for about 13 years now and have lost much of my interest. Started dabbling with electronics and embedded systems about 5 years ago and it completely re-piqued my interest in writing software. I'm still doing backendy web things for money but would love to move away from that. Any recommendations for moving into the embedded field professionally? What sort of companies are you working for?
Well, I don't think there is an easy way. I just walked away from the startup and lived on ramen for a couple of years, doing odd-ball contracts just to afford rent. During that time I learned a lot, essentially by telling prospective customers: "I do not have direct experience in this, but I have a pretty good idea of the involved steps. I am willing to do this as a learning experience at a reduced rate". Many did agree, some projects turned out better than others, but eventually I did acquire enough experience to pass interviews at "real" companies.
The past two companies hired me specifically for UAV projects, so my professional experience has been in this field. However, it is so wide that you can easily branch out:
* design one-off test hardware for various bench/field tests where an off the shelf solution doesn't fit (custom thrust stand, spring constant measurement jig to determine properties of vibration isolation solutions, simple blackbody for QA to determine if the LWIR payload meets specifications)
* like physics/control systems? Lots of work on tuning PID controllers, and on implementing simulators so the pure SWEs don't crash your expensive prototypes
* hmm, it looks like we have excessive vibrations on the frame- whip up a small battery powered, WiFi enabled high-frequency accelerometer logger to dig deeper into it.
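The PID-tuning bullet above is easy to sketch in code. This is a hypothetical textbook discrete PID loop against a toy first-order plant, not any real controller; all names, gains, and plant constants here are made up. It just illustrates the failure mode from the parent comment: crank the P gain too high and the loop slams into oscillation instead of settling.

```python
# A minimal discrete PID controller (sketch; all gains are invented).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def simulate(kp, steps=200):
    """Drive a crude first-order 'pressure chamber' toward setpoint 1.0
    and return the peak pressure seen (a proxy for overshoot)."""
    pid = PID(kp=kp, ki=0.5, kd=0.05, dt=0.01)
    pressure = 0.0
    peak = 0.0
    for _ in range(steps):
        u = pid.update(setpoint=1.0, measurement=pressure)
        pressure += (u - pressure) * 0.01  # toy plant dynamics
        peak = max(peak, pressure)
    return peak
```

With a modest gain the loop creeps toward the setpoint; with an absurdly large one the simulated pressure oscillates and blows up, which is the software version of bending the chamber doors.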
How does one get contracting opportunities to deliver projects before being able to pass interviews for a full-time job? Where would one find these opportunities today?
I always assumed it was the other way around (experienced full time engineer becomes a specialized contractor).
As much as I hate the word: networking and saying yes when you're in the right time and right place.
You probably won't convince a professional widget manufacturer that you're __the__ person for their embedded needs. But a smaller company, that's dipping their toes into branching out from their core business who need to prototype their idea? A lot easier.
Think hard about it; personally I think everything has a shelf life. You might be more engaged for 5 years or so, then find yourself back at square one in a less profitable industry.
I went straight to embedded after grad school. I enjoy it well enough, more than I think I'd enjoy pure software/web stuff. But at the end of the day it's a job, not a passion. If I won the lottery I think I'd be perfectly happy never touching it again. I wonder if I should have gone the software route and taken a job I might even actively dislike for 1.5-2x the money and remote work, which would put more time and money into the rest of my life.
If you still want to go down that path, look into embedded Linux for IoT-type applications. My product has both microcontroller firmware and a Linux SBC. Hiring for the Linux side is difficult; currently we farm it out to a good-but-not-great team overseas. If you have solid Linux skills, a good handle on build systems, and the ability to bring up modern software best practices in testing and deployment from scratch (most embedded shops are way behind the curve here), you'll be seen as a rockstar. You can probably command a higher salary than the average firmware dev since you're "software", but it's still a hit compared to FAANG/unicorn web programming.
Ironically, I've been feeling the opposite lately- my day job is embedded, and I was reminiscing about my childhood days when I was learning to code with Scheme and Processing. There's something beautiful about a single laptop being a fully enclosed ecosystem to just play around with pure code, and I basically haven't coded for fun in my free time in a few years. I was even considering trying to learn some webdev to make my own personal site. :P
I still love embedded, and I think it's great for all the reasons mentioned in the thread. As for your question, firmware is quite broad, and there are lots of different routes people take to break into it as a career. At its core, it's a lot of writing systems code, or feature code that runs on constrained systems. Something like CS undergrad systems programming + OS or equivalent should be enough.
After that, I would make sure you have decent breadth: knowing how to read datasheets, bringing up parts, writing a linker script, using peripherals, grokking some standard protocols like I2C/SPI/SDIO, basic circuits knowledge, basic comparch knowledge, etc. The nice part of all this stuff is that you can totally get there with just hobby projects and Adafruit boards. If you find yourself using the batteries included in the Arduino or Adafruit libraries, maybe push yourself to write more of it from scratch. eg. write an SDIO driver to read from an SD card and parse a FAT filesystem.
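As a taste of the "write it from scratch" suggestion above, here's a sketch of the very first step of FAT parsing: reading the BIOS Parameter Block out of a boot sector, the kind of thing you'd do right after getting raw SD card reads working. The field offsets follow the FAT specification; the sector bytes below are synthetic test data, and `parse_bpb` is a made-up helper name.

```python
import struct

def parse_bpb(sector: bytes) -> dict:
    """Parse a few BIOS Parameter Block fields from a FAT boot sector.
    Offsets are per the FAT spec; only a handful of fields are shown."""
    if len(sector) < 512 or sector[510:512] != b"\x55\xAA":
        raise ValueError("not a valid boot sector")
    (bytes_per_sector,) = struct.unpack_from("<H", sector, 11)
    sectors_per_cluster = sector[13]
    (reserved_sectors,) = struct.unpack_from("<H", sector, 14)
    num_fats = sector[16]
    (root_entries,) = struct.unpack_from("<H", sector, 17)
    return {
        "bytes_per_sector": bytes_per_sector,
        "sectors_per_cluster": sectors_per_cluster,
        "reserved_sectors": reserved_sectors,
        "num_fats": num_fats,
        "root_dir_entries": root_entries,
    }

# Build a minimal synthetic boot sector to exercise the parser.
sector = bytearray(512)
struct.pack_into("<H", sector, 11, 512)  # bytes per sector
sector[13] = 4                           # sectors per cluster
struct.pack_into("<H", sector, 14, 1)    # reserved sectors
sector[16] = 2                           # FAT copies
struct.pack_into("<H", sector, 17, 512)  # root directory entries
sector[510:512] = b"\x55\xAA"            # boot signature
bpb = parse_bpb(bytes(sector))
```

On a real board the `sector` bytes would come from your own SDIO driver instead of a bytearray, but the parsing layer looks the same either way.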
People with solid software engineering backgrounds are pretty valuable in embedded, so I'd emphasize that as well. Knowing that git exists, unit tests, API design, CI tooling, etc. If you're a Linux kernel hacker, embedded linux is also kind of its own separate world, so that might be a good direction if you have experience there.
It pays very little, there are very few opportunities to move around, you need relatively good practical electrical/electronic skills, and the technical skills you do gain don't contribute at all if you're trying to bootstrap side projects.
Yes, that probably depends on the relative availability/demand in your location. I personally __probably__ would have earned a bit more overall if I had done web stuff, but at least in Eastern Europe, experienced embedded people can get comp similar to what they would at web/mobile/enterprise jobs.
Congrats on moving out. I'd say the need for electrical skills depends a lot on what you're doing and the state of your company. $BIGTECH with large EE team or embedded linux? Probably ok if you can read a schematic and nothing else. Microcontroller firmware at a small/medium company? Better bring it then. Hardware issues can show up in strange ways. I had some firmware hardfaults that seemed to be stack corruption, root cause was a completely screwed up analog circuit.
Alternatively you can build software tools for electrical engineers and the like. It's usually desktop applications but backendy web dev skills do translate
I felt the same way until I had an epiphany recently, not a smart new idea kind of epiphany, but one where the obvious came back into clarity.
Years in industry had conditioned me to seek validation from my peers in the form of completing projects quickly, positive code reviews, and just generally being thought of as a "smart" programmer. That isn't really my job, though; my job is to create something. If I don't suck, I will understand the essence of what I am creating, whether it should be pretty, easy to use, flexible, fast, or whatever, and make sure it embodies all those qualities. I know it is a really obvious thing, but realizing it brought back a bit of my passion. We as engineers are so often called upon to assess feasibility and make estimations around projects, but it is really important to take a second, forget about coding entirely, picture why a user would be impressed with what you made, and then make sure you deliver that.
I've been fantasizing about making a similar career switch for quite some time. I actually had the opportunity to do so 3.5 years ago, but decided to go for a more "boring" offer in an interesting city in another country. That decision has been fantastic for my social life, but professionally not very fulfilling.
The difference between medicine and engineering is that the AMA has largely ensured that management is a support function, and medical doctors largely remain in power in most practices. In engineering, particularly software and electronics, we've allowed parasitic management to take over under the cover that "engineers aren't people persons". Look at how many of the awards in our fields go to management rather than the actual engineers who created the inventions that have revolutionized the world.
I have been a professional software developer since the late 80s. I was getting a bit jaded and set up my own 1-man software business in 2005. I do all the programming, testing, documentation, support, marketing and sales. The variety of work, the freedom to do what I want and the fact that the financial rewards are directly linked to my work all help to keep me engaged. Interacting directly with my customers also helps. I have stuck with the same basic toolset (Qt and C++) which have grown with me.
I don't regret my decision. Of course every path has its pluses and minuses. It was very hard work for very uncertain returns at the start, and I have to take a laptop on holiday. And working on your own can be a bit socially isolating. But the upsides (for me) are much bigger than the downsides. I've written quite a bit about what it is like to run a small software business on my blog, if you are interested. Maybe start with:
https://successfulsoftware.net/starting-a-microisv/
It will take time to find the right balance of clients and work, nurture opportunities, and give them time to mature. You will probably slog through a lot of one-offs for a bit; it's possible to endure if you become dependable for a couple of SMEs that aren't large enough to bring things in-house yet.
> Interacting directly with my customers also helps
That does help a LOT. Interacting with actual people who are going to use the software puts everything into perspective like nothing else can. It doesn't matter whether those people are end-user customers, stakeholders inside the organization, or another team. People make it worthwhile.
I did consider writing a rostering/scheduling app as a follow-on to PerfectTablePlan. But it seems to me that the constraints vary wildly from one organization/market to the next, making it very hard to create an 'off the shelf' package that is general enough to be useful while simple enough to be usable.
Not the person you replied to, but have been studying rostering the last few months out of curiosity. What does the rostering app you currently use (if any) do wrong and what would make it better for you?
I've seen this happen so much with IT people it became a bit of a cliché: around 35, pure IT is not enough anymore.
IT people just drop out. The simple cases become managers or architects. The more advanced cases start a bakery or go work in a call center. One of the most extreme cases was a very intelligent, very cynical, very anti-religion guy who just quit without warning and joined the Hare Krishnas. We got photos of him in red clothes doing some kind of ritual. Huh?
A big part for me is that IT just doesn't learn. Every 5 years a new generation pops up, invents completely new tooling, and makes all the mistakes of the previous generation. Now your knowledge is obsolete, and after you relearn everything, your tooling is worse than where you started. Enter a few years of slow tooling maturation with a very predictable outcome, after which a new generation pops up, declares the existing stuff too complicated, and reinvents everything again. 35 is 4 or 5 of these cycles, bringing to the fore the huge wasteful uselessness of it all. Learning your whole life is a nice slogan, but it becomes very pointless.
The survivors who continue in IT deal with it somehow. You enter a new cycle knowing it will bring change but not much advancement, and you don't learn the stack as deeply as you used to. You get a life outside IT: kids, hobbies, social events. You let the youngsters run before you, smile when they do better than your old tech would, and compare notes with older tools when they get stuck. And you keep some of your tech enthusiasm just for the hell of it.
I wonder about this one. I became a manager because the organizational problems can be so frustrating and I wanted to fix them. However, what you get is pretty much only the aspects of tech work that the author calls out as burning people. A few days ago I wrote a simple Rails app for a young relative's hobby. Most satisfaction I've had with anything related to tech in years.
> I became a manager because the organizational problems can be so frustrating and I wanted to fix them.
The biggest "carrot" held out to idealistic people wanting to become a manager is exactly this: the (fake) opportunity to "fix things". Only after you've become a manager does it become apparent that long-lasting problems are long-lasting for a reason and that none of the choices low-level managers are allowed to make can meaningfully move the needle. Sometimes they fall for it a second time, but being a manager of managers is often even worse since now you don't even have direct reports to effect change.
I've long wondered why we insist on modelling company structure as strict trees / hierarchies. Perhaps there are other classes of graphs which would be better suited in order to avoid such situations?
In any large company there is the official HR hierarchy and then the informal collaboration graph of how work actually gets done.
The former is necessary to coordinate coarse-grained decision making, policy and vision that needs to be unified across thousands of people, most of whom will never meet each other, but who should ideally be rowing in the same direction.
The latter is necessary for the armies of individual contributors to get their respective jobs done. Trying to document and formalize this ad-hoc network holistically is impossible because it's too complex to be understood by any one person. Attempting to do so would require a non-trivial time commitment from all the workers, which would actually take away from them, you know, getting work done.
It's tempting to look at an org chart and assume this represents how things work on the ground, but Conway's Law is more ironclad than it may appear at first glance. Don't confuse legibility for operational capability. If lower level folks did not understand their goals and improvise, then large corporations would be even more rigid and brittle than they already are. They would be utterly incapable of responding to changes in the marketplace and smaller firms would dominate.
The official hierarchy still represents decision power. I'm not suggesting trying to document all work relationships but wondering whether a non-hierarchical model of decision power would be better than a classic hierarchy.
The official hierarchy does represent decision power, but it is not the only source of decisions in a company. The vast majority of decisions in a company are taken on lower levels and either not passed through any hierarchy at all or only presented through the classical technique of presenting a list of options, all but one of which are unpalatable. In practice specialists prepare most decisions in advance (cloaked in boss-pleasing terms like "advice" or "RFC"), so the non-hierarchical model you speculate about is already there.
Is your take then that the classic official hierarchy is purposeless? That it is simply an obstacle? Or that it cannot be improved upon because it is already optimal for its purpose (whatever that may be)?
I don't think I said any of those. The official hierarchy is a neat compact description of formal lines in an organization, not less but also not more. At most I think any formal org chart represents a vast oversimplification of the intricate graph of relationships that exist between people in any company. There are things about a company it can describe well (like who is in charge of performance reviews for who), and things it cannot describe well (like the nebulous role of individual popularity on company decision-making, or in distinguishing between productivity differences between individuals on the same "level"). It is also almost always limited to the company itself and excludes any factors outside it like competitors or suppliers, despite those factors sometimes being more important to company decisions than its internal organization.
In short, my take is that the "classic" hierarchy is a useful but limited tool. It is not sufficient in the slightest to describe how a company makes decisions, yet too many people treat it as if it is all you need to know.
You didn't, and you didn't leave the impression of having done so; it's just me poking to understand.
> like who is in charge of performance reviews for who
These are the kinds of things I'm trying to challenge: Are performance reviews useful? Would performance reviews actually be more useful in some other structure than a classic hierarchy?
In other words, I'm not convinced beyond all reasonable doubt that the classic hierarchy is the optimal structure for the limited, but useful purpose you describe it having. Obviously the standard thinking is that it is.
> It is not sufficient in the slightest to describe how a company makes decisions, yet too many people treat it as if it is all you need to know.
I've led an engineering department where we had something more akin to a matrix org. If a team wasn't doing well, "debugging" it was a disaster. I had to talk to several managers, PMs, and ICs and disentangle a mess of he-said/she-said feedback.
When tasked with data viz on these lines, one of the early things I ask is what the edge relationship is supposed to represent. Specifically!
Organization / chain of command? Parts information? Charge Codes? Messages/Sentiment? Business Information Systems capture way more data than mahogany row might realize, and you can get some "split the atom" visualizations when you combine the right parameters, like RnD funding + messaging. "Huh. Looks like new tech needs a LOT of communications with the field technicians. Like, a LOT a lot - totally wiping out bandwidth in remote locations"
The data is maybe there but if you model everything at once it's going to be a minimally-significant graphviz blob.
Stupid punchline? Execs hold up the blob as scientific proof that their job is hard. "Don't change it! It's so complex and pretty! I can show this to the VP-Manager of Goofball Systems Inc to validate my existence!"
No, it's not hard or pretty, you just can't tell the difference, conceptually, between a hex driver and a lathe.
“Execs”, a reference to the old fashioned setup of a corridor filled with executive offices that all have impressive mahogany desks. Feels like a British phrase to me but I’m not sure if it actually is.
Organizations do implement matrix structures to greater or lesser degrees which have their own advantages and disadvantages. Typically someone formally reports to one manager but will be "dotted line" into one or more additional people.
It's because there's ultimately one person responsible for the company's performance (the CEO), so everybody has to report to him/her through a tree-like structure. A company is a very centralized structure, no different than the army.
"The CEO is ultimately responsible" is just the highest level of the carrot from my post a few levels up. A CEO may seem quite powerful from the inside of a company, but from their perspective they have to deal with competitors, shareholders, suppliers, regulators, and a host of other actors, all of whom have different objectives than the CEO and all of whom can constrain the possible actions of the CEO to varying degrees. Not to mention that many of the managers at or just below CXO-level are highly ambitious people who more likely than not have aspirations to become CEO themselves, so it may be in their interest to do some tactical backstabbing to make the current CEO look bad to the shareholders. All of this adds up to conditions where a CEO definitely cannot do whatever they want, because resources are limited even at this level. Just look at all the failed projects various CEOs at (say) Apple and Google have tried that didn't work even with all the money in the world.
(As a former military officer, this is the same for generals btw. They may have a lot of "power" in the organization itself, but they're heavily constrained by outside factors. They have to make do with the budget they're given, and have very little control over hiring targets etc. Not to mention that during wartime the enemy will not be under control either)
Yeah, of course. The CEO is judged based on how the company under his management is doing in its overall environment (vs what would be the baseline expectations), not just on the absolute numbers.
Perhaps there shouldn't ultimately be one single person responsible for the company's performance. I know the standard way things work, but that's not necessarily the best way.
It's even worse. Even if you made a difference, you won't know for certain, and you'll mostly hear from people who didn't like what you did. One of the clearest successes you can get is avoiding worse shit that would have come to your org. That alone can have a massive impact, many times that of an average IC, but "avoiding worse shit" isn't entirely satisfying. It's just frustrating that it had to be avoided in the first place.
In the more optimistic case, you would need an independent and irreverent labor union with a lot of buy-in and pretense to try to call the shots, in order to do that. In the less optimistic case you would need a social revolution. It's generally more the structure of companies that creates these situations than the individual composition of management.
Assuming one got a Master's, 35 translates to 10+ years of work experience.
I kinda understand why. Programming as a craft, the excitement of mastering it, plateaued for me around 5 years into this profession. I'm not saying there aren't specific domains that require many more years, even lifelong devotion, but for the generic bunch, that would be it. The challenge goes away, the mundaneness of labor kicks in, and you start questioning yourself: what is the point of all this?
For me, I pivoted to one of the author's recipes, reading technology history, to reignite that romantic aspect, and it does work. Still, I think courage is needed to switch tracks and make yourself uncomfortable every now and then in new fields of tech. I was working in the AdTech space; now I am in a more pure ML application space, and I am happy I made the jump. It is pretty rewarding.
Regardless, for fellow engineers I would say: embedding curiosity for ever-new technology into your belief system is critical, and it takes time to realize that curiosity is indeed a blessing rather than a given.
Plateauing after 5 years? That seems quite early. Maybe something >15, depending on one's capacity to go further and learn more. There is more stuff out there than one could learn in a lifetime. Many decades of learning. And if one does not know what to learn next, just learn a new language and see how concepts one already knows apply there. Or pick up a book by the masters of our craft and work through it. For example, one could ask oneself: have I really worked through all of SICP (or insert other great book here)? If not, maybe there is lots of stuff in there to learn.
Maybe one plateaus after 5 years of mainstream every-noun-a-class and endless-design-patterns-forced-in kind of stuff. I guess I would, if I did not look for more elsewhere. I think that kind of job is also why there is disillusionment. One suddenly realizes that at the job one might never apply all the cool things one knows. Then it is up to oneself to either find interesting side projects, deal with it in some other way, or quit.
Yes, and then write a blog post like OP, with 3 paragraphs about how terrible and boring and evil the industry is, and the next 20 paragraphs an autobiographical history of every computer they ever owned since they were a child.
You know how to recognize a burned out case? The one who's making kombucha, not the one posting on HN claiming they're so burned out.
I think that's a really good observation and it aligns with my own experiences to some extent. I started working as a dev/designer at the age of 15 and have spent almost 20 years in the trade. My career progressed fairly quickly, with my first CTO gig at the age of 24, then moving to founding dev/tech-lead roles, which makes me feel super privileged but at the same time worked too well as a distraction from issues in other parts of my life (PTSD, depression).
The last 5 years have been a struggle, there are days, _weeks_ when I just can't code the simplest thing. The irony is that I have the tools to build most of the things I want, both technical and product related (design, UX, marketing), but now my brain takes 10x time to apply them and it just feels almost physically painful. At first, it was weird and scary.
Therapy and moving from a big tech hub to a small town in a different country helped a lot. But, I feel like I have to re-learn so much because most of my life revolved around IT.
I grew up poor and I'm aware of how privileged this sounds, but that's also one of the reasons for my problems: it's hard to make decisions that are good for you and your loved ones if you keep constantly second guessing/judging yourself/overthinking. There are still days I'm terrified of ending up alone and homeless, although I know, rationally, how much I have and how happy I am in my relationships with people.
You see, for me "pure IT" was never enough. But, it was a good way of creating a constant stream of problems I could solve, then get rewarded for solving them, rinse and repeat... This includes dopamine (I solved "the <small design or CS> problem"!), a sense of progression ("I can solve more difficult problems now"!).
Another issue is that in our trade often it's very hard to see the actual results of our work. I mean, actually, physically interact with people and see their happy faces when we do something for them. At a very basic level, this is something that humans need to keep going. I sometimes mentor or just chat with random people in our trade via "office hours" and this is brought up very often, regardless of age and expertise.
On a positive note, I'm aware that with enough work and patience this will get easier, and will make me a better person. Most of the days I feel happy, more than before. At the age of 25 it was easier to work 80-100 h per week to brute force your problems, instead of slowing down and learning how to live with the ape you are.
I just think that the nature of our work enables an unhealthy pattern of avoiding problems. It's easy to get lost in it.
Also started (paid office work) at 15. In '95. And I'm still obsessed with never forgetting anything. Long after the dopamine of solving a problem and moving on to the next, you go back and look at code and think: this was genius. How does this even work? How did I come up with it, and how was my brain working? Much of my old code looks like Brainfuck to me. Whenever I don't have anything better to do, I pry it open and learn how it worked again. Then suddenly I grasp it and know I'm still good.
I never did it for anything other than solving puzzles for fun, and because being a musician doesn't pay the bills. But the joy of code and music are both more about keeping your brain in shape than anything else.
Yup I definitely fall into that bucket of not really feeling the impact of what I build. It seems like most companies are creating "lesser of two evils" products. Users hate most software. It causes people so much headache because UIs are confusing, bugs occur, simple tasks are painstaking and repetitive. People only use software tools because there's often nothing better or simply because a company forces them to do it. Or maybe it's like, yeah service A is janky and horrible but service B is even worse.
Hah. Yeah, I think that describes most people in the industry to a T. It's rare to get paid to do the fun stuff, where the hours just disappear. A guy can still dream, though.
> At the age of 25 it was easier to work 80-100 h per week to brute force your problems, instead of slowing down and learning how to live with the ape you are
My position is slightly different - I can juggle much more stuff now and I have far more threads running in the brain at the same time. I can see much further ahead and I can do much bigger things right now. But one person cannot code such large stuff no matter how productive he or she is. The only way seems to be working as a team. Or with larger teams.
Could you give me some examples of those cycles? Genuinely curious what you mean.
I started my career just a couple of years ago. In my first company, they used 10+ year old tooling, and IMO it was terrible. A very old legacy mess of a monolith that made adding features pure torture. Trunk-based development with a "who needs tests" mindset, resulting in horrendously buggy code: a 50:50 chance that pulling the newest version would break something. Several instances of files with over 10k lines of code and deep nesting, an absolute nightmare. Absolutely no mindset for performance. They wrote quadratic-complexity code in the backend to fetch data, because hash-based data structures already seemed too advanced to many of my coworkers, and then they wondered why their frontend was so terribly slow. Not that they had a lot of pressure to deliver a great product, because they are/were the market leader in their B2B niche. I left within a year.
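The quadratic-vs-hash complaint above is worth a concrete sketch. Here is a hypothetical join of orders to customers, first as the O(n²) nested scan described, then with the obvious O(n) dict-based fix; all names and data are invented for illustration.

```python
# The quadratic pattern: for each order, scan the full customer list.
def join_quadratic(orders, customers):
    out = []
    for order in orders:
        for cust in customers:  # O(n) scan inside an O(n) loop -> O(n^2)
            if cust["id"] == order["customer_id"]:
                out.append((order["id"], cust["name"]))
                break
    return out

# The fix: one O(n) pass to build a dict, then O(1) lookups per order.
def join_hashed(orders, customers):
    by_id = {c["id"]: c["name"] for c in customers}
    return [(o["id"], by_id[o["customer_id"]]) for o in orders]

customers = [{"id": i, "name": f"cust{i}"} for i in range(1000)]
orders = [{"id": i, "customer_id": i % 1000} for i in range(1000)]
```

Both versions return the same rows, but the first does up to a million comparisons for a thousand records while the second does a couple of thousand dict operations, which is exactly the kind of difference that shows up as a "mysteriously" slow frontend.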
Now I'm working in a company that uses all the latest gitlab CI/CD shenanigans, code reviews and heavy use of unit, integration and end to end tests. Everything is hosted in the cloud with a microservice architecture. We actually need it as we scale to millions of customers and have performance and reliability requirements.
The difference is not just the tech stack that was horribly outdated and imo extremely tedious to work with, but also the mentality is completely different.
In the first company, you couldn't change anything, there was a strict hierarchy and everything stayed as is because "it works". You totally got the feeling that there's some old people up in the hierarchy that were way too lazy to learn new things and didn't want to endanger their meaning in the company. When I left, I spoke with the Head of HR and he told me that basically all people that leave do so because of the mentioned reasons. So that company drives away motivated talent with their crap mentality. Pay wasn't very good either, but a first job is a first job after all.
Now mind you, both companies are a couple decades old, but imo one always kept up and the other didn't. Both companies have 10+ year seniors. Personally, the people in the current company are way more competent and excited about work. Much more fun to work with, I learn more and I absolutely don't get the feeling the tooling is reinvented in any way. It's improved in all possible aspects I could think of and makes the development experience much better.
I think the newbie coming to the fancy-schmancy Kubernetes-cluster company in five years, when you and the other engineers have moved on, will have a whole new level of headache-inducing mess to deal with, compared to what you had at the boring company as a newbie.
I started out my career thinking best practices with agile, code review, CI and "shared ownership" and stuff were the way to go.
But in the end I like the old siloed do-your-stuff way more. It works and gives you actual ownership and freedom. It turns out that it is easier to cooperate when you can say no.
I was thinking the same. I feel like I've seen the cycle with my own eyes at this point. Projects almost always seem fresh and good at the beginning, and then they become monsters after a while, seemingly no matter what you do.
I mean GP could have been at a genuinely bad place with bad practices.
It could also be that he was just an idealistic newbie trying to give advice to hardened experts who rightfully ignored it. The HR boss agreeing does not tell us anything; they are buzzword-driven.
> Could you give me some examples of those cycles? Genuinely curious what you mean.
A large part of the work surrounding the Docker ecosystem has simply been re-creating features that were already around in the JVM ecosystem 10 years ago. In the same decade, we also had the move from server-rendered webpages, to browser-rendering, back to server-rendering.
OK, just to be clear: even if the constant stack switching can be very tiring, you have to do it. The alternative is deep stagnation, which is much worse. I am also very much pro everything that raises quality, like CI, but remember, some groups were doing it in the 1970s.
I started programming with BASIC and then DOS and assembly. I have very fond feelings for and deep knowledge of both of them, but the UNIX generation rightly looked down on them as two Turing-tarpit hellholes.
Onward to C, with Mix, Watcom and DJGPP. Better programs, but you live with some stupid inefficiencies that wouldn't fly in x86 asm.
Onward to Win3.1 and Win32. The end user experience is much better, but as a programmer you now have to accept control by the OS over your work. You can't just e.g. write to VRAM anymore. First serious dark clouds appeared for me when I realized Microsoft cynically used us all to extinguish all competition. Politics had entered my IT life.
Then came the web. In one way it was glorious, but programming it in JavaScript was a serious hellhole. jQuery brought some sanity, but the user-friendliness of Windows was almost impossible to reach.
Serverside was Java, which was dog slow until HotSpot appeared, and it ate memory like there was no tomorrow. Sun dictated the very shape of your program. There was a war going on between EE, which was horribly verbose, and Spring, which was grassroots and looked down upon by the architects, as if it smoked weed or something. Whichever camp you chose, pain would follow. Or you could go to the PHP camp and spend more time debugging than programming.
There was some Python here in my life: good, but dog slow, even slower than Java.
Then Node.js. If you thought Java gobbled up memory, you'd just die working with that abomination. End-user usability had still not recovered from the Win95 days (it never did). You had no type safety with JavaScript. In fact, every decent tool and technique was sacrificed on the altar of having the same language in back end and front end. Then came frameworks like Angular, where v2 managed to commit ecosystem suicide, and React. Meanwhile, transpilers, packers, etc. managed to undo much of the no-need-to-compile appeal of JavaScript.
In the mobile world, 2 massive companies appeared, and their app stores killed any liberty of publication.
There is more, but I ran out of time ;-)
All of this is quite ranty, partially deserved, but there is also quite a lot of good in here. Even so, programming for me was most fun on DOS, and the user experience peaked between Win95 and XP.
> I am also very much pro everything that raises quality, like CI, but remember, some groups were doing it in the 1970s.
CI is risky, because it is a great micromanagement tool, just like ticket systems for non-bug tickets. I don't think it is strategic to lay traps for ourselves.
I believe one should have a setup such that "good" management won't mess our stuff up, rather than being dependent on having "great" management.
It is like agile, which only works with great programmers and managers but messes things up for most of us. But CI is not nearly as bad or dangerous, and it has benefits if kept simple.
Let me describe to you a system I've seen myself. I think it was created around 1985, in Cobol, by one company, for only that company. Afaik, it successfully runs today.
At the start, there are screens for what we today call issues: 80x25 terminals that input, edit, prioritize and assign changes. Nightly batches provide management views of what is being done where.
Other screens let you check code files in and out, tracking history, linking to issues and people, and managing which versions are on local, dev, preprod, and prod. Nightly batches run the compiler on changes.
Promotion to preprod requires quality checks, where e.g. no new deprecated calls are allowed. Promotion to prod requires a manager sign off, where the input from the test team is validated.
I had not seen this level of integration again until GitHub matured. In some ways GitHub is superior; in other ways, the old system's deep integration with the company's procedures and tech choices is still superior.
That's more than three decades, maybe even four, that this system has paid off. It survived the departure and replacement of a whole generation. It survived all attempted managerial reorgs, and thank god for that. It came from a time when computers were so expensive that having the manpower and long-term vision for this was a good choice, even for a single company. Unfortunately, it also makes new people relearn everything they know about version management.
Yeah. CI systems can be beautiful. And in some companies you want some sort of formal sign-off process. I am not dogmatically against CI.
> It survived all attempts to managerial reorgs, and thank god for that
The problem comes when it is cargo culted and forced I guess.
The temptation for some manager to rewrite the system you describe in Groovy and use Jenkins or integrate it into Jira! Imagine the possibilities of unnecessary work and complexity. A big opportunity cost.
> I started my career just a couple years ago. In my first company, they used 10+ year old tooling and imo it was terrible. A very old legacy mess monolith that made adding features pure torture. Trunk-based development with a "who needs tests" mindset, resulting in horrendously buggy code, 50:50 chance pulling newest version would break something.
Monkey paw curls.
My story is the complete opposite. Three years ago I joined a startup.
We use a relatively new Java, branches everywhere, a microservice architecture, 85%+ branch coverage, integration tests, end-to-end tests, performance tests, you name it.
CI/CD integrated and self hosted. Heaven, right?
It was an absolute shit show. Because of the microservice architecture you had no way of running the 50+ necessary microservices on your machine.
Tests are mandated but brittle. Mocking libraries break whenever you refactor a method. Integration tests are flaky and inconsistent (they behave differently locally vs remotely).
An end-to-end test run takes hours to complete. There are 20 different environments, each with a different configuration, each divided into dev/qa/prod.
In the whole time I was there, we didn't have two successful deploys on the main branch. But you have to keep adding features because one customer said they might want them.
Oh, security found that a library is 20ms too old. It has to be replaced asap, despite the convoluted nest of dependencies between microservices.
It had good pay though.
Taught me to really hate mocks, and that tests need to be applied at the right level.
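A sketch of the "mocks break whenever you refactor" complaint, using Python's `unittest.mock` as a stand-in for whatever mocking library is involved (the `PriceService` class and its helper are hypothetical):

```python
from unittest import mock

class PriceService:
    def total(self, items):
        return sum(self._unit_price(i) for i in items)

    def _unit_price(self, item):  # internal helper a refactor might rename
        return item["price"]

# Implementation-coupled test: patches the private helper by name.
# Rename _unit_price during a refactor and this test fails with an
# AttributeError, even though total() still behaves identically.
def brittle_test():
    svc = PriceService()
    with mock.patch.object(PriceService, "_unit_price", return_value=10):
        assert svc.total([{"price": 1}, {"price": 2}]) == 20

# Behavior-level test: exercises only the public contract,
# so it survives internal refactors.
def robust_test():
    svc = PriceService()
    assert svc.total([{"price": 1}, {"price": 2}]) == 3
```

The brittle version isn't wrong so much as pinned to the implementation; "tests at the right level" here means testing the contract, not the call graph.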
A microservice is a module: a module that got separated by a network layer, most often due to somebody's momentary lapse of judgement.
You're encouraged to forbid the next person from falling into the identical trap (you effectively say: this kind of remote module must not use further remote modules). Alas... they can, and they will.
> Microservices can't have dependencies between each other otherwise they aren't microservices.
See Hyrum's law:
Put succinctly, the observation is this:
> With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody.
One example: we bumped Spring from 2.1 to 2.4 (not the actual version numbers). Harmless, no? What's the worst that can happen?
Failure when doing some but not all operations.
Why? Because some Python/Java microservices down the operation chain expected null fields to be omitted, and the default behavior changed between Spring versions to write null fields instead. It only broke the services that relied on null fields being omitted. The fix was easy, but finding the bug was difficult.
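A minimal sketch of that failure mode, with plain Python `json` standing in for the framework's serializers (the record and field names are hypothetical):

```python
import json

record = {"id": 7, "discount": None}

# Old default: the serializer omits null fields entirely.
old_payload = json.dumps({k: v for k, v in record.items() if v is not None})

# New default after the upgrade: null fields are written out as JSON null.
new_payload = json.dumps(record)

# A downstream consumer that used key *presence* as its check,
# instead of checking the value:
def has_discount(payload):
    return "discount" in json.loads(payload)

# Same record, different answers -- the unpromised behavior
# somebody depended on.
assert has_discount(old_payload) is False
assert has_discount(new_payload) is True
```

Nothing in either service's contract mentioned null handling; the dependency only becomes visible when the default flips.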
At some point you are going to have another service that uses this HTML->JPEG service though. That would be a dependency, at least in my view (ie, if the HTML->JPEG service goes down, something else will break).
What you are describing at the old company is not a failure of old tools, but rather a failure of management/employee self-management at that company.
Any tool can be used to do good or evil. They were using old tools to do evil things-- namely, writing bad code.
The only caveat here is that if I had to maintain bad bash scripts or bad koobieboobie cicd automated shlalala, I'd always choose bad bash scripts, as the blast radius is smaller and easier to reason about.
> In my first company, they used 10+ year old tooling and imo it was terrible. A very old legacy mess monolith that made adding features pure torture.
Everything becomes like that: legacy, torture, mess. New things come along: clean, new, solving some problem. The mess disappears from one place but starts popping up somewhere else, though you think it's still better than before. Wait 10 years and you've got a completely different mess; lots of the people who built it have left, and the few who know it all have stopped caring. A new you joins the group. Sees a crazy, unwieldy legacy system. Sees new technology that solves these problems. Starts over.
>Everything becomes like that, legacy, torture, mess.
>few know it all but have stopped caring
If they had kept caring (and been allowed to, by being listened to; maybe that's why they stopped), it might not have turned into a mess. I know (of) 15-year-old systems that only got better with time, thanks to lead devs playing as both conductors and musicians.
I think it has to do with how boring the thing you're programming is.
If you're working on some humdrum mobile or web app, it's hard to stay excited. Making apps that clearly do not need to exist takes its toll. I saw the writing on the wall and got out early. For me the solution was to go back to graduate school. I figured that I would be able to find an application area to work in with some real depth and let that drive the programming, instead of working on something dull where I would need to find my excitement in the programming itself. Now that I've done this (I have my PhD by now), the programming I do is quite dull, but the applications are fascinating. What's more, what I work on now has so much more depth to it compared to programming (I do computational math). To attain technical mastery simply requires dramatically more time and effort than programming. I think in this way the proper balance is restored between the tool and its use: a hammer is simple and understandable, building a house is deep and complicated. My hope is that one day people will have more opportunities to use programming for the powerful computational tool that it is, rather than as a means to make a quick buck.
Not at all. As a senior, you notice that even if you can do a lot of things, almost everything, right now, you are still just one person. Even if you are a mythical creature who can put out dozens of times more code than others, you notice that what you can physically produce is always far behind the possibilities you can see on the horizon. Then you realize: you need to cooperate and work in teams. Only by collaborating can you make happen all those things you can envision. Which obviously takes you toward leading people one way or another.
> A big part for me is that IT just doesn't learn. Every 5 years, a new generation pops up, invents completely new tooling, and makes all the mistakes from the previous generation.
It doesn't learn, because it's chasing a fad or just considers everything before it crap.
As I said, those that don't understand old systems are doomed to reimplement them... badly.
As someone approaching this 35 year limit, I never once had this problem. Jobs are kinda meant to suck. If you want enjoyment, start a hobby.
It's what happens anytime money becomes the point. Call it extrinsic motivation if you prefer. The folks running the company no longer have nor engender a sense of mission; the folks flocking to your career are chasing dollars rather than love or mission. It feels soulless because it is soulless, and if you're one of the few who genuinely care/love, it's crushing.
It's hardly exclusive to programming. My father was a plumber in love with the craft and even the art of it, and by mid 90s the industry had crushed the love right out of him. Happens everywhere.
> if you're one of the few who genuinely care/love, it's crushing
There's an upside to disillusionment in that we are stripped of all the ego and false beliefs we hold, as painful as it is. If you're one of the few who genuinely care/love, you'll come to the realizations you did. But the way out of the valley of disillusionment is to realize it doesn't matter how others approach it, because they are irrelevant. You get to do what you love and that's all that matters.
Only the creator will ever truly understand the labor and love that goes into his creation, to expect that level of appreciation or care from others is unwise and irrational. In fact, holding that expectation might even reveal hidden motives that imply one's "love for the craft" is not as genuine as they'd think.
The problem as I see it is that you have to care because the people who are just in it for a paycheck are on your team. Your boss is going to want to know why you aren't as fast as the developers who cut corners in order to get features delivered.
One of the things not mentioned here is that doctors are genuine professionals.
Software developers, on the other hand, are in this weird limbo where some consider themselves as professionals and others do not. I am using the term "professional" in the wikipedia sense https://en.wikipedia.org/wiki/Professional.
Similarly, some organizations treat software engineers as professionals, and other consider them closer to manual labor.
The result is there is no shared understanding what it means to be a software engineer. Therefore it's quite hard to discuss software engineers as a single body of professionals, as the loose definition of the field means there won't be a single definition everyone could fall into.
What I read from the article is that the author defines his "professionalism" as "inhabiting the monastery". That is fair and good, but there is no reason this definition and feeling should be shared with anyone else in the field. This lack of shared definition probably drives a lot of disillusionment.
Someone thinks they are entering a monastery, but the next guy is there just to sweep the floors for a living.
In monasteries most people are there to search for the higher purpose. But in software industry, "sweeping floors" is the most plausible job description, where no holy insight is expected to be gained.
The described disillusionment might not be because the business is pathological (which it may very well be) but because the practitioner entered the field with false expectations, imagining perhaps grand technological projects to move humanity forward, but finding themselves sweeping the factory floor.
Another thing that comes to mind: generally, once you've been in the industry for a decade or so, you are middle-aged, and probably entering your midlife crisis, where stereotypically people re-evaluate their priorities and life goals. Which, given that software people are generally not the dumbest people around, means there are a lot of displeased people searching for more meaningful careers; and given their general aptitude for complex tasks, they can find a lot of other fields (that don't need years of complex studies) where they are quite good as well.
Most of the conflicts I have had in my career were due to this misunderstanding. Someone in a manager position telling me to sweep the floor in a certain way and me telling them that it doesn't make any sense to even sweep that floor since we are getting new hardwoods installed tomorrow.
From their perspective I'm supposed to do what they said exactly as they said it, and from my perspective they are clearly not qualified to be making decisions on how the floor should be swept.
The IEEE Computer Society has been attempting for years to "professionalize" the software development field with a defined body of knowledge (SWEBOK), code of ethics, education and experience requirements, and certifications. It's a noble goal but so far no one really cares.
I think a lot of people just don't understand the terminology. One can be a professional software developer, but for the most part software development isn't a true profession. It's more like a trade. There is a difference.
The problem in my eyes is that we're often not allowed to be professionals.
Being a software developer can often feel little different from being in a high school programming class. Paradoxically, this feeling can have a positive correlation with years of experience. Looking back, I should have cherished my time as a junior developer because, frankly, I got to do more interesting things, write more original code, and actually have more say as to how work was going to be accomplished. By the time of being a high-tier mid-level or a senior engineer, the work not only becomes less interesting but is entrenched in a company's way of doing things. Somehow, senior engineers find themselves really not having as much power or influence over what they're working on. Hell, even lead developers often have limited power and are more or less supervisors with the responsibility of writing code. While the senior engineer's non-programmer friends get their own office or cube, he's sitting at a "shared" desk that's more or less a glorified high school computer lab with no personal touches.
Few software developers manage to distinguish themselves or obtain a semblance of prestige. Those who do usually don't entirely deserve it, in my opinion. Software developers with name recognition today either are in the business of shilling books or founded a company that makes some parasitic social media product that makes the world a worse place. Or they're promoting some paradigm as gospel. Some exceptions include language and framework inventors; I think people have generally good opinions of Guido van Rossum, DHH, Ryan Dahl, Rich Harris, and so on. These people are still largely nobodies outside of software developer circles, despite the impact their creations have had on the world as a whole and on the careers of countless others.
I think a major part of the disillusionment that goes unsaid is that "tech" today is far more associated with cyber dystopias than any sort of positive view of the future where software and humanity work together to solve real problems. This isn't to say the pessimistic view is a new one, because it's been around since time immemorial, but I do think that the institution of Silicon Valley isn't seen as favorable by the mainstream as it used to. When I got into software full time, programming as a career was in a bit of a renaissance after the recovery from 2008 and the easy money started really flowing. Joining an "innovative" startup was considered really cool, even by people who weren't programmers. Today, working in software and being a part of this industry is much closer to being seen as selling one's soul to the devil. When people think "tech", they think of cryptocurrency scams, Mark Zuckerberg, useful features disappearing on sites like YouTube, devices like Echo/Alexa that fail to live up to their promise, apps like TikTok warping the minds of young people, Tesla cars with their false advertising and shoddy production quality, the increasing prevalence of ads on streaming services... the list goes on and on. It all comes off as a giant swindle.
The health care industry is a swindle too, but it still solves life and death problems. Most "tech" doesn't, but is a psy-op to dupe people out of their time and money. Doctors and surgeons don't completely change their toolset every 5 years because of the opinions of other doctors. Gee, I can't imagine why frustrated doctors stay in their field while programmers throw in the towel so often.
Finally, I do think there's an unshakable taintedness to being a programmer while there's simultaneously an unshakable romanticism around being a doctor, or a lawyer, or even a starving artist. Programmers invest a ton of time and effort into learning their craft, and some end up sacrificing their social lives in their 20s with the belief that they'll really start living in their 30s after they've made a lot of money and have that senior title. They come to find that half their paycheck goes to the state, the women they're romantically interested in think programmers are "eww grody", and they've become pseudoautistic from a lack of healthy social interaction. A woman programmer may feel even more disillusioned because of biological reasons, and without the sort of respect they'd have had if they went to medical school.
Some part of you has to love the field of software in order to stay in it. Unless you make beyond the average six-figure paycheck, you probably won't find fulfillment from anywhere but within.
> world of corporate politics, bureaucracy, envy and greed — a world so depressing, that many people quit in frustration, never to come back.
I think one reason people get disillusioned is not simply because those things are depressing, but because those things exist at the same time that there's a strong disconnect between what's the right thing to do and what will bring the company most _money_.
The depressing part is having to go through all of these hoops and realize that it was all for nothing, because the end result is either not what the user truly needs or outright evil. I just want to build things that helps others.
The evil thing is what upsets me the most; so much of technology appears to have a truly sinister side nowadays. Perhaps it always did and I'm only now seeing it.
• Microsoft made it impossible to install Windows 11 Home Edition without a Microsoft online account. I know of no FOSS with the nerve to try something like that.
• Non-Free games that charge real money for in-game cheats, and are of course designed to prevent you from manipulating your own game-state
• Mobile apps that request clearly unnecessary permissions, for reasons never revealed
• Mobile apps that sell your location data, and any other data they can get their hands on, with minimal regard for how this might impact your physical security
• Lies of omission about fixes to security flaws, and their specifics
• Intrusive telemetry that can't be disabled (although FOSS doesn't offer a total guarantee against this, see Firefox)
• Lies about security properties that are hard to verify without access to source code, such as falsely claiming proper end-to-end encryption
Deceptive marketing/monetisation, as in using psychological tricks to get people to spend money on things they don't want or need. Basically how every modern app and game works nowadays[0].
I left software engineering after about 10 years to become a patent attorney. I'm about 3 months in (post-law school).
I left software engineering because I just couldn't see myself doing it for another 20 years, especially if it was going to involve even more meetings (ie. management). I chose patent law because it seemed tech-adjacent.
I'm in biglaw now. Too early to say if this was a wise move or not, but I do find myself fondly reminiscing about my cushy life as a SWE often, already. I have to work much, much harder as a patent attorney and have much less free time and time off. Yes, I get paid more, too. This is partially a function of biglaw, and partially a function of the legal world in general being tethered to the billable hour.
I guess my point is to take careful stock of the lifestyle being a SWE affords you relative to the workload and compensation before making any drastic moves.
Funny, I did exactly the same. Bought a 486 off eBay, got myself some Turbo C and a bunch of old books on computer graphics. It's a very satisfying feeling compared to working on some hobby ML project or some other fancy new web thing.
My day job (web dev) is like the same house being flooded with sewage every day, and I need to clean it up quickly, and also put a smile on my face and say how great it is that I managed to do it all today, and suggest new ways of cleaning sewage better.
I'm in the same boat, to the point that I have contemplated starting a business making houmous as a street food vendor.
The author's article does resonate with me but to add, I think the software game has changed a lot since the dot-com era.
It has changed for worse, we are disconnected from the hardware and the users.
We are just middleware integration specialists, depending on an ever decreasing amount of pioneers building systems, frameworks or low level processes.
I want to build Dijkstra's algorithm, not write integration tests for terribly made APIs from Stripe.
Was it not always so? It is so in every single technology: a layer is built and its complexities are solved, enabling people and society to move to the next level and build another layer on top of it. With every step, more layers are put between the lowest level and where we are. But every layer pushes us up one level more, allowing us to do things that were unimaginable before.
Technology has always been middleware. Somewhere someone discovers a pioneering, new implementation or tech. The rest of the process is doing middleware to bring that tech to the people.
> My dad introduced me to the genre with Jules Verne's The Mysterious Island, in which a team of five end up on an uninhabited island, and use their knowledge and ingenuity to rebuild a technological civilization from scratch.
I used to read quite a lot when I was in my early teens, and this is one of two books that I vividly remember (the other being Around the World in 80 Days).
I sincerely believe that this idea of rebuilding civilization from the ground up was responsible for me getting into STEM and becoming who I am now.
For me this manifests as changing jobs about every two years.
I also keep a Fred Brooks quote taped to the wall behind my monitors:
The programmer, like the poet, works only slightly removed from pure thought-stuff. They build their castles in the air, from air, creating by exertion of the imagination. Few media of creation are so flexible, so easy to polish and rework, so readily capable of realizing grand conceptual structures.
The magic of myth and legend has come true in our time.
Fred Brooks, The Mythical Man Month
It doesn't fix alienation, but it helps. I particularly like to share it during interviews to get the vibe of the people I'm talking to and potentially going to be working with.
If you like programming like a craft or something almost magic, then work on something like that. There are jobs where the ratio of algorithms and problem solving to plumbing is perfectly fine. You don’t have to work with apis and databases and and yaml.
I think too many get into software because of that magic feeling but then end up doing plumbing. Of course that’s not going to be as fulfilling as what you did on your ZX. But people plumb along or think that maybe it’s just time to go into management, they feel “done” with development. And that’s sad.
I believe that the more fundamental issue is that too many devs are 1) working on the wrong thing and 2) too disconnected from the users of their product, especially in larger companies.
1) I work at an early stage startup, and besides the tech challenges, what brings me most joy is our mission and vision of the product: to reduce the amount of time healthcare professionals are stuck doing administrative work, in turn allowing them to more effectively deliver care. I admit I was lucky to find a place building a product I can care about, and that it's a privilege. But for the love of god, please (try to) work on something you actually care about. For me that means building something that adds value to society, instead of e.g. trying to make [big corp] more money by ad optimization. I don't think I would feel fulfilled working on something like that, even if the engineering challenges of working on such a project could be great and fun.
2) The author mentions that doctors avoid leaving the sector, because they can see the impact they're having on their patients. It seems to me he missed the obvious analogy to us developers and the users of our software. If you never engage with the users of your product (which I do believe happens a lot, especially at larger companies), how will you know what value and joy it brings them? Staying close to the user, trying to understand them, is imo one of the most important (and fulfilling) things we can do as developers. It also allows us to do a better job, not in the least as a result of actually caring how your software will affect people.
TL;DR try to work on something you actually care about
Regarding your joy in working in tech, what types of administrative work can technology reduce for healthcare professionals? I find it depressing to see my general practitioner in front of a computer, typing in stuff, taking more time than it used to with paper.
Thanks for the interest! The things you mention are exactly what we are looking to solve! Basically we are doing speech recognition specifically for the healthcare domain, so that for instance your GP or nurse can just report by voice instead of typing. It can really save them a lot of time that they can then spend to provide better care.
Also, electronic health record systems (EHRs) like Epic generally have horrible UX, e.g. navigating to a specific page might take 6+ clicks. We'd like to improve that by for example enabling navigation by voice, filling in forms by voice, etc.
This seems like a fantastic idea. Voice assistant which could replace mundane navigation through web interface seems would be desirable for a greater good. Good luck with execution!
This is surprising to hear. From my perspective as a computer scientist, being a doctor strikes me as one of the few jobs where you are obviously and directly helping people, thus "making the world a better place" (as we CS folks sometimes like to think we do).
That probably depends heavily on the field and even then, the endless flow of new patients can make it seem hopeless. Sure you patched up two guys today, but there will be five more tomorrow and every day after that.
It's even worse if the patients in question don't just suffer from "unjust" illnesses that just happened by accident but are instead stubborn alcoholics or something like that. You can treat someone for the symptoms of liver failure, but you already know they'll be in again next week because they can't leave the bottle alone. And when this patient dies there will be hundreds more next week.
When my grandfather was in the hospital I got to talking with the patient in the next room, who was recovering from a quadruple heart bypass operation. He didn't like the hospital food, so he convinced his family to bring him a double cheeseburger and fries from the fast-food restaurant down the street. That's got to be a little disillusioning for doctors.
I have a few friends who are doctors. One is a primary care physician, and he talks about the frustrations of patients refusing to take care of their health or adopt behavioral changes he recommends. Most just want a prescription so that they can continue their bad habits. The other said he went into surgery because he didn't want to have to work with patients who won't take care of themselves. However, he occasionally loses patients on the operating table, which never gets easier.

As an undergraduate I considered going into academia, and one professor cautioned that I needed to be prepared to see the ugly side of academics and be willing to tolerate seeing something I love be sullied by inter-departmental politics, publish-or-perish, difficult students, etc.

In short, I think all professions have their upsides and downsides. I think the answer is to care about your craft for your self-respect. You have to do your best ultimately for yourself. We're putting a lot of our time/life into our craft, so it seems to me we owe it to ourselves to make it mean something, regardless of the reality of coworkers, the need for a paycheck, etc. I will also admit that living up to such a challenge is a daily struggle, and I don't always measure up to my own ideals. But I'm always trying.
I think disillusionment in software development is mainly about the social interactions at companies rather than about the technology itself. Sure, all these hype cycles are annoying, especially if you believe in them and they turn out badly. But they are only truly annoying if you are forced into them too early by the usual dogmatic evangelists using social shaming, management pressure, or what not.
But in the end the old tech is still there for you.
> Yet I haven't heard of a single doctor who quit his practice and moved to Colorado to run a ski lodge. When I ask why, they all give the same response: the patients. Every day they see patients brought back to health, the bullshit recedes into the background, and they're reminded why they got into medicine.
That kind of actual positive impact isn't the norm in tech.
I was so lucky as to become the TL of the Code Search UI at Google less than a year after I joined. While there were certainly frustrations, one thing that was wonderful was meeting random SWEs and having them go "Code Search is awesome!" That's the equivalent of meeting the patient. Getting regular reminders of how what you do actually helps real people is a big booster. It's certainly difficult for some programming jobs, but if that's the case, maybe ask yourself if you're working on the right thing.
maybe dentists wouldn't be lied to if they could actually tell if you floss or not. i always start flossing 2 weeks before my biannual cleaning and consistently receive glowing reviews from dentists on my dental hygiene.
I got enormously disillusioned with Wikipedia. Then I decided to contribute to the Women in Red project. All my disillusionment went away! I wrote articles about Australian women - my last one being Kate Baker who championed the first truly quintessential Australian writer, Joseph Furphy.[1]
Then I got banned and all of that came to an abrupt end.
Now I’m using Wikishootme to document South-West Sydney as it’s unloved by politicians and those from the East and North Shore of Sydney. I’ve discovered the area I live in through the act of taking photos of the place I live in. It’s fantastic again!
I like to work closely with the users of the software I’m working on. I volunteer to attend their meetings. I ask them how new features are working. I solicit feedback. It’s very rewarding even if sometimes it can be uncomfortable when things aren’t going well.
Wonderfully written; I think most software developers born in the 70s-90s started like that. I'm not sure I understand the advice at the end, though. If one becomes so cynical as to consider pursuing manual labor instead of their life's passion, wouldn't all manner of reading, hobbies, and hacking end up in the same "what's the point" hole? What's the point in coding a game of Pong that will end up on a shelf, used by nobody? What's the point of being inspired by an old book if you're still writing CRUDs?
The AI revolution can't come soon enough. We'll all be out of a job, but perhaps we'll all be better off.
It can't come soon enough. I can't escape this irritating boring profession because of family responsibilities. If I am forced out I won't have to shoulder the guilt.
I got around this feeling of disillusionment by changing track to vulnerability research and exploit development. Now I spend my days figuring out how things work and how to break them to do things they were never designed to do.
It's so much more satisfying than the tedious, unfulfilling, run-of-the-mill software development crap that I was doing prior to this. The nice thing about this switch is that all my previous experience in the field comes in very useful, unlike if I had changed careers to something entirely different.
> One reality is the atmosphere of new technology, its incredible power to transform the human condition, the joy of the art of doing science and engineering, the trials of the creative process, the romance of the frontier. The other reality is the frustration and drudgery of operating in a world of corporate politics, bureaucracy, envy and greed
To any programmers feeling similarly disillusioned with the "world of corporate politics": there is a war going on (the War to Liberate Ideas). Join the fight.
Lieutenant, if you go to that war, arm yourself with a more googlable name.
The phrase "War to Liberate Ideas" gets you nothing on google/bing/ddg, with or without quotes. I think I know what you are referring to, but others don't.
In the USA, it's to end copyright (Intellectual Freedom Amendment - https://breckyunits.com/the-intellectual-freedom-amendment.h... - basically freedom to publish) and censorship. In other countries, it's those things as well as more basic things (freedom of speech and the press).
>The internet desensitized me to text and video. Anything that happens in front of a screen doesn't help with perspective anymore. But I discovered that getting away from the screen and handling physical items does.
A little late on this one, but I've been recapping and fixing an old SE/30 I've had since '98, and it's been rewarding to do. I've also connected with old friends while doing it. Doing things in meatspace that are tied to computing has been really fun and running System 7.5.5 on an SE/30 also reminds me of how far things have progressed from back then while still some things seem timeless. It used to run NetBSD and run as a server ages ago.
I also collect Roman and Ancient Greek coins (there are cheap ones), have a Roman emperor list filled out, and have a lot of lovely Greek art I've picked up over the decades. I just started a new Byzantium & the Middle Ages collection, building out an eastern Roman emperor list. Holding each coin and learning about the era depicted and the people of that time is fascinating for me. It won't save you from burnout over tech, but it's always good to have hobbies that keep your spirits up.
Just don't make your life revolve only around software development. Try to limit extra hours to what's truly necessary, knowing that yes, they may boost your career for a while, but they both diminish your work quality and bring you closer to burnout. In your free time, try to do other activities: reading, fishing, hiking, learning a new language (not a programming one) through books rather than the internet. That's what I do, anyway.
Wow, I read “Once you observe the darker side of human nature in the technology industry, you cannot forget or unsee it.” And then I hit the (skippable) paywall.
I laughed, goddamn this is funny. An article about how exhausting this shit is and there’s fucking dark patterns shitting all over it. It’s so perfect, like a grand satire. Performance art done in software, but fully unintended and shockingly cynical.
If being in tech requires crafting your own rituals in order to stay engaged, what's the motivation not to simply quit and start brewing beer and growing tomatoes while still keeping your own tech rituals? If the fulfillment of working with technology comes from outside having technology as a career, why stay in technology as a career? Compared to the bond mentioned between doctors and patients, I still don't see any compelling reason that staying in tech as a business, in its current state, is worth it even to the most enthusiastic about technology.
At least for me, and I am going to guess the majority of us, the comfort of a good paycheck outweighs the disillusionment, despite the result being a feeling that you're wasting 1/3 of your waking days on something you don't care about.
Every time I look at some flashy purchase that I'd like and can get with finance (e.g. new car), it always seems affordable, but the long term cost is that I have to rigidly stay in my income bracket or rise above it. There's no room for risk to fail.
>Yet I haven't heard of a single doctor who quit his practice and moved to Colorado to run a ski lodge. When I ask why, they all give the same response: the patients. Every day they see patients brought back to health, the bullshit recedes into the background, and they're reminded why they got into medicine.
>The default in engineering is different. We don't have a daily ritual built into our jobs that reminds us why we got into the field.
This article misses that for most software engineers there is no monastery. Unlike a doctor, who actually is doing the thing they care about (healing patients), most software engineers aren't doing anything related to expanding technology, or even using technology in a skilled way. It's just an empty race to the bottom.
The whole setup of expectations in tech is contrary to happiness for most people. The ideal of 'building the future' and other silly values such as 'change the world' are without content. Build which future? Change the world how? These empty values cannot hide the inherent meaninglessness of most tech jobs.
Moreover, the pace of innovation in tech can burn people out for sure; trying to upskill all the time is no way to live as your brain changes and family life competes for time and attention. But more than that, the constant mini-revolutions in tech shorten history and reveal to anyone who cares to look the graveyard of innovation in miniature. It takes a strong will to consider without flinching that years of effort, success, and failure were transient curiosities to the world.
In a tech career you must plan for your own obsolescence unless you move into the 'stable' trajectory of management. That's rough on the ego of any healthy person. That's not even to mention the soulless bureaucracies that constitute most of the job options these days.

The solution is to find sources of power and leverage for oneself. This is admittedly hard to do in management-dominated jobs unless you also have the skills for management. If you have management skills, you can avoid burnout by avoiding notions of 'fixing organizations' or 'improving processes'; these are dead ends. Find opportunities for expanding the things you control and minimizing the things that control you. If you are a technician who is loath to become a manager, understand that you also need to maximize your leverage. This comes with control through skill and ownership, not skill alone.

People who want to see this advice as cynical are being narrow-minded. Nothing about having leverage means you can't help people succeed, or be a team player, or be a 'good' person. In fact, unless you are a sociopath, you must have friends and allies and be committed to their well-being to be happy. What it does mean is that you have the power to do all those things and not be helpless. Helplessness is the mind killer.

The whole 'beginner mind' business is, paradoxically, a hack to put yourself back into an expansive state of power (the opposite of powerlessness). But it cannot last. Unless you are a happy hermit, you will want to be in the world, with people, with organizations, with politics, with friends, with enemies, with family. Technical competence, among its many uses, is a kind of armor you put on to conquer, but it isn't terribly useful for holding on to what you've gained.
> setup of expectations in tech
In my experience, the people who plaster 'building the future' shit on every wall aren't engineers but the HR department, which is happy to exploit that unique amount of energy that comes from combining twenty-year-olds with no life and vague idealistic promises.
Zero Credibility is a weekly column of essays ... on engineering the future
HR marketing pitches to attract talent by suggesting the prospect of important work are used because they work. Especially with millennials. Maybe the newest hires are mercenary, transactional zoomers and we can finally put the aspirational mission statements to bed.
It's all a matter of mindset: changing work and challenges often, learning something new, and staying away from all non-core discussions, especially politics.
I think you're right. Another big aspect is the why behind getting into the industry. Myself, I ended up getting here entirely by accident. Ended up finding that I was good at it, and good at selling myself as a developer to employers.
But the reason I got into this wasn't because I had some idealistic reason behind it. I got into this because it provides a job that gives you the salary, mobility, and time to yourself that lets you do anything you want when you walk out of the office.
I love my job, I like doing software development, but for me it's never been about the ideals behind it. It's always been about what being a developer lets me do with my life.
Looking at it that way, there's really nothing to be disillusioned by.
I think the people who learned it before 2000, at least, wanted to be computer scientists. Then after 2000, a lot of people decided to become software developers just because of the money. I belong to the first group. It makes me happy to stick to the bits and bytes and the hacker mentality that I had in the 90s. On the other hand, my wife is a teacher and she has more reason to be disillusioned by her job than me :)
Ha. If money has little value to you, if you value only meaning, then there is nothing anyone can do to help you be happy in this profession. If you want to challenge yourself intellectually beyond the minutia you are out of luck. If you want respect you are out of luck. If you want power, you are out of luck. It's a dead end profession and hopefully it will be automated away soon with LLMs.
I'm 41 and have been in the industry since my early 20s. There is no intellectual challenge here, because there is no art to what you describe, just man-hours.
- I think part of this is related to this era. Even MIT reopened glass-smelting fabs. People need more stimuli than digital projections inside chips.
- All we really want is to contribute to others. Do something nice that is never going to be of use to anybody and you'll feel down; do something super trivial that gives someone their smile back and you'll be proud of your day.
I don’t think professional software development is worthy of being compared to monasticism. You shovel the shit and get paid. Nothing wrong with that, but it is what it is. No use pretending otherwise.
There are so many other things to pursue in life besides computers. It’s ok to leave computers behind. It’s not a sad thing, it’s a happy thing. If it’s time to move on, realizing that is healthy and good. You don’t have to hate it to move on, just decide that there’s more to life.
There's shoveling the shit and riding the horses, and then there's just shoveling the shit and getting paid. These are normally considered two very different jobs, but maybe software development conflates them more than horse training does?
What do I do if I am not content shoveling shit? I'm exceptionally bright, like many others in the field. When I chose it I did not expect it to be like this. The disrespect, the monotonous work, the hyper analytical and socially oblivious coworkers. The complete lack of promotion opportunities. I'd quit to start an unrelated business, but the financials just don't work out. I'd rather be working the fields.
It doesn't have to be like that. I'm not saying everyone is going to find their dream job, but it is definitely possible. If you're in love with software development, as I am, there are just so many different sub-fields and niches with so many different kinds of work.
There's scientific computing, embedded work, game development, tool development, industrial stuff, robotics, UAVs, software work in any number of tech-heavy industries that might themselves be interesting - just to speak of things I personally might find of interest. There's even Web development and enterprise software. And a zillion other kinds. All of which will have their own differing mix of problem type and day-to-day work and just "look and feel".
My current work means solving complex issues with a high degree of autonomy, new projects arriving at a comfortable pace, never the same thing twice (but enough similarity in projects not to be starting from zero all the time), a decent pace of new technologies to learn but without pointless hamsterwheeling, and so forth. We have genuinely good folks on the team with a good mix of experience levels, and great management who view their job as clearing obstacles from our path so we can do our thing. In short, I love where I'm at and what I do. Which has not always been the case in my career.
For a different person, my life at work would probably be an agonizing slog into burnout, or maybe just pure sheer boredom. I dunno. But the point is there are just so many different work situations available out there - many of them nothing like the ones you hear about on tech sites - and it's possible to find a different work situation that's a lot closer to what you need to thrive.
You can choose a life of enjoyment or you can choose a life of making money and being annoyed all the time. Pick one. If you can accept it’ll often be annoying it gets a lot easier. You can job hop to make more money and get more experiences but it’s all the same BS anywhere you go. Even if you create your own company and hire people then the same sort of stuff happens too. Can’t really escape it.
"When you wake up in the morning, tell yourself: The people I deal with today will be meddling, ungrateful, arrogant, dishonest, jealous and surly." — MA
(NB. dude was Emperor of Rome and couldn't really escape it)
You can't really just choose a life of enjoyment unless you have the financial means to support yourself in some realistic way. Most fun and enjoyable work is poorly paid because there is an army of people out there who will do it for free/as a hobby/for the clout.