This is why software devs are not professionals. A professional engineer will not sign off on a bridge that he knows is liable to collapse. Software devs will build whatever dangerous, immoral garbage their boss tells them to, and then rationalize it to themselves.
A professional has an obligation to a code of professional ethics that supersedes loyalty to their employer. Nothing of the sort exists in software.
A licensed engineer who signs off on a bridge that collapses will not remain an engineer, and may be open to criminal prosecution. Their employer knows that, and therefore doesn’t ask them to make that choice. In the rare cases where they do, the engineer doesn’t end up blacklisted across the industry for saying no.
You're mixing up the words engineer and professional.
A professional can still be a mere subordinate who just follows orders.
I don't know why it's so popular to conflate the words engineer and developer, to the point where simonw decided to drop the most important word, "software", and started calling AI-assisted software development "agentic engineering", which is the most absurd oxymoron you can come up with.
The person prompting for code is delegating the majority of decision making to the AI. This is the antithesis of engineering. Hence the operator cannot be the "engineer", at best the AI can be the "engineer", if it is smart enough.
The word engineering implies a task with trade-offs, guarantees, and expectations about the finished product. The vast majority of software isn't important enough to even know what the specifications are or what features it should have ahead of time. You throw something at the wall and see what sticks. "Agentic engineering" just accelerates the process of throwing things onto the market.
Then there is the fact that "engineering" has become a euphemism for software and nothing else. Anything physical is excluded from the start.
Finally "agentic engineering" implies that you're engineering the agent, but you're not doing that either. You're just a user who set up a sandbox and is letting the AI loose.
Engineers are only one type of professional: doctors, lawyers and accountants are also professionals who have obligations to their profession before their obligation to their employer.
The title 'software developer' is correct. We are not engineers and we are not professionals. Pretending otherwise is a grasp for unearned status.
I believe software developers don't have any kind of paperwork to be considered professionals. Professionalism is a kind of attitude to begin with and can be tied to your conscience and moral compass.
Any paperwork certifying this is just a label and external anchor. In essence, it starts from within.
Some skilled jobs are called professions, and this special label indicates that they are more than merely jobs. The professions involve a special set of services to society, and people who join these professions are expected to uphold high ethical standards in their professional conduct.
Although many professionals have their own private businesses, and many others are employed by businesses, these occupations are misunderstood if we equate them with businesses. In fact, the growing tendency to treat professional work as nothing more than a paid job is the source of many ethical challenges.
- The primary goal of a business is to make a profit for the owners, and it does this by providing some product or service to customers.
- The basic purpose of a profession is to provide a service to the community. Here, the primary goal is societal well-being, not profit.
This distinction reflects a higher level of responsibility, expertise, and commitment to maintaining standards that sets professions apart from other occupations.
That's what happens when you're as untrustworthy as Sam Altman. The chickens are coming home to roost now. Paul Graham even said he'll get whatever he wants by any means necessary, basically. And then his board turned on him.
Something I don't quite understand is why software developers are flocking to CC, a product of a company whose CEO literally said they will be out of a job because of it.
I am a software developer who believes LLMs will ruin the market for us by destroying a lot of jobs (not all, but more than enough to be devastating) and depressing wages, but I still have no real option but to use them to remain relevant for now.
That said, I don't have a horse in the race and have used all the available options and find it very easy to switch among them, so I'm not a specific booster of Claude Code or any other option.
ostrich, meet sand. I mean, have you tried CC? It's fun to build stuff with it. I think developers that don't use it (or something like it) will be out of a job, yes (unless they're very niche).
Like Jensen Huang said, the job of an engineer is to solve problems, not to write code. The code is a means to an end. This is certainly true for me as an engineer, and it's why I'm not worried about AI.
Because fear sells easily to those who do not know the future.
Dario constantly fearmongers to them, and 98% of software developers fall for it; meanwhile he needs to sell you access to his product.
The future is local LLMs, and Dario knows this is their main threat. Just like Cursor, they don't want to pay for Claude anymore and are going with local models.
It’s just because ChatGPT is worse than Claude as of today. If that flipped, which it probably will at some point, if only temporarily, OpenAI will be back on top. Unfortunately it has nothing to do with Altman’s morals.
What are you actually claiming? Your take on their looks is irrelevant, and I don't know if you're hinting at anything specific when you say "People don't know who Dario is yet".
Yes, and France has terms greater than life +70 for some works.
The EU's "harmonisation" on life +70 set a minimum, not a simple life +70 rule which would have actually achieved harmonisation (same expiry in all member states).
Correct. It's abusive to apply probabilistic AI models to real human individuals and penalize them for things they haven't even done yet with no recourse.
I hope you will remember this the next time your employer asks you to build an AI moderation, credit evaluation, or anti-fraud system that will harm far larger numbers of innocent people than one mean website.
There is certainly a future where this isn't the case. Learning how to use AI in your workflows will likely be a part of any serious dev's future, but being beholden to a data center does not seem to reflect reality. Consider all the 5m-8m models and how powerful they are today compared to what the best models did 2 years ago. If you want to stay absolute bleeding edge model-wise, sure, you'll be stuck at a data center for some time...
Why isn't this just kinda seen as a repeat of the original birth of computers? Consider the IBM 350 (3.5 MB), rented in the '50s for thousands per month. Now I have a drawer filled with SD cards that go up to 128 GB that I can't even give away.
> any company ingesting that much cash needs to justify its capacity to survive.
What, why? There are tons of low-margin, capex-intensive businesses out there.
I think AI will end up being like hosting. All the models will converge on being pretty decent, and the companies will have to compete on efficiency since they are selling a generic commodity.
You can already see Anthropic fears this scenario since they try so hard to make people use their first-party tools rather than plugging Claude in as a generic part of a third-party stack.