This article claims to be something every developer must know, but it's a discussion of how GPUs are used in AI. Most developers are not AI developers, nor do they interact with AI or use GPUs directly. Not to mention that the article barely mentions 3D graphics at all, the reason GPUs exist in the first place.
One can benefit from knowing the fundamentals of an adjacent field, especially something as broadly applicable as machine learning.
- You might want to use some ML in the project you are assigned next month
- It can help when collaborating with someone who tackles that aspect of a project
- Fundamental knowledge helps you understand the "AI" stuff being marketed to your manager
The "I don't need this adjacent field" mentality feels familiar from schools I went to: first I did system administration where my classmates didn't care about programming because they felt like they didn't understand it anyway and they would never need it (scripting, anyone?); then I switched to a software development school where, guess what, the kids couldn't care about networking and they'd never need it anyway. I don't understand it, to me it's both interesting, but more practically: fast-forward five years and the term devops became popular in job ads.
The article is 1500 words at a rough count. Average reading speed is 250 wpm, but for studying something, let's assume half of that: 1500/125 = 12 minutes of your time. Perhaps you toy around with it a little, run the code samples, and spend two hours learning. That's not a huge time investment. Assuming this is a good starting guide in the first place.
The objection isn't to the notion that "One can benefit from knowing fundamentals of an adjacent field". It's that this is "The bare minimum every developer must know". That's a much, much stronger claim.
I've come to see this sort of clickbait headline as playing on the prevalence of imposter-syndrome insecurity among devs, and I try to ignore such headlines on general principle.
Fair enough! I can kind of see the point that, if every developer knew some basics, it would help them make good decisions about their own projects, even if the answer is "no, this doesn't need ML". On the other hand, you're of course right that if you don't use ML, then it's clearly not something you "must" know to do your job well.
I remember joining a startup after working for a traditional embedded shop, and a colleague making (friendly) fun of me for not knowing how to use curl to post a JSON request. I've learned a lot since then about backend, frontend, and infrastructure despite still being an embedded developer. It seems likely that people all around the industry will be in a similar position with AI in the coming years.
Most AI work will just be APIs provided by your cloud provider in less than two years. Understanding what's going on under the hood isn't going to be that common; maybe the AI equivalent of "use EXPLAIN ANALYZE, optimize your indexes" will be what passes for an AI expert (engineering, not scientist) around that time.
Most things provided by your cloud provider are just slightly modified, pre-packaged versions of software you can run yourself anyway. Postgres on EC2 is a perfectly viable alternative to whatever Amazon offers.
Not to mention that their passing example of Mandelbrot set rendering only gets a 10x speedup, despite being the absolute poster child of FLOP-limited computation.
You would expect at least 1000x, and that's probably where it would be if they didn't include JIT compile time in their timing. Mandelbrot sets are a perfect example of a calculation a GPU is good at.
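To make the JIT point concrete, here's a minimal timing sketch in JAX (not the article's code; I'm assuming a JAX-style JIT here, the article may well use Numba or something else). The first call includes the one-off compilation; the second call measures only the computation, which is the number that should be compared against a CPU run.

    # Minimal sketch, not the article's code: a Mandelbrot escape-count kernel,
    # timed so that one-off JIT compilation is separated from the computation itself.
    import time
    import jax
    import jax.numpy as jnp

    @jax.jit
    def mandelbrot(c):
        """Per-pixel escape counts for 200 iterations of z = z^2 + c."""
        def body(_, carry):
            z, count = carry
            z = z * z + c
            count = count + (jnp.abs(z) <= 2.0)  # still bounded -> keep counting
            return z, count
        z0 = jnp.zeros_like(c)
        count0 = jnp.zeros(c.shape, dtype=jnp.int32)
        _, count = jax.lax.fori_loop(0, 200, body, (z0, count0))
        return count

    # A 2000x2000 grid of complex starting points.
    x = jnp.linspace(-2.0, 1.0, 2000)
    y = jnp.linspace(-1.5, 1.5, 2000)
    c = x[None, :] + 1j * y[:, None]

    t0 = time.perf_counter()
    mandelbrot(c).block_until_ready()  # first call: JIT compilation + compute
    t1 = time.perf_counter()
    mandelbrot(c).block_until_ready()  # second call: compute only
    t2 = time.perf_counter()
    print(f"first call (incl. compile): {t1 - t0:.3f}s, second call: {t2 - t1:.3f}s")

On a machine without a supported GPU this falls back to the CPU, so the absolute numbers will differ; the point is just that the compile cost shows up once, in the first call, and shouldn't be counted against the GPU.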
Yeah, a lot of the assumptions made are inaccurate.
I agree that most developers are not AI developers... OP seems to be a bit out of touch with the general developer population and is otherwise assuming the world around them matches their own perception.
I've noticed that every time I see an article claiming that its subject is something "every developer must know", that claim is false. Maybe there are articles which contain information that everyone must know, but all I encounter is clickbait.
Understanding how hardware is used is very beneficial for programmers.
Lots of programmers started out with an understanding of what happens physically on the hardware when code runs, and at times it's an unfair advantage when debugging.
> Understanding how hardware is used is very beneficial for programmers
I agree, but to say that all developers must know how AI benefits from GPUs is a different claim, and a false one. I would say most developers don't even understand how the CPU works, let alone modern CPU features like data/instruction caches, SIMD instructions, and branch prediction.
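For what it's worth, the cache part at least is easy to demonstrate without going anywhere near a GPU. A rough NumPy sketch (numbers obviously vary by machine): summing a large array contiguously versus with a stride of 16 floats touches far fewer elements in the strided case, yet costs much more per element, because each strided load pulls in a mostly wasted cache line and can't be vectorized with SIMD.

    # Rough sketch: per-element cost of contiguous vs. strided access over a
    # ~256 MB array (much larger than any CPU cache). Numbers vary by machine.
    import time
    import numpy as np

    a = np.ones(64_000_000, dtype=np.float32)  # ~256 MB of data

    def per_element_ns(view, reps=5):
        """Best-of-`reps` time per summed element, in nanoseconds."""
        best = float("inf")
        for _ in range(reps):
            t0 = time.perf_counter()
            view.sum()
            best = min(best, time.perf_counter() - t0)
        return best / view.size * 1e9

    print(f"contiguous: {per_element_ns(a):.2f} ns/element")       # streams whole cache lines, SIMD-friendly
    print(f"stride 16:  {per_element_ns(a[::16]):.2f} ns/element") # roughly one 64-byte cache line fetched per element used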
Most developers I encounter learned JavaScript and make websites.
And honestly, for most "AI developers": if you are training your own model these days (versus using an already-trained one), you are probably doing it wrong.
Don't worry, you'll either be an AI developer or unemployed within 5 years. This is indeed important for you, whether you recognize it yet or not.