My first semester at uni doing a CS/math degree, I had a class called Introduction to Unix, which taught everybody bash/awk/sed scripting and how to use the CLI to do stuff on Linux. It was great.
It also had a really high failure rate, like 60%ish.
> It also had a really high failure rate, like 60%ish.
I'm always surprised by how many people struggle with using basic Linux CLI utilities and navigating common consumer UIs. More so when they're great at understanding low-level programming languages and concepts, something I struggle with heavily.
I always assumed IT/cyber was CS-lite for brainlets, that the CS guys could do everything we could and more. I suppose in actuality it's simply having different talents and interests.
IT is easier than CS, but it's a huge amount of knowledge. It just takes time to learn. Enormous amounts of time. Even more practice to be fluent and fluid.
- Computer architecture (how do you make a syscall to the kernel? How do page faults happen? Branch prediction? Plenty of recent major vulnerabilities involve stuff like this; there's a minimal syscall sketch after this list)
- OS architecture - blends in with the above - how does process scheduling work?
- Networking protocols/algorithms and the underlying queueing theory (e.g. how does BGP work, what are its failure modes, what's the math behind TCP congestion control, exponential backoff (also sketched below), complex multicast architecture, blah blah blah). Networking can be genuinely hard if you push it hard (and you often do, when you are trying to make money).
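To make the computer architecture bullet concrete, here's a minimal sketch (my own illustration, not something from the course or this thread) of what "making a syscall to the kernel" means on Linux, using glibc's generic syscall(2) wrapper to call write directly instead of going through printf:

```c
/* Minimal illustration of a direct Linux syscall: ask the kernel to write
 * bytes to stdout via glibc's generic syscall(2) wrapper, skipping printf. */
#define _GNU_SOURCE
#include <unistd.h>
#include <sys/syscall.h>

int main(void) {
    const char msg[] = "hello from a raw write syscall\n";
    /* SYS_write is the kernel's syscall number for write(2); the arguments
     * are (fd, buffer, byte count), same as the libc wrapper. */
    long written = syscall(SYS_write, 1, msg, sizeof msg - 1);
    return written < 0 ? 1 : 0;
}
```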
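And for the networking bullet, a rough sketch (again just an illustration, with made-up constants) of exponential backoff with jitter, the retry pattern underlying TCP retransmission timers and most sane retry loops: double the wait after each failure, cap it, and randomize so clients don't all retry in lockstep.

```c
/* Rough sketch of capped exponential backoff with "full jitter":
 * the backoff window doubles per attempt, and the actual wait is a
 * random value within that window. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    const double base_ms = 100.0;   /* initial backoff window        */
    const double cap_ms  = 8000.0;  /* upper bound on the window     */
    double delay = base_ms;
    srand((unsigned)time(NULL));

    for (int attempt = 1; attempt <= 6; attempt++) {
        /* Full jitter: pick a random wait in [0, delay). */
        double jittered = delay * ((double)rand() / RAND_MAX);
        printf("attempt %d: backoff window %.0f ms, chose %.0f ms\n",
               attempt, delay, jittered);
        delay = (delay * 2 > cap_ms) ? cap_ms : delay * 2;
    }
    return 0;
}
```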
Traditionally these have been harder to learn, since they were harder to encapsulate and virtualize. Unlike "platonic" coding, it was hard to get experience with these outside of a real job (and even if you can simulate it, there's a lot less motive for a hobbyist to set up a simulated anycast network than to write some cool web app or desktop tool), leading to a chicken-and-egg problem.
> More so when they're great at understanding low-level programming languages and concepts
This can happen with high-level programming languages and concepts, too!
A childhood friend of mine, now a mathematician, has been a brilliant programmer since we were kids, but never really a techie. Lately we're working through a series of functional programming textbooks together, and he's great at explaining foundations and relating the informal explanations of the book to formal type theory and category theory. (He has no trouble with the coding exercises, either, or in reasoning through the code of others in the book club.) But he's just more interested in math than in computers per se, so even though nowadays he works on software every day and is highly technical compared to the average person, there is still relevant intuition and knowledge, common among computer hobbyists, which he's missing because he hasn't put that time in.
Personally I see no surprise: classic systems were designed with a PRODUCING user in mind, the user at the center, with the system offering everything at his/her disposal. New systems are designed with CONSUMING users in mind; the system "drives" the user, who can only click around and "follow the UI".
Producing many things is not that hard, but people mentally trained to be mere "followers" from the start can't become producers easily. To switch the mental model you need YEARS, not a single course.
This issue has two centers:
- IT is not CS: astronomy studies the night sky (IT), not the telescopes (CS). Those who really know also understand the power of IT in a society, the power of INFORMATION, not the mere tool used to master it, so they don't much like to spread such knowledge. Just imagine a society where common people read news via a personal feed reader with collected history instead of some website/aggregator headlines: such people would easily spot most PR campaigns and, instead of being driven by "perceived crowd emotions", would be driven by "knowledge";
- those who sell IT services know that most of their revenue comes from user ignorance. If you start having technicians trained to produce, you have people who know and build on their own, not useful idiots driven by managers [1], and as a result systems for producing things instead of consuming them will spread, users will become a bit more educated, and their big business model will fall.
IMVHO that's why we (as a society) switched from the Xerox model to the IBM model, the Microsoft model, the GAFAM model: the political and economic interests of a few giants set against society. They succeeded, and the outcome is a more and more ignorant society, to the point where we can't keep up. China's success, doing the very same things we did in the past when we were the leaders, proves that well. Many still have not seen the fall, but here and there you can see parts of it.
Ha, this is great. I actually did an independent study course my senior year of college, where I went around to the freshman- and sophomore-level classes and took each one over for a day to teach relevant tooling and such, much like this. I had been working as a software developer for a small company since I started learning programming, so I had figured a lot of this out the hard way on my own, and was sort of pissed they didn’t teach this kind of stuff, so I would have absolutely loved a crash course like this. Kudos to the folks putting this up.
> The class is being run during MIT’s “Independent Activities Period” in January 2020 — a one-month semester that features shorter student-run classes.
Many students go home during this winter "semester", as Boston is cold and snowy, but the learning and bonding during this time can be invaluable.