Is College Obsolete? (alexkrupp.typepad.com)
8 points by Alex3917 on April 15, 2009 | hide | past | favorite | 21 comments


Ever notice how it's never the physicists, petroleum engineers, mathematicians or organic chemists making these claims? The "Formal Education Considered Harmful" meme seems to come solely from coders and the occasional entrepreneur. Which to me says something more about the state of software engineering than about college.


Almost all of the occupations you list require graduate degrees to practice professionally; two of them are primarily practiced in academia. Most of them are hard-science degrees, whereas CS is at least partially a liberal art. (As far as I know, the notable exceptions, where CS-related degrees are granted as a BS, are at the most well-regarded colleges for CS.)

Although the average CS program is pretty terrible, the underlying differentiating factor is that doing practical CS work is cheap and easy (the same thing that motivated internet entrepreneurship); additionally a large part of the literature is directed at 'lay practitioners'. Doing (advanced) physics or chemistry is neither cheap nor easy, and it's pretty hard (impossible, AFAIK) to do pure professional mathematics outside an academic setting.


I'd have to agree with the basic premise of the blog post.

http://learninfreedom.org/School_obsolete.html

There are a lot of institutional protections in the legal system for colleges (as sources of mandatory credentials for many occupations), and a great deal of preferential funding of colleges. But many learners can learn as much or more outside of college as in it, and as long as learners can find gainful employment without college credentials, colleges will be under pressure to justify their continued existence.

After edit:

pg's thoughts on a closely related subject:

http://paulgraham.com/credentials.html


Interesting posts. I actually wrote this over three years ago, but somehow never got around to actually publishing it on my blog. It seems kind of cliche now, but I thought it was well-written enough to be worth posting.


NO! It's thousands of intellectual and social bonding experiences happening among the minds that will shape the next two generations. The density of the network means that you can find collaborators easily, and that's the benefit of college.

College might be an onion in the varnish, so to speak, but when you talk about "I went to Harvard", you're not talking about visiting the buildings, but being part of the great intellectual conversation, and that's the key point.


About 2,000 students per year are admitted to Harvard (of whom 1,600 or so enroll). There are a few other colleges that offer Harvard-level quality of intellectual conversation. Some plainly do not. At what level of college does the return on attending college for intellectual conversation not meet the cost of attendance?

And why can't people have great intellectual conversations without attending college, as some people do here on HN?


I just graduated from college in June 2008 with a BSBA in Marketing. I am now the lead UI designer for a major Rails site. Based on just that, yes, it's obsolete. I am not using my degree in any official capacity and it cost $60,000.

But while I found it easy to write off my degree and what I learned in college when I graduated as unnecessary, at least initially, I have found myself recalling lessons I learned in class more than a few times.

1. A lot of hackers and tech people dismiss business and marketing as the boring part of running a company. It's not; it's actually one of the hardest parts. It's easy to believe that "if you build it, they will come," but most successful software is the result of good marketing and good business sense. I learned a lot about this in school, and I am glad I learned the basics where mistakes didn't cost real money.

2. While it may not have been necessary to attend school to read and discuss cases, it helped a lot. Working through marketing problems and understanding business with a group of like-minded people helped it all click faster. Much like discussing technology issues on Hacker News can help make sense of everything faster.


College is not obsolete. High school is definitely obsolete. My guess is at least 8/10 people posting here had miserable high school experiences. We spend 4 valuable years marching to and from classrooms, sitting all day in a daze just to be momentarily jolted awake by an obnoxious bell. I think the high school experience has massive room for improvement (has it really changed in the last 50 years?).


Why the big fuss over college anyway? I didn't go to college, but I've always wanted to (I just kind of jumped into a career and it hasn't slowed down enough yet to do so).

People learn differently; let's just accept that college is a good environment for people who are not the most motivated or skilled self-learners.

Obviously when you're talking about the more elite and hard-science fields, it's definitely something you'd want as a hiring credential. My uncle is an engineer working in a nuclear power plant; I'm pretty sure it'd be a bad idea to hire him if he didn't have the engineering degree to show his dedication and skill (not saying you can't be that good without the credentials, but there isn't a large room for error in that position; better prepared and safe than a radioactive toxic avenger).

Edit: It's all a personal choice in the end anyway, pretty subjective topic wouldn't you think?


[deleted]


as an educational platform it's nowhere near dead

That's an interesting point of view. To refer specifically to the example you gave, what is it about computer science that can best be learned (only be learned?) in college? Is showing that lack of college study of computer science is correlated with lack of basic programming knowledge (which I would believe, for the sake of discussion) the same as showing that lack of college study of CS will CAUSE, inevitably, lack of basic programming knowledge? Can't anyone who desires to improve programming knowledge self-educate through multiple channels in today's world?

After edit: It appears that the reply to which I am replying has now been deleted. I'll open up the questions to everyone else reading the thread.


We're constantly reminded of the people who are productive coders without a college degree so the counter-example is well established. I don't think you'll get anyone to go along with "lack of college study of CS will CAUSE, inevitably, lack of basic programming knowledge".

In the general case, however, a majority of people who start college degrees don't have programming experience (there are exceptions, again, such as MIT, but those colleges put out a very small percentage of graduates). Surely you wouldn't want someone like the person I was before college to be coding the software you do your online banking with: just a regular guy who chatted a lot online, sometimes played video games (but not much), had written some mIRC scripts and a web page (static HTML with Dreamweaver), had installed Linux a few times but didn't like it much, could build a computer from parts bought separately, did most of the extended family's tech support, and took a VB class in high school (your average pre-college geek).

The great benefit of college is to show you just how much there is available to know. The field is huge. College orients you in every direction and, if you let it, guides you a few steps down the path in every way. Which path(s) you ultimately choose to follow are up to you but at least you know they're there, have a solid grounding to work on and know how to go about getting to your destination. Some people can get there on their own. Most wouldn't have a clue where to start.


The problem is that the value of certain computer science concepts, such as big-Oh notation, may not be immediately obvious for someone self-educating. They can skip over these mathematical/CS concepts in favor of more 'practical' software engineering study. You don't need to know lambda calculus or even basic sorting algorithms to write business logic apps -- and so these things get ignored.
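A toy illustration of the point (the function names and data are mine, not from the thread): two implementations can be equally correct, and without the big-Oh vocabulary it isn't obvious that one is quadratic and the other linear.

```python
def dedupe_quadratic(items):
    """Remove duplicates, preserving order. Correct, but `x not in seen`
    scans a list: O(n) per check, O(n^2) overall."""
    seen, out = [], []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedupe_linear(items):
    """Same result, but set membership is O(1) on average, O(n) overall."""
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both return the same answer on any input; the difference only shows up as the input grows, which is exactly the kind of thing a self-learner focused on "working code" can miss.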


These kinds of claims are common; G Harry Stine in his "The Hopeful Future" wrote, "A self-taught person is usually deficient in one or more areas of his learning expertise." There are several problems with the claim, but the most important is that if a self-taught person discovers he or she needs to know something that they missed earlier, they simply need to go find it out, since they have avoided becoming dependent on others to tell them what to learn and when.


I love how big-Oh notation is the default example of what you need college for in every single Reddit/HN discussion of education.


It's happened more than once that I've seen (and had to explain) why putting while loops inside while loops (I once saw a code snippet with 4 nested whiles!) is going to run fine on your dev box with 5 rows in the database but is going to be excruciatingly slow if you move it to production and data starts pouring in.

Granted, in some of the cases the person did have a college degree and should've known how big-Oh applied here but must've been asleep in class.
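A minimal sketch of the nested-loop pattern being described (the table names and fields are invented for illustration): rescanning one "table" inside a loop over another is O(n*m), while building an index first makes the join linear.

```python
# Toy in-memory "tables"; in the real story these would be database rows.
customers = [{"id": i, "name": f"c{i}"} for i in range(5)]
orders = [{"customer_id": i % 5, "total": 10 * i} for i in range(8)]

# Quadratic: rescan every customer for every order. Fine with 5 rows,
# excruciating once data pours in.
joined_slow = []
for o in orders:
    for c in customers:
        if c["id"] == o["customer_id"]:
            joined_slow.append((c["name"], o["total"]))

# Linear: build an index once, then each lookup is O(1) on average.
by_id = {c["id"]: c for c in customers}
joined_fast = [(by_id[o["customer_id"]]["name"], o["total"]) for o in orders]

assert joined_slow == joined_fast
```

Both versions produce identical output on the dev box; only the growth rate differs, which is the big-Oh lesson in question.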


in some of the cases the person did have a college degree and should've known how big-Oh applied

This would serve as a counterexample to the idea that college is uniquely suitable for learning some important subjects, right?


Technically, no, it wouldn't. Not as you stated it. College being uniquely suitable for learning a subject does not ensure that someone will learn it if they go to college.

You can say that to learn x you must go to college. If you went to college, though, the fact that you did or did not learn x says nothing about whether you need college to learn it.

You get drilled with exactly this type of logic problem a LOT in discrete math courses. Stuff there's no way the extreme vast majority of working programmers would study on their own just because.

That's being pedantic, though; I imagine you're saying that it doesn't matter whether you go to college or not, since you can make the same kind of mistakes when you get out. The latter part of that is true. But there's no doubt in my mind that the guy who did go to college would be a lot worse off had he not gone. He'd most probably not be able to write code that worked at all, rather than code that works but not as optimally as it could.


Agreeing with you that many "theoretical" topics are more practically important than learners of many domains realize at first, why should I be convinced that college is a uniquely good environment for thoroughly learning important subjects such as lambda calculus or analysis of algorithms?


"Things got bad when credentialism surpassed education as the primary function of college. If you want evidence of this, consider the criteria for the US News & World Report college rankings: Peer assessment, student selectivity, faculty resources, graduation and retention rate, financial resources, alumni giving, and graduation rate performance. Notice anything missing?

Are colleges not ranked in order of how much students learn because that would be impossible to measure? Or is it because no one cares?"

if you think about it, this doesn't make sense. the us news college rankings are used by high school students and their families to select colleges to attend, not by (most, i assert) employers to select employees to hire. they list credentials of colleges, not college graduates. so they are not evidence that "credentialism" is a function of college at all.

employers care about the difference between schools in regard to what their students learn---and these differences surely exist---and they assess schools with this criterion using empirical data they gather from interviewing and hiring people.


"the us news college rankings are used by high school students and their families to select colleges to attend"

That was my argument. Students don't choose their college based on how much they think they will learn. And so colleges have no incentive to make sure they do.

No one cares if students aren't learning, not the students, not the colleges, because learning is no longer the point.


i don't believe you have shown that how much students think they will learn doesn't play a role in their college choices. they choose based on the information they have available; the criteria you mentioned contribute to an idea of how effectively schools can teach students. alumni giving/financial resources, for example, allow schools to buy better faculty and equipment, which make students learn better. graduation and retention rates are evidence of a decent educational environment.

sorry to directly contradict you on this, but i also don't believe this was your original point. you were trying to show that a degree serves mainly as a credential. students may select schools on the basis of their rankings, but it doesn't follow that the rankings are the motivation behind their selections. to use an analogy, i may use reviewer ratings to select a computer to purchase; it doesn't follow that i select brands _for_ the prestige of owning a computer rated 5 stars by cnet.



