Hi! I'm the director of Georgia Tech's MS in Analytics program (both on-campus and online).
GT's MS in Analytics degree is actually designed specifically for people who are going to go out and work in the analytics field -- it's not a pre-PhD degree, and our courses are targeted primarily at people who want to learn and apply analytics. We have an industry advisory board that helps us target course and program content, and we're constantly working to make sure our coursework is focused on the right cohort. We even have a required applied analytics practicum (both for on-campus and online students) where our students work on analytics projects for a wide range of companies and organizations.
Perhaps other degrees are different, but the MS Analytics is a very practice-focused degree.
I may be an educational purist, but I cringe when I hear universities boast about the "practicality" of the degrees they offer. You get a degree to prove you can learn. The courses should be heavy on theory and concepts. If you teach those well enough, ideally your students should be able to easily pick up whatever flavor-of-the-month development stack or tool is out there and roll with it. I wish we could reverse this trend, but it just seems to be too much good PR to say "hey everyone! come to our school and you are guaranteed to get a job!"
Certainly with the rise of sham/for-profit universities, sales pitches promoting 'practicality' now raise red flags, and deservedly so. But if the role of 'higher education' is to be a practical one (as engineering programs have always been), it only makes sense for schools to ask industry what it needs and then serve those ends, first and foremost.
In general, while theory has great value, it's more as a stepping stone to higher study than as an end unto itself. Few computing pros submit proofs among their deliverables, and devising the theta bound on a function or resolving the terms of a CSP simply doesn't deliver much value unless you're working in a PhD-level R&D lab and writing peer-reviewed papers.
I believe there's a great deal of value in applied, non-PhD-track academic programs like GT's online discount offerings, especially in serving professionals and employers. I also believe it's high time that universities clued in to the unmet need most of us post-academics have for help continuously re-educating ourselves as we progress through our careers. Few of us pros can return to campus, even part-time. Distance learning meets a crying need, and when it's done right and priced right (as I believe GT does), I have nothing but kudos to offer in return. More power to GT's authors, curators, and administrators who made this possible. And to all who keep this greatly empowering service running: thanks, and keep up the good work.
I disagree. The role of a university CS degree is to bridge the gap between high school student and software development professional. That's going to include some theory but a lot of hands-on experience with modern development tools. It should include a healthy amount of group work and tons of coding projects.
If you want to play around with theoretical computer science, get your PhD. College educations are too expensive not to be eminently practical.
I disagree with this sentiment based on my own experience. I did great in my BS CS program from a highly ranked program, but was woefully underprepared for industry and quite frankly a bad software engineer. Graduates from traditional programs often leave with next to no experience with testing, version control, team structure/process, newer languages, frameworks/3rd party packages, etc, and my experience in industry is that it's a roll of the dice whether your company, team, etc are interested in teaching you or waiting for you to learn. The only people I know who graduated with those skills are people who either had amazing mentors or were natural hackers in their spare time. If I could re-design my education, it would be 2-3 years of theory and then 1-2 years of applied liberal arts education before starting an actual career.
Graduates from traditional programs often leave with next to no experience with testing, version control, team structure/process, newer languages, frameworks/3rd party packages, etc, and my experience in industry is that it's a roll of the dice whether your company, team, etc are interested in teaching you or waiting for you to learn.
It's a waste of time to teach industry tools at a university. It's much more valuable to be taught fundamentals. Know your fundamentals well and any new tech will be much easier to learn. It's long-term thinking - put in the investment to make sure you can change skillsets in the future.
All the things you mentioned tend to be ephemeral and change a lot within a few years. Look at the git monoculture that's sprung up in the last 5 years for example - 10 years ago it might have been reasonable to teach SVN.
And if you learned SVN, you would have had a solid base for understanding Git. Would you expect students to learn source code control in the abstract, or not at all?
You have to do programming assignments anyway. Why wouldn't you require students to learn and use the latest source code control tools while they're doing their development?
Teach students to write tests, use source code control, utilize continuous integration, etc.
Although the specific tools, languages, and approaches will evolve in the coming years - none of the above are going away soon.
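To make that concrete, here's a minimal sketch of the habit being described, using pytest as the runner; the function under test (word_count) is a made-up stand-in for a typical course assignment, not anything from a real curriculum:

    # A tiny assignment-style function plus assert-based tests.
    # pytest (or any assert library) discovers and runs the test_* functions.

    def word_count(text: str) -> int:
        """Count whitespace-separated words in a string."""
        return len(text.split())

    def test_empty_string_has_no_words():
        assert word_count("") == 0

    def test_counts_whitespace_separated_words():
        assert word_count("teach students to write tests") == 5

Check that into version control and have a CI service run it on every push, and even an exercise this small walks students through the whole test/commit/build loop.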
vi isn't very intuitive either, but there's no better way to learn something difficult than to learn it when you're young. I've been using vi professionally for 30 years thanks to my early GT classes. It's probably the single-most valuable skill that I learned there that I still use today.
I'm a vi(m) user, but I have to say - it's not a fundamental part of computing at all. It's just a very popular tool. A lot of people don't know how to use it and manage to make amazing things.
there's no better way to learn something difficult than to learn it when you're young
Hmm, define 'young'. I'm in my late 20s and I find it easier to learn new things more than ever - including things I failed to learn in my teens and early 20s. Maybe I'm just a late bloomer, and it took me a while to "learn how to learn". But maybe I'm still young in the eyes of someone who has been using vi for 30 years (:
If you know Graph theory then you know git; all that remains is just reading the man page for specific commands. Intro to Graphs/Graph Theory is generally in the curriculum at all university compsci departments.
Testing etc. is usually covered in all intro classes (assert libraries), or industry-style testing like JUnit is covered by a software engineering elective, typically taught in Java.
If you know Graph theory then you know git; all that remains is just reading the man page for specific commands.
Just because one of git's key abstractions is based on a kind of graph, I don't think it follows that knowing graph theory means you know git. I mean, LISP is based on a graph structure as well, but plenty of people find that confusing.
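To be fair, the commit history really is a directed acyclic graph, and that part is easy to model. Here's a toy sketch in Python (invented commit IDs, no actual git involved) of roughly the ancestor walk that git log performs:

    # Toy commit graph: each commit maps to its parent commit(s).
    # Commit IDs are invented; this ignores trees, blobs, refs, the index,
    # and everything else that makes real git confusing.
    commits = {
        "a1": [],            # root commit
        "b2": ["a1"],
        "c3": ["b2"],
        "d4": ["b2"],        # branch off b2
        "e5": ["c3", "d4"],  # merge commit with two parents
    }

    def log(start):
        """Visit the ancestors of `start`, roughly what `git log <start>` walks."""
        seen, stack, order = set(), [start], []
        while stack:
            commit = stack.pop()
            if commit in seen:
                continue
            seen.add(commit)
            order.append(commit)
            stack.extend(commits[commit])
        return order

    print(log("e5"))  # ['e5', 'd4', 'b2', 'a1', 'c3'] -- one valid ancestor order

But the graph is not what trips people up: the staging area, refs, rebase semantics, and the command-line surface are, and graph theory doesn't give you any of that.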
Exactly - in 2009 we were checking in Java assignments as ZIP files that we had validated with print statements. It wouldn't have been that much more work to structure the algorithm assignments in that class in a way more similar to industry workflows, even if those workflows are an evolving target.
Fair enough, and I should have been a bit clearer in my original post.
In my experience with the MSCS program (nearly ten years ago at this point) the core required classes were mostly well structured and would serve people well continuing onto a PhD or growing their skill set for industry. The core constituted a relatively small chunk of the overall credits required, though, and the elective courses tended to be more along the lines of what I described.
I'm glad to hear that the Analytics program has a more dedicated focus on practical matters. It might be interesting to produce a series of similar (but narrower) curricula that amount to curated collections of CS classes making up degrees in Machine Learning, Systems Programming, etc.
I personally really enjoyed my dartboard-oriented approach to class registration. I learned more than I've ever needed to know about approximation algorithms, cryptographic theory, and compilers. Even if much of what I learned there hasn't proven itself directly useful yet, I really enjoyed learning it for learning's sake, and I think I'd have had a hard time picking up some of those gems since. I also still have a hobby of proving problems NP-complete on demand as a bit of a parlor trick (within the limited scope of problems for which you can apply the small handful of patterns I've burned into my brain over the years :).
I asked this question in a number of places a couple years ago and the answer is basically no.
I did, however, find that my undergraduate university had a great program for people with nearly complete degrees who had been away for a few years.
I'll be finishing undergrad this May and am now looking at grad schools. Feel free to contact me if you want to chat about this because it's been surprisingly hard to find info or advice in our situation.
Who can apply to the OMS CS degree program?
Admission into the OMS CS program will require a Bachelor of Science degree in computer science from an accredited institution, or a related Bachelor of Science degree with a possible need to take and pass remedial courses. Georgia Tech will handle the degree admissions process. For more information please visit the Georgia Tech program page.
I got into analytics while using the quant investment site Quantopian.
Mostly you use Python (NumPy and SciPy) to analyze a large time-series data set (stock market) to predict pricing while keeping a low correlation to the overall market movement.
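To give a flavor of what that looks like, here's a minimal sketch of the "low correlation to the market" check, using NumPy on synthetic daily returns (not my actual strategy and no Quantopian code):

    # Synthetic daily returns for roughly one trading year; the strategy is
    # built with only a small market component, so correlation/beta come out low.
    import numpy as np

    rng = np.random.default_rng(0)
    market_returns = rng.normal(0.0005, 0.01, 252)
    strategy_returns = 0.1 * market_returns + rng.normal(0.0008, 0.01, 252)

    # Pearson correlation between strategy and market returns
    corr = np.corrcoef(strategy_returns, market_returns)[0, 1]

    # Beta: the strategy's sensitivity to market moves
    beta = (np.cov(strategy_returns, market_returns)[0, 1]
            / np.var(market_returns, ddof=1))

    print(f"correlation to market: {corr:.2f}, beta: {beta:.2f}")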
I had some success and won their 6 month contest, but I still feel like a bit of a hack. I'd like to move into the financial quantitative analysis industry.
Would you say this GT program would be a good stepping stone?
In some ways. There's a class called ML For Trading that's very fun and like an intro to computational trading. The professor runs a company in that space.