I disagree about the ML Class...I'm getting a lot more out of this approach, given my limited time budget, than I did trying to follow along with the 2008-era lectures, which were very proof-heavy. (Those are still out there if you feel the need for something deeper.)
Why? Because the emphasis seems to be on mastering the material and the intuition behind what Ng is teaching.
Plus, I can't say enough about the way in which they've managed to improve the virtual learning experience:
* Video lectures broken into discrete 5-to-14-minute units, with interactive review questions built in and slides annotated using Khan-style software.
* Slide decks that reflect the annotations made during the lectures.
* Adaptive review questions with some answer randomization and time delays to prevent random guessing, along with encouragement to retake them later as review (with your highest score counted).
* Programming exercises that are manageable (and which scale to the 60K-odd participants in the class).
* Consistent use and explanation of terminology.
* Great discussion forums.
The net effect is that you are strongly encouraged to go at a pace that ensures mastery of the topics and fundamentals. Yes, the first couple of weeks were designed to refresh people on linear algebra, get them comfortable with Octave, and walk them through the programming exercise submission process. I have been incredibly impressed by the amount of design work that has gone into this class experience, and by the depth of understanding I've come away with compared to trying to barrel through the original recorded CS229 lectures, which focus almost exclusively on proofs. I'm glad I went through those lectures, but this is a far superior approach to learning the fundamentals of machine learning, and I wish every class I took were created with as much care. There's a model here for a way to scale learning of complex topics.
If you want more of a challenge, I would suggest working through the original CS229 materials. They focus much more on the proofs, and there is some practical material there as well on debugging ML applications. I believe he'll introduce more material as this course progresses. As he said after the second or third week of lectures, a lot of what "real world" ML people do in industry centers on what the first 3-4 weeks covered--linear regression and logistic regression--and I already see plenty of applications of what I've learned in my daily work.
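For anyone curious what those early programming exercises feel like: the class itself uses Octave, but here's a rough sketch of my own (not taken from the course materials) of the kind of batch gradient descent loop for linear regression that the first few weeks walk you through, translated into Python/NumPy.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, iters=1500):
    """Fit linear regression by batch gradient descent (my sketch, not the course's code)."""
    m = len(y)
    X = np.column_stack([np.ones(m), X])      # add intercept term
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        error = X @ theta - y                 # predictions minus targets
        theta -= (alpha / m) * (X.T @ error)  # simultaneous update of all parameters
    return theta

# Toy usage: recover y = 2x + 1 from noisy samples
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2 * x + 1 + rng.normal(0, 0.5, 100)
print(gradient_descent(x, y))  # roughly [1.0, 2.0]
```

The actual exercises add things like feature normalization and plotting the cost over iterations, but the core update really is about this small, which is part of why the assignments stay manageable.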