
Yeah, it's one of those handy hacks. I like these because they let a change to the input data substitute for a different algorithm/formulation. Transforming the input is often easier to do under a deadline than deploying a well-tested new algorithm.

All that said, the typical regularized version of linear regression with the 'append 1' trick is no longer equivalent to the affine version one may have in mind: a typical implementation of regularized linear regression will also penalize the weight corresponding to the appended dimension, shrinking the intercept toward zero. Unless, of course, special care is taken to exempt the appended dimension from regularization.
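A minimal sketch of the difference, using closed-form ridge regression on toy data (the data and the `lam` value are made up for illustration). The naive version penalizes every entry of the weight vector, including the appended bias; the careful version zeroes out the penalty on that one dimension:

```python
import numpy as np

# Toy data with a large intercept: y = 2*x + 10, no noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(50, 1))
y = 2 * x[:, 0] + 10

# 'Append 1' trick: a constant column turns the intercept into a weight
X = np.hstack([x, np.ones((x.shape[0], 1))])

lam = 10.0  # deliberately strong regularization to make the effect visible

# Naive ridge: penalizes every weight, including the appended bias term
I_full = np.eye(X.shape[1])
w_naive = np.linalg.solve(X.T @ X + lam * I_full, X.T @ y)

# Careful ridge: zero penalty on the appended (last) dimension
I_nobias = np.eye(X.shape[1])
I_nobias[-1, -1] = 0.0
w_fixed = np.linalg.solve(X.T @ X + lam * I_nobias, X.T @ y)

print(w_naive)  # intercept estimate shrunk noticeably below 10
print(w_fixed)  # intercept estimate close to the true value of 10
```

The slope is still regularized in both versions; only the intercept's treatment differs, which is exactly the discrepancy between the 'append 1' formulation and the affine model one may have intended.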
