
The standard deviation is a poor example IMO, in many languages you can get much closer to mathematical notation.
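
For reference, the formula everyone is transcribing is the population standard deviation (note that all the snippets below divide by N, not N-1):

    \sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i - \mu)^2},
    \qquad \mu = \frac{1}{N}\sum_{i=1}^{N} x_i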

    from math import sqrt  # sqrt isn't a builtin

    def stddev(x):
        avg = sum(x) / len(x)  # arithmetic mean
        return sqrt(sum((xi - avg) ** 2 for xi in x) / len(x))
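
And if NumPy is available (an assumption - the snippet above sticks to the standard library), the Python version gets closer still to the notation; `np.std` computes the same population form by default (ddof=0):

    import numpy as np

    def stddev(x):
        x = np.asarray(x, dtype=float)
        return np.sqrt(np.mean((x - x.mean()) ** 2))  # same as np.std(x)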

    stddev xs = let n   = fromIntegral (length xs)  -- length returns an Int; convert before dividing
                    avg = sum xs / n
                in sqrt $ sum [(x - avg) ** 2 | x <- xs] / n


It's even a poor example of C++. Using valarray, you end up with basically the same thing as your above examples:

    #include <valarray>
    #include <iostream>
    #include <cmath>  // std::sqrt

    double standard_dev(const std::valarray<double> &vals)
    {
        // 2.0 rather than 2: the valarray pow overload is templated on the element type
        return std::sqrt(std::pow(vals - (vals.sum() / vals.size()), 2.0).sum() / vals.size());
    }

    int main()
    {
        std::cout << standard_dev({2, 4, 4, 4, 5, 5, 7, 8}) << '\n';
    }
…and none of those are really much less readable than the math version. All in all, that "example" clearly wasn't made in good faith, and left a bad taste in my mouth.


I think it was a poor choice of example anyway - the code solution is superior because it is far more descriptive of what is happening. The math notation requires existing knowledge; without it, you're basically screwed trying to understand what the hell it does. With code you can search for terms and see what they do, or better yet, have an intelligent editor provide hyperlinks to definitions.

My personal opinion is that rather than trying to make programming more like math, we should make math more like programming - such that we stop assuming the reader has some magical knowledge needed to understand it.



