Hacker News

I hit a wall in college where math just stopped being something I intuitively "got". I'm sure that given enough time and motivation, I could have continued being "good at math" even at the higher levels, but these were luxuries I did not have, given everything else on my plate at the time.

My biggest problem with math, especially once I got into academia, was how it was taught. So many professors would scribble what seemed like nonsense on the board (symbols that change from professor to professor, or even from lecture to lecture) and then go on to say things like "...and the proof is trivial" or "...it obviously follows that...", and I'd sit there wanting to shout "NO, NO it's not obvious!"

Finally, I'd find a tutor to explain to me what it was I was missing, and it really WAS obvious. If only it had been taught that way in the first place!

Admittedly, not everyone has the same learning style, but the classes I took seemed really tailored towards the students who already had the intuition that I lacked.




I agree, college mathematics education is usually horrible, especially at the start. I remember attempting to read my calculus 1 textbook and not understanding it at all. About 3 or 4 classes later, in discrete math, I realized why: the calculus book leaned on fairly foundational mathematical terms and concepts, such as sets and proof by induction, so it was small wonder it was incomprehensible to someone with a grade 12 mathematical education.

I confronted a professor about why they have it backwards and don't teach a discrete math course or a similar foundational course FIRST, so people can actually read their textbooks, or at least let people take that path. They basically said that since it's not relevant to many majors, and it's harder for most people who don't have the 'mathematical mind', they do it the backwards way. The professors, being Math PhDs, don't adjust any of their classes enough for the lack of foundational knowledge. It frustrated me very much. I don't think it's a big mystery why you probably hit that college wall when most college curriculums are set up that way.


I think this is due to the fact that many, many years ago logic used to be taught in high school, and all of those concepts would have seemed significantly less foreign.


You have hit upon my biggest gripe with mathematicians, their love of 'notation', or more precisely their love of writing in symbology that makes mathematics seem more arcane than it actually is.

Whenever I've run up against "impenetrable" math, I often ask "So how would you use this?", and connecting it to the real world has helped tremendously.


Mathematical notation is a must. Math in "plain English" would be as big a nightmare as programming in plain English. Natural language is so ambiguous that your "plain English" would turn into legalese every time you had to explain something unambiguously to someone who didn't already know it.

The problem isn't the notation per se, it's that teachers don't spend nearly enough time explaining the notation itself. It's a foreign language that they are so skilled at that they don't understand how unfamiliar it is to their students.

I was well into a physics major before I really stopped and carefully considered the many different notations used to represent derivatives (dx/dy, f'(x), y', ẏ, D_y x, etc.). I realized that I had developed separate context-specific bodies of calculus knowledge/skill from different fields with different notations and approaches, and that these were all the same thing. Different techniques in different contexts were an artifact of different styles of notation, not differences in the actual math.

I can understand Italian to some extent as a side-effect of my study of Spanish. If I took an electronics class in Italian, I would understand some of the concepts and misunderstand others. Would my trouble understanding certain electronics concepts be due to trouble with electronics or trouble with Italian? Who knows? Both types of misunderstanding would compound each other.

If teachers spent more time carefully teaching this foreign math language before (and while, and after) using it to teach math, I'd guess a lot of students' "math problems" would magically disappear.


The problem isn't with using notation. Lots of people on HN are programmers, we understand the value and necessity of unambiguous artificial languages.

The first problem is that the notation is usually "the first symbol that popped into some random genius's head 200 years ago". And once the notation is set, it's set, no matter how poor it is, or how many other places it's already in use etc etc. Then, as you note, sometimes there are multiple notations. Ugh.

The second problem is closely related to the first. Mathematical notation is write-optimised. This makes sense because of the long history. But that doesn't change that write-optimised languages are harder to read and understand, even for experts, than read-optimised languages.

In a programming context if you today reduce all your variable names to single latin letters and all your function names to single greek letters, you will be widely mocked and reviled. In maths it's Just How Things Are Done.

I guess what I'm saying is: the curse of mathematical notation is pen and paper. The boundaries of QWERTY liberated (almost all) programming languages from the curse.


Good maths proofs proceed step by step, with intermediate goals written in clear English (or French, for me). And to go from step to step, you don't need explicit long names, just like you don't name your loop index "index_over_collection" but just i, because your function will only be a few lines long.

I actually really like mathematical notation. Once you get used to it, the terseness can make things a lot clearer than natural language.

(If you've ever tried to read old mathematical articles / books that are a few centuries old, you'll understand the power of mathematical notation. Check out "God Created The Integers" or "On the Shoulders Of Giants" by Stephen Hawking if you're curious: these are two books that provide excerpts of highly influential works from earlier mathematicians and physicists, respectively)


> In a programming context if you today reduce all your variable names to single latin letters and all your function names to single greek letters, you will be widely mocked and reviled.

Serious question. What do you think is the general impression of APL programmers? =)


I thought about mentioning APL, but sometimes the correctness of an argument is best served by inexactitude.

I already compromised by adding "today" and "almost all" as qualifiers.


What would you use other than symbols? Describing everything with words? For example:

When given two numbers, if one wishes to find the quantities that give 0 when the second of the numbers is added to the product of the quantity and the first number and the quantity multiplied by itself, one should negate the first number and then either add or subtract the square root of the sum of the square of the first number minus four times the second number, and divide this summation by two.

That's just the quadratic formula in disguise:

Let b and c be real [or complex] numbers, then

   x^2 + b x + c = 0
implies

   x = (-b +- sqrt(b^2 - 4c)) / 2
Clearly the latter is easier to understand and digest; the same holds for higher mathematics. In fact, there are multiple interpretations of the text (admittedly I just wrote it now, and I'm not the best writer), while the symbolic mathematics itself is essentially unambiguous (given some background in symbolic algebra).
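The symbolic version also translates directly into code. A minimal Python sketch of that same formula (my own illustration, using cmath so complex roots work too):

```python
import cmath

def quadratic_roots(b, c):
    """Roots of x^2 + b*x + c = 0, via x = (-b +- sqrt(b^2 - 4c)) / 2."""
    d = cmath.sqrt(b**2 - 4*c)  # square root of the discriminant
    return (-b + d) / 2, (-b - d) / 2

# x^2 - 5x + 6 = 0 factors as (x - 2)(x - 3)
print(quadratic_roots(-5, 6))  # roots 3 and 2
```

Note how close the code is to the notation: a one-line formula in symbols stays a one-line formula in code, while the plain-English version above would be a paragraph-long function comment.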

(Saying "let's do maths without the symbols" is a little like saying "let's do programming without special languages"... it is very very hard to make it work.)


Ah yes - I used to love reading Euclid's proofs [1]. Wonderful descriptions. He used such poetic phrases too - things like describing a line as 'a length without breadth.'

As wonderful as it is, mathematics needs notation, and lots of it. You can express incredibly complex ideas in mathematics, totally unambiguously, through a collection of symbols. Not to mention that they're universally recognised.

The reality is that mathematics is 100% about thought. You'll struggle to put together the concepts in your mind long before the notation is the real issue. Once you have a clear picture of the abstract space you can use the notation you've learned to communicate the world you've created to others. What could be more wonderful?

[1] http://aleph0.clarku.edu/~djoyce/java/elements/bookVI/propVI...


Agreed. Plus, the current notation is the result of multiple iterations by some brilliant people. If you look at older attempts at formal mathematical notation, some of it is laughably bad in comparison. We are standing on the shoulders of giants :)


Pictures are worth a thousand words! Now let me put the quadratic formula in disguise as well.

Suppose we wish to make a rectangle with a given area and perimeter. This is an interesting problem! Does the number of possible answers depend on the specific area and perimeter? Certainly! It all comes down to thinking about squares, since squares maximize the area given a fixed perimeter. If the area of a square with the given perimeter is LESS than the desired area, then there's no way we can make such a rectangle. If the area of the square is equal to the desired area, then that's our only answer! Now, how about if our square's area is larger than the desired area? We'll get two possible different lengths of a given side of the rectangle - one representing the rectangle's width, and the other representing it's height. Or we could also think of them as two different rectangles - a tall one, and its rotation by a quarter turn (which makes it wide). By symmetry, we know that the difference between the square's side length and the shorter side will be the same as the difference between the square's side length and the longer side. How large is that difference? Exactly enough to diminish our shape's area from the square's area to the desired area. And that difference in length is simply the square root of the difference between the square's area and the desired area!

It would have been better with pictures :). Anyway, the quadratic formula is probably the greatest mistake in all of mathematics education. Somehow we use the word "quadratic" and even the phrase "complete the square," but never have I seen anyone actually draw said square!

While notation is often great for expressing ideas concisely and precisely, I think an excess of notation is not a good way to communicate concepts. Nobody should memorize the quadratic formula! We should instead understand how to think about areas and lengths, and then we can solve whatever quadratic problems we want.
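The geometric construction a couple of paragraphs up is easy to check numerically. A small Python sketch following the same steps (my own, not the commenter's):

```python
import math

def rectangle_sides(perimeter, area):
    """Sides of a rectangle with the given perimeter and area, found
    geometrically: start from the square with that perimeter, then
    lengthen/shorten each pair of sides by d = sqrt(square_area - area)."""
    s = perimeter / 4              # side of the square with this perimeter
    square_area = s * s
    if square_area < area:
        return None                # no such rectangle exists
    d = math.sqrt(square_area - area)
    return s - d, s + d            # the short and long sides

# Perimeter 14, area 12: the square has side 3.5 and area 12.25,
# so d = sqrt(0.25) = 0.5, giving sides 3 and 4.
print(rectangle_sides(14, 12))  # (3.0, 4.0)
```

Since the sides w and h satisfy w + h = P/2 and w*h = A, this is exactly the quadratic formula for t^2 - (P/2)t + A = 0, just told as a story about squares.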


I agree with you, but not everything can be done pictorially.

FWIW, an animation of the quadratic formula/completing the square: http://en.wikipedia.org/wiki/File:Completing_the_square.gif


With more notation, some readability is lost again:

Let x, b, c \epsilon C:

x^2 + b x + c = 0 => x = (-b +- sqrt(b^2 - 4c)) / 2


Well, this is highly tangential, but if you're talking about LaTeX formatting, \epsilon is not generally what you want for the "element of" symbol. Instead, use \in. The biggest difference is that \in is a binary operator and has the appropriate spacing.

Example of the difference: http://i.imgur.com/gwAqirx.png

generated by code:

  \textbackslash{}epsilon: $\epsilon$ \\ 
  Example: $x \epsilon \mathbb{R}$
  
  \textbackslash{}in: $\in$ \\ 
  Example: $x \in \mathbb{R}$
If you use \mathbin{\epsilon} instead, you'll get proper binary operator spacing, but you'll still get odd looks from people who are accustomed to \in. Admittedly, the symbol did historically begin as an epsilon, but that notation died off a while ago.


Notation is not a bad thing. What is bad is an inconsistent, ethereal form of notation.

Computer science is also essentially about notation and vocabulary, but we have to make our notation understandable to the computer, which is a much higher standard than what mathematicians have to adhere to.

We are in a field that demands a much higher level of rigor than mathematicians are accustomed to, as much as they'd hate to hear it.


Mathematicians have a relatively fluid relationship with notation. Since higher math is constantly introducing new abstractions, or applying new techniques to old abstractions, a large fraction of influential papers introduce new notation or reuse old notation in a slightly different way. I think mathematicians would happily admit that.

That said, it does bring me back to a math class where a professor, after realizing he needed to introduce a subscript, to a subscript, to a subscript of something that had both a subscript and superscript already, made a comment along the lines of "please excuse my poor notation."


I certainly agree that it's the inconsistency. With a straight-up notation handbook and an agreed-upon "language", I would have been happy to get over the hump once, maybe twice. But when I spend 15 minutes reassuring myself that the notation in this proof/formula/paper is just a variation of another equivalent notation, it irritates me, and that interferes with my appreciation of the concepts being presented. Do it enough and I just throw the paper out.


There was some point in my education where I was taking three classes, each with their own definition of phi. It drove me nuts.

Coq notation, lisp-style notation, even python-style notation- anything would be better.


I remember taking three classes: informal logic (philosophy), which ended up covering formal logic anyway; electrical engineering; and a math class. It made the logic class really easy, since I already knew the material from EE. They all had different notation for 'implies', 'not', 'and', 'or', etc. You could argue that computer languages are a fourth notation, but I won't.


You're confusing pedantry with rigor. The fact that a compiler will complain at you if you misspell 'continue' but a mathematical proof will keep going just fine doesn't mean that programming is more 'rigorous' than mathematics.


I'll disagree that it's the use or non-use of notation. I had the great joy of taking an Intro to (mathematical) Logic class with Dr. Richard Vesley (who had Kleene as his Ph.D. advisor!) By that point, I'd had many math classes with instructors all over the bell curve.

Dr. Vesley blew them all away. There was notation, but the real clincher was his absolute clarity of communication. He covered a lot of material, but the pace never felt rushed. In fact, it was so calm and so clear it was refreshing, more like meditation by a babbling brook. I wasn't the only one to feel that way -- the whole class seemed to have a similar experience.

Related to the story in TFA, a friend of mine with a towering math background said a few years ago, "I remember when math was easy -- back when I had time." Math that we've learned and mastered is "easy", but new areas of math can require a LOT of mental energy to gain traction in.


University math departments really should reconsider their approach to real world applications. Most math professors don't seem to consider it a part of the curriculum, and that can be really detrimental.

My largest college regret was blowing off linear algebra: it was an annoying class taught in an annoying way (handwritten homework showing your work for each step of matrix multiplication, no proofs, etc.). I blew it off because there were no applications of it in anything I cared about.

A semester later, it showed up somewhere in every single advanced computer science class. Really wish there had been a proof-based linear algebra class that showed up later in the curriculum so by the time we reached it we knew it had value.
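As one hypothetical illustration of where it resurfaces in CS (graphics being the classic example), rotating a 2D point is just a matrix-vector product. A pure-Python sketch:

```python
import math

def mat_vec(m, v):
    """Multiply a 2x2 matrix (list of rows) by a 2-vector."""
    return [m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1]]

def rotation(theta):
    """Standard 2D rotation matrix for angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

# Rotate the point (1, 0) by 90 degrees: it lands on (0, 1).
x, y = mat_vec(rotation(math.pi / 2), [1, 0])
print(round(x, 6), round(y, 6))  # 0.0 1.0
```

The same matrix-vector pattern shows up in 3D graphics, machine learning, graph algorithms, and signal processing, which is roughly why the course haunts every later CS class.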


I think you need to go through that with linear algebra, though. There's a stage early on where you just have to multiply matrix after matrix until it's second nature. Shame your course wasn't taught in a compelling way and it put you off. Having said that though - I had no idea how useful linear algebra was until much later in life.


Same exact story here. Linear algebra seemed like a tautology, I was ranting about it until I started using it in CS...


There are plenty of worthwhile subjects in Math which have no concrete application to the real world. One needs to be able to understand these abstract ideas for what they are, not merely formality surrounding a simple real life phenomenon.


Yeah, I totally agree. I took one of the self-driving-car programming classes online and found that while the instructor was (obviously) really smart, he sucked at programming. He tried to write out mathematical equations in Python rather than structure the code in a way that simply described what was going on. At the end of each mini-lesson I'd refactor his code so that it made sense (mostly small stuff, like changing variable names from their corresponding mathematical symbols to the word for what they actually were, or factoring blocks into methods). Eventually I gave up, not because I couldn't understand the domain of self-driving cars (he was great at explaining that stuff), but because I couldn't keep up with the mathematical syntax (and the online course kept wiping my code and resetting it with his, which was extremely frustrating).


If you do not learn the mathematical structure it is going to be hard to do anything further in the field after the course. Why not just bite the bullet and properly learn the pre-reqs?

With a basic course in linear algebra (such as Gilbert Strang's on MIT OpenCourseWare) and potentially some intro calculus you should fly through that course.


Are you talking about Sebastian Thrun's AI class on Udacity [1]? I haven't yet taken it, but I have on my todo list.

[1] https://www.udacity.com/course/cs373


Yes, he is.


Reminds me of my friend who once said, "I understand calculus, but whenever I see the little snake (the integral symbol) I don't know what to do."

But following up on your comment: the difficulty of math is that it is built on foundations. If you miss something because you were distracted, that hole will be an impediment over and over, and it will create more holes, until it is very difficult to make progress.


Structure and Interpretation of Classical Mechanics by Sussman and Wisdom explores some of the issues surrounding mathematical notation. Their primary thesis seems to be that by using uniform notation, s-expressions in this case, we can better understand and reason about mathematical concepts than what using standard math notation permits.
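To make the "uniform notation" idea concrete: in s-expression style, every operation is written the same way, as (operator, operands...). A toy Python sketch of that idea (my own illustration, not the actual Scheme system from the book):

```python
import math

# Every expression is a nested tuple (operator, arg1, arg2, ...),
# so kinetic energy (1/2) m v^2 becomes one uniform tree instead of
# a mix of fractions, juxtaposition, and superscripts.
OPS = {
    '+': lambda *xs: sum(xs),
    '*': lambda *xs: math.prod(xs),
    '/': lambda a, b: a / b,
    'expt': lambda a, b: a ** b,
}

def evaluate(expr):
    """Recursively evaluate an s-expression tree of numbers and operators."""
    if not isinstance(expr, tuple):
        return expr                      # a bare number evaluates to itself
    op, *args = expr
    return OPS[op](*(evaluate(a) for a in args))

# (* (/ 1 2) m (expt v 2)) with m = 3, v = 4
kinetic = ('*', ('/', 1, 2), 3, ('expt', 4, 2))
print(evaluate(kinetic))  # 24.0
```

Because the notation has only one rule, the expression can be evaluated, differentiated, or pretty-printed by walking the same tree, which is a fair part of Sussman and Wisdom's argument.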


> Sussman explores some of the issues surrounding mathematical notation

see this video, starting at 8mins: http://www.infoq.com/presentations/Expression-of-Ideas


> and I'd sit there wanting to shout "NO, NO it's not obvious!"

Toward the end of my degree program, I became the annoying guy in class who would do exactly that. I remember one time in particular (I think the topic was something on wavelets, which I barely remember now anyway) when I stopped the professor and said "Can you explain all of that over again, from the beginning?"

Worked great for me, but I'm not sure what the rest of the class thought of it. At the time I just assumed they were as lost as me and would appreciate it, but that may well have not been the case.


I bet a lot of them were lost. If I were in your class, I probably would not have been bothered by your question.

The questions that bother me are the type that stroke the ego of the person asking because they already know the answer.


Good for you. It didn't work for me at university. We had one lecturer who was great and would adjust the material on his OHP transparencies and explain in detail when questions were asked. Unfortunately, questioning the others or asking for clarification would result in either being summoned by your tutor for a bollocking or being asked to leave instantly for "not reading the material".

I quit after the first year. Best thing I ever did.


>> wanting to shout "NO, NO it's not obvious!"

Usually "obviously", "trivial" and the like are used to point out that "this should be obvious/trivial by now"; if it isn't, you are falling behind, need to study more, or need to be better prepared before class.


That is how I read these signals, and sometimes it was true. I'd study a bit more, meet with the class's TA or a university-provided tutor to go over the material, and I'd be fine.

But even then, in many cases, the professor was simply expecting us to have made an intuitive leap. And those of us who hadn't made it were left behind, with no explanation as to why or what it was we needed to understand.


You know you've stepped into the woods when you stop hearing "this is obvious" in math classes...


I was an art major in a mostly engineering school. I always had trouble with math but loved learning it. I also hit a wall in college like you. The first day of class the teacher asked how many engineering students there were in the class. Just about everyone raised their hand. This was her sign that she could teach fast. I knew I was screwed immediately. I wish she had asked how many art students there were so we could go slower.

I remember the math stopped making sense. The teacher would do exactly as you described, saying things like "...it obviously follows that...", etc. A girl who sat next to me would try to explain but was no less clear than the teacher. All the engineering students just got it.

My grades started high and then fell rapidly each week until I hit a string of zeros for a month. I was too proud to ask for a drop, but eventually did, though only after skipping a month of classes. The teacher was kind enough to understand that I was trying but that my effort was for naught. She gave me the drop.

I've never pursued math any further, having felt defeated.


I think that's part of what the author is trying to approach in the article. I could be wrong, but the engineering students were probably immersed in math way more than you were, so concepts that seemed natural to them were only because they'd banged their head against it more often.

I run into this a lot when people talk to me about programming and I get something faster than they do. I've spent a lot of time reading book after book, listening to podcasts, learning new languages, and studying new concepts, so it can be really easy for me to fit new information or ideas into some context and get them. I don't think that I'm necessarily smarter for it. I think I've just had a passion for it, so I never get tired of reading the articles and absorbing the material.

Now I'm personally running into the place where there are a lot of things I've wanted to learn for awhile, but my weak background in math hinders me (machine learning, more advanced algorithm analysis, signal processing, machine vision, etc).

I'm working my way through a calc book. I don't think I could have done it in your position either, though. I'd have psyched myself out. I've got to learn it on my own, at my own rhythm.

Kind of rambly, but you should find something related to what you like and maybe jump back in. Just find a learning mechanism that's suited to your background :)


I got As in almost all my math classes without really understanding anything. It was like a train I couldn't stop. I couldn't ever slow down for a month and say "I'm going to work on trigonometry this whole month so I know what I'm doing when I do integration." I just had to memorize the steps to get the right answer, get my A, forget it all, and wait for whatever was coming next.

Years later, I went straight back to the beginning and figured everything out, starting with Serge Lang's Basic Mathematics. I didn't even know where the Pythagorean Theorem came from, and when I learned it the second time around, it was damn beautiful.

My advice is to go all the way back to the beginning and get a book written by a real mathematician. I.M. Gelfand's Algebra and Trigonometry were truly enlightening.


Thanks for the book suggestion. I, too, would like to go through and replace the "method" regions with "conceptual" understanding.


In that vein, may I also suggest Ordinary Differential Equations by V.I. Arnol'd.


> So many professors would [...] say things like "...and the proof is trivial" or "...it obviously follows that...", and I'd sit there wanting to shout "NO, NO it's not obvious!"

Agreed. What makes a great teacher is not the depth of their knowledge of the subject matter, but how well they are able to put themselves in their students' shoes and overcome the "Curse of Knowledge"[1]. Unfortunately for students, university professors are usually hired for the former (knowledge & research) and not the latter (real teaching ability).

[1] http://en.wikipedia.org/wiki/Curse_of_knowledge


I really felt college-level math (abstract algebra, groups, monoids and such) was abstract painting in hieroglyphics, until I made a full turn into programming, where you start to speak about abstract patterns that make absolutely no sense at first (iterators, monads). Then I felt that this way of thinking was about finding your own solutions by being "mathematical" in how you model your problem. Abstract algebra started to feel like 'math + generics', and I went back to re-reading my college textbooks. I still don't understand more than 15%, but I feel I have a chance.

I agree about the way it's taught: teachers either forgot their own learning process, or they're all very advanced minds aiming at younger advanced minds that can unfold the possible applications behind the abstractions.


I know that my personal experience won't be much help to the many HNers who are past high school or early college age, but I can say that in my case my love and understanding of maths was inspired by two great teachers: my high school maths teacher and my calculus prof in my first year of college.

Maths is not about arithmetic computations or getting the "exact" answer; it's realizing that things like convergent series or the real numbers are extraordinary, almost magical, as if they somehow come from a different universe. Sometimes one is lucky enough to have these things revealed to them, as happened for me.


This resonates with me- particularly the 'it obviously follows', (huge jump in complexity and then --->) 'so we already know that x, so obviously- y'.

I think a lot of teachers aren't cognizant of the fact that what is obvious to them isn't automatically obvious to students. I had what I would call, at best, a mediocre HS math teacher, and I completely tuned out. It was my fault in the end, but the teacher didn't help.

I'm now trying to learn a bunch of things that require math and mathematic theory when you get to higher levels. So - learning math is what I have to do. It's kind of fun, and yes it is hard work. Like, brain-hurting hard work.


I usually just raise my hand and ask the professor to elaborate. If it's genuinely time intensive, they'd ask me to stop by during office hours.


> I hit a wall in college where math just stopped being something I intuitively "got".

That's the college experience. Everyone who goes to that school is smart, and the professors understand that, so they teach at that level; all that remains is hard work.



