Hacker News

In a roundabout way, I wonder if this one fits what you're after:

https://bogart.openmathbooks.org/ctgd/ctgd.html

And more directly, a quick browse turned up a book called:

"Mathematical Notation: A Guide for Engineers and Scientists" which looks like it addresses your issue directly.



The issue is that I don't want to learn all of the notation explicitly, but step by step, topic by topic, with use cases in the real world...


Could you give a concrete example of what sort of notation caused you difficulty in the past? Asking because it seems odd to me that you feel you need to learn "all" the notation to get started.

Starting in elementary school you slowly build up topics, mathematical intuition and notation more or less in unity. E.g. starting with whole numbers, plus and minus signs before multiplication, then fractions and decimal notation. By the end of high school you may have reached integrals and matrices to work with concepts from calculus and linear algebra…

It makes little sense to confront people with notation before the corresponding concepts are being taught. So it feels like you may have a layperson's perspective on notation, with difficulties that are no longer obvious to more advanced learners.


Set theory comes to my mind as an example. I somewhat understood the notation, but books increase the pace so much.


Wait, let me make sure I understand: do you want to skip the notation altogether, or do you want more support in understanding it?


I want to learn the notation, just not everything at once. I need to be able to see real-world use cases, otherwise I won't be able to remember and apply the notation. What I meant is learning the notation step by step, topic by topic.
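As one possible illustration of that kind of topic-related pairing (my example, not the poster's), the set theory notation mentioned above maps almost one-to-one onto Python's built-in set operations, so each symbol can be learned next to a runnable use case:

```python
# Set-builder notation {x ∈ A : x is even} as a set comprehension
A = {1, 2, 3, 4, 5, 6}
evens = {x for x in A if x % 2 == 0}   # {2, 4, 6}

B = {4, 5, 6, 7}
union = A | B           # A ∪ B  → {1, 2, 3, 4, 5, 6, 7}
intersection = A & B    # A ∩ B  → {4, 5, 6}
difference = A - B      # A \ B  → {1, 2, 3}
is_subset = evens <= A  # evens ⊆ A → True

print(evens, union, intersection, difference, is_subset)
```

Seeing `∪` as `|` and `⊆` as `<=` in working code is exactly the sort of "notation alongside use case" learning path being asked for here.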





