No, it's not. It depends a bit on your teachers but also on which "level" of math you're in (in the US). Lower-level but still algebra/geometry classes tend to teach facts, not derivations from foundational concepts. Those are the classes aimed at non-Honors and maybe non-College Prep students (two of the three typical "tracks" US students end up in; names may vary by state and decade).
The way Geometry is taught in the US is awful. Instead of learning that you can use shapes to do useful calculations like square roots, you slog through postulates and theorems without any sense of why you have to do them. Rarely is what is learnt in geometry ever used in later high school courses, save for trigonometry. I hope it is different in other countries.
The US education system varies tremendously. I don't think you can credibly claim that this teaching method is "standard" in the US. At least my experience was much better than you described. Indeed it wasn't until university-level mathematics that I started to get into the "stop trying to understand how they work and just memorize these formulae". Fortunately my engineering classes provided a through-line for understanding how those formulae worked, and I was much stronger for it than my math-major contemporaries.
The best math class I ever had was a drafting class called "Descriptive Geometry". In that college class we used a drafting table to solve math problems.
An easy example would be the length of line of the corner seam of a hip roof. Sadly I have forgotten what the more complex problems were. Fantastic class that isn't offered anymore.
> Instead of learning that you can use shapes to do useful calculations like square roots
Huh?
You can easily draw shapes that conceptually represent square roots, but how do you get from that to calculating the square root? You'd need an infinitely-graded ruler.
Can you perfectly calculate 1/6? sqrt(2) has a similar exact representation as a periodic "sequence"; its continued fraction is [1;2,2,2,2,2,...].
That's of limited use if you're trying to measure something, but in that case, conceptual representations are out (where the symbol √ and a picture of a square are equally valid), and you're choosing between a geometric approach where your accuracy is limited by the quality of your tools, or a symbolic approach where your accuracy is limited by how much accuracy you want.
Continued fractions require an infinite number of operations to be worked out, while periodic decimals can even be finite in a different base, and don't require any operations to decode.
Continued fractions require an infinite number of operations to be worked out if you want to be exact, but that's also true of periodic decimals. You can terminate a continued fraction anywhere and get a best rational approximation.
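To make that concrete, here's a minimal Python sketch (names are my own) that terminates the continued fraction [1;2,2,2,...] at successive points using the standard convergent recurrence; each truncation is a best rational approximation to sqrt(2) for its denominator size:

```python
from fractions import Fraction

def sqrt2_convergents(n):
    """Yield the first n convergents of [1; 2, 2, 2, ...], the
    continued fraction of sqrt(2), via the usual recurrence
    h_k = a_k*h_{k-1} + h_{k-2}, k_k = a_k*k_{k-1} + k_{k-2}."""
    h_prev, h = 1, 1   # numerators: h_{-1} = 1, h_0 = a_0 = 1
    k_prev, k = 0, 1   # denominators: k_{-1} = 0, k_0 = 1
    for _ in range(n):
        yield Fraction(h, k)
        a = 2  # every partial quotient after the first is 2
        h_prev, h = h, a * h + h_prev
        k_prev, k = k, a * k + k_prev

convergents = list(sqrt2_convergents(6))
# 1, 3/2, 7/5, 17/12, 41/29, 99/70 — already within ~1e-4 of sqrt(2)
```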
You can refine your real-world realization of a geometric construction by, e.g., executing it at larger and larger scale. (Or doing something more clever about the errors.)
Similarly, you can keep working on your calculation of sqrt(2) as a decimal number, and keep adding digits.
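A minimal sketch of that "keep adding digits" refinement, assuming Python's exact integer square root (the function name is my own):

```python
from math import isqrt

def sqrt2_digits(n):
    """Digits of sqrt(2) to n decimal places, computed exactly as
    floor(sqrt(2) * 10**n) = isqrt(2 * 10**(2*n)); increasing n
    just extends the same digit string."""
    return str(isqrt(2 * 10 ** (2 * n)))

print(sqrt2_digits(10))  # 14142135623
```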
I don't think so. I was on the Calculus track in high school so we derived it in...Pre-Calculus.
Prior to that the quadratic formula was something that seemed to be handed down from on high. We used it in Algebra II and maybe even before that, but I had no idea where it came from.
It was a mind-opening experience when we derived it in class one day. Our teacher didn't ruin the surprise. She just said, let's complete the square on a general quadratic equation. And there it was. The quadratic formula!
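For anyone who hasn't seen it, completing the square on a general quadratic ax^2 + bx + c = 0 (a ≠ 0) goes like this:

```latex
ax^2 + bx + c = 0
\;\Longrightarrow\; x^2 + \frac{b}{a}x = -\frac{c}{a}
\;\Longrightarrow\; \left(x + \frac{b}{2a}\right)^2 = \frac{b^2 - 4ac}{4a^2}
\;\Longrightarrow\; x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
```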
> Our teacher didn't ruin the surprise. She just said, let's complete the square on a general quadratic equation.
How is this not ruining the surprise? The only possible outcomes of doing that are that (1) you make a mistake; or (2) you get a formula for solving quadratic equations. Quadratic equations have the same solutions regardless of your methodology, so there's only one formula you can get.
I can tell you how great my math education was: I don't even recall what the quadratic formula or completing the square is. I'm almost certain they were part of the curriculum at some point.
I recall in a later math course, notably calculus, the teacher assumed we were familiar with some concept because we were supposedly taught it the prior year, yet not a single student in the class could recall its having been taught before.
I distinctly remember "accidentally" deriving it when I forgot the formula on a test; perhaps my proudest math moment (though really I was just scrambling to apply any rule I could think of that got me closer to the vague form I remembered).
And then I got really annoyed that no one ever told me to do that before, and started discounting teachers for years onwards... probably to my own detriment.