I think the schools spend far too much time on symbolic differentiation and integration. This limits the exercises to the kinds of toy problems that yield to those methods. Kids get sidetracked solving anti-differentiation puzzles, while the fundamentals are relegated to a supporting role for those (largely useless) puzzles.
After 20 years of engineering, numerical methods have, in almost every case, been the only way forward. In hindsight, a year-and-a-half-long course to convey the fundamentals seems excessive.
While numerical methods are absolutely critical in practice, analytic methods like those you learn in calculus and diff-eq are _absolutely_ essential to understanding the physical world.
You can't actually _understand_ numerical methods without a fairly deep grounding in analytical methods.
The real problem here is a lack of context. Engineering and most science curriculums take a "short-cut" through mathematical education. They try to teach just enough math to get through the major coursework. As a result you end up with students who feel it's all just one big memorization trick.
> analytic methods ... are _absolutely_ essential to understanding the physical world
How so? My experience has been that the "physical world" is where the symbolic approach completely breaks down.
> Engineering and most science curriculums take a "short-cut" through mathematical education.
The only people taking more math than scientists and engineers are mathematicians. A year-and-a-half course covering the limit, tangent-at-a-point, functions of tangent-at-a-point, area-under-the-curve, and generalizing all of that to higher dimensions doesn't seem like much of a "short-cut" if you ask me.
That's only true if you're allowing yourself to take the numerical methods as a black box. If you want to be sure that your method is okay, you prove error bounds as you take the mesh size to zero. Otherwise it's just building tables of black magic for which scheme to use when.
That's kind of where I was going with this. Instead of burning all of that time on symbolic differentiation, dig down into numerical methods ASAP so students can get a feel for all of the related "gotchas"--of which there are many...
edit: IMHO, many of those "gotchas" are much more interesting than the fundamentals of calculus.
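To give one concrete example of such a gotcha (my own toy illustration, standard numerical-analysis fare): finite-difference derivatives get *worse*, not better, once the step size shrinks past a point, because catastrophic cancellation in the subtraction overwhelms the truncation error.

```python
import math

def forward_diff(f, x, h):
    # one-sided finite-difference estimate of f'(x)
    return (f(x + h) - f(x)) / h

x = 1.0
exact = math.cos(x)  # d/dx sin(x) = cos(x)

# Truncation error shrinks with h, but the subtraction f(x+h) - f(x)
# cancels more and more leading digits, so rounding error grows.
err_moderate = abs(forward_diff(math.sin, x, 1e-8) - exact)   # near the sweet spot
err_tiny = abs(forward_diff(math.sin, x, 1e-15) - exact)      # cancellation dominates
```

Shrinking h by another seven orders of magnitude makes the estimate dramatically worse--exactly the kind of behavior that symbolic drill never prepares you for.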
Consider symplectic integrators. You would never come up with them, or even recognize the problem of energy drift, if you hadn't first paid attention to the fundamentals of the geometry and calculus underlying the problem.
This is just my favorite example, but it illustrates how understanding the fundamentals also explains the gotchas. Getting a feel for them purely through experience is, again, just black magic: building up a table of what to use when, without the generalizing principle behind it.
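For anyone who hasn't seen the energy-drift point, here's a minimal sketch (my own toy example, the unit harmonic oscillator, not anything from a real code):

```python
import math

def explicit_euler(q, p, h):
    # naive step: both updates use the old state
    return q + h * p, p - h * q

def symplectic_euler(q, p, h):
    # update momentum first, then use the *new* momentum for position
    p_new = p - h * q
    return q + h * p_new, p_new

def energy(q, p):
    # Hamiltonian of the unit harmonic oscillator
    return 0.5 * (p * p + q * q)

h, steps = 0.01, 10000
qe, pe = 1.0, 0.0  # explicit Euler state
qs, ps = 1.0, 0.0  # symplectic Euler state
for _ in range(steps):
    qe, pe = explicit_euler(qe, pe, h)
    qs, ps = symplectic_euler(qs, ps, h)

# Explicit Euler's energy grows without bound; symplectic Euler's stays bounded.
drift_explicit = abs(energy(qe, pe) - 0.5)
drift_symplectic = abs(energy(qs, ps) - 0.5)
```

The two integrators differ by a single line--which state the position update uses--and nothing in the code itself hints at why that matters. The explanation is the symplectic geometry underneath, which is exactly the "generalizing principle" point.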
Never said to do away with the fundamentals. Will say that most symbolic differentiation and integration (which is a big chunk of the coursework) is not fundamental as much as it is fruitless busywork.
Even so, I spent a year of my time--and God knows how much of other people's money--grinding out the mathematical equivalent of crossword puzzles so I could get my job certificate--just like every other engineer.
Use that same time to apply the fundamentals to numerical methods, and you get to go in far more interesting directions--like symplectic integrators, or chaos theory.
>Never said to do away with the fundamentals. Will say that most symbolic differentiation and integration (which is a big chunk of the coursework) is not fundamental as much as it is fruitless busywork.
It's no more busywork than being able to multiply two single digit numbers in your head. Whether it's useful to your job really depends on the job. I had a job once in the engineering industry. When we were in meetings discussing projects, if you could not do those types of analyses (e.g. asymptotic behavior of certain Calc II type integrals) in your head, you would not know what's going on. Sure, everyone could explicitly show all the steps for your benefit, but you'd be slowing everyone down.
> "physical world" is where the symbolic approach completely breaks down.
It does not "break down"; it just becomes intractable in certain cases.
One needs to be able to solve problems that have all but the most essential details stripped out in order to develop a sense of how physical law actually works. Many times that is even "good enough" to get to a solution.
The best way to do that is through analytic methods, which give you not only "an answer" but also tell you important features of the answer. These analytic solutions have "handles" you can use to ask "what-if" questions -- e.g., zeros in the denominator that indicate poles, behavior of the system as you take certain limits, geometric aspects such as symmetry, patterns in recurrence relations, and so on.
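To make the "handles" point concrete with the stock textbook example (my own illustration): the steady-state response of an undamped driven oscillator,

```latex
\ddot{x} + \omega_0^2\, x = F \cos(\omega t)
\quad\Longrightarrow\quad
x_p(t) = \frac{F}{\omega_0^2 - \omega^2}\,\cos(\omega t)
```

The zero of the denominator at ω = ω₀ is exactly such a handle: the closed form announces resonance before you compute anything, while a numerical run at any single driving frequency just hands you numbers.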
I would posit that the reason so many people wipe-out in undergrad physics is that the coursework insists on pounding the square peg of law into the round hole of analytic methods.
Something most people in the STEM fields refuse to acknowledge is that throwing away information complicates things just as often as it simplifies them.
I say that if the ball doesn't bounce forever, the equation should reflect that.
It entirely depends on your goals. If you want to be a productive engineer, then most of that stuff is not going to be useful in your job.
OTOH, if you study physics at an advanced level, it's rather shocking how effectively all that analysis models the world, despite throwing away a lot of information. Try studying solid state physics. It's crazy the number of assumptions they make, and yet the theory still produces very accurate results.
There's a reason Eugene Wigner penned an essay titled "The Unreasonable Effectiveness of Mathematics in the Natural Sciences".
If you ask any STEM professor or TA who teaches undergrads, the reason so many students "wipe out" is a lack of preparation in the fundamentals -- not just "calc 101", but even more basic than that: algebraic manipulation.
The material in a physics 101 course is just the barest minimum and it goes beautifully hand-in-hand with calculus 101.
Theoretical fluid dynamicist here. I solve a lot of not nice differential equations exactly or approximately.
Whether numerical methods are viewed as the primary way forward is a bit of a self-fulfilling prophecy. If you don't think analytical solutions end up being useful, you probably won't put in the work needed to generate them in the first place, so you never see the value.
Even if you go all in with numerical methods, you need to test your code. This requires an exact solution and knowledge of the convergence rate of the numerical scheme. The exact solution can be for a special case that is easy to solve. You might need multiple exact solutions to cover all the physics. You can also use techniques like the method of manufactured solutions, but if you don't like analytical methods you'd probably hate that.
You need to check if the empirical convergence rate matches the theoretical one. In practice this is rarely done, but it's essential towards eliminating bugs. So you can't entirely avoid exact solutions if you want to do purely numerics right. This was not covered in my first differential equations class, unfortunately, but I think it's an essential topic.
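As a minimal sketch of such a convergence check (my own toy example: the composite trapezoidal rule against a known exact integral):

```python
import math

def trapezoid(f, a, b, n):
    # composite trapezoidal rule with n uniform subintervals
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

exact = math.e - 1  # exact solution of the test problem: integral of e^x over [0, 1]
errs = [abs(trapezoid(math.exp, 0.0, 1.0, n) - exact) for n in (64, 128, 256)]

# Halving h should cut the error by ~4x for a second-order scheme, so the
# empirical order log2(err(h) / err(h/2)) should approach 2.
orders = [math.log2(errs[i] / errs[i + 1]) for i in range(len(errs) - 1)]
```

If the empirical order comes out at 1 when theory says 2, you have a bug--even though every individual answer may still look plausible, which is why this check catches errors that eyeballing never will.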
Exact solutions are often impossible, but less so than most people believe. I've produced exact solutions many times to equations people thought required numerics. The exact solutions are very valuable by themselves, as they can be evaluated much faster than numerical solutions in most cases and let you see the structure of the solution. I think you should always try hard to find an analytical solution, exact or approximate. It might be rare that you can do it, but the value is large, and if we stopped teaching these methods it would become much rarer.
As for the problem you raise in another comment, "pounding the square peg of law into the round hole of analytic methods": you should look into approximate analytical solutions, which give you a lot more flexibility. You still ultimately have the same problem, though.