C is statically typed, but I can still attempt to dereference a null pointer. Static typing doesn't save me here, nor does the compiler, as it's possible for these issues to happen at runtime.
This may be something that Haskell doesn't allow, but it's not something inherent to static typing.
Intercal is dynamically typed, but it doesn't save me any development time or make my code any shorter! (Well, maybe compared to Java :)).
On top of this, the C type system is not really about correctness at all. My understanding is that it primarily helps with performance and memory management (e.g. the compiler knows how big everything is) and with not accidentally using a non-pointer as a pointer. I'm not a C person, but C does not give off a vibe of caring about correctness.
In fact, C is particularly unsafe: you can get all sorts of fun things like bus errors and segfaults that are basically impossible in memory-safe languages. C definitely has a place, but only where correctness is much less important than performance.
Ultimately, you can always come up with a language bad enough to serve as a counterexample to any claim like this.
Also, the way Haskell avoids null errors like this is through its static type system. So while it's certainly not inherent to all static type systems (then again, nothing is inherent to static type systems except being checked at compile time), it is a property of Haskell's type system.
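For concreteness, here's a minimal Haskell sketch of what that looks like (the map and function names are made up for illustration): a value that might be absent has type Maybe Int rather than Int, so you can't use it without first handling the Nothing case.

    import qualified Data.Map as Map

    -- A lookup that might fail returns Maybe Int, not a nullable Int.
    lookupAge :: String -> Map.Map String Int -> Maybe Int
    lookupAge = Map.lookup

    describeAge :: String -> Map.Map String Int -> String
    describeAge name ages =
      case lookupAge name ages of
        Nothing  -> name ++ " is not in the map"
        Just age -> name ++ " is " ++ show age ++ " years old"

    main :: IO ()
    main = do
      let ages = Map.fromList [("alice", 30), ("bob", 25)]
      putStrLn (describeAge "alice" ages)
      putStrLn (describeAge "carol" ages)

With -Wall, leaving out the Nothing branch gets flagged at compile time as an incomplete pattern match, instead of showing up as a crash at runtime.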
I'm just 'arguing' that pitting static typing against dynamic typing using specific languages as examples isn't necessarily the whole picture. Saying that static typing will save you from attempting to call methods on None in Python is a fallacy. Saying that Haskell's static type system will save you is possibly correct.
My original point was that it's possible we (programmers) focus more on the issues that static typing (in some implementation) could address, just because they seem like a group of problems that could be 'easily' solved. I.e. 'the grass is always greener'.
Missing imports and redundant imports also go away.
Lots and lots of invariants in the program can be encoded as types too, so any bugs relating to them also go away.
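One small example of the kind of invariant I mean, using the standard Data.List.NonEmpty type (the firstScore function is just for illustration): "this list is never empty" lives in the type itself, so taking its head simply can't fail.

    import Data.List.NonEmpty (NonEmpty (..))
    import qualified Data.List.NonEmpty as NE

    -- The invariant "at least one element" is part of the type,
    -- so NE.head is total: there is no empty case left to crash on.
    firstScore :: NonEmpty Int -> Int
    firstScore = NE.head

    main :: IO ()
    main = do
      let scores = 42 :| [17, 99]  -- building a NonEmpty forces you to supply a first element
      print (firstScore scores)
      -- There is simply no way to hand firstScore an empty list.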
When you want parallelism, you get useful guarantees that adding it won't change the deterministic result you had before.
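A rough sketch of what I mean, assuming the parallel package (Control.Parallel.Strategies); compile with -threaded and run with +RTS -N to actually use multiple cores. The collatzLength function is just a stand-in for any pure, expensive computation.

    import Control.Parallel.Strategies (parMap, rdeepseq)

    -- Some pure, moderately expensive function.
    collatzLength :: Int -> Int
    collatzLength 1 = 0
    collatzLength k
      | even k    = 1 + collatzLength (k `div` 2)
      | otherwise = 1 + collatzLength (3 * k + 1)

    main :: IO ()
    main = do
      let xs = [1 .. 100000] :: [Int]
          -- Swapping map for parMap rdeepseq can only change when things are
          -- evaluated, never what is computed: both lists are equal by construction.
          sequentially = map collatzLength xs
          inParallel   = parMap rdeepseq collatzLength xs
      print (sum inParallel == sum sequentially)

Because the function being mapped is pure, the strategy only decides how the work is scheduled; the value of the result is the same as the sequential version by definition.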
I used to use Python, but after Haskell, there's no way I'd go back...