Feynman's 1982 lecture, "Simulating Physics with Computers," explains why classical computers are terrible at simulating quantum systems.
The basic problem is that the number of states grows exponentially with the size of the system: fully describing n two-level particles (qubits) means tracking 2^n complex amplitudes. You very quickly have to start making approximations, and even relatively small systems take an enormous amount of classical computing power and memory to handle.
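To make the exponential concrete, here's a minimal Python sketch of the memory a brute-force state-vector simulation would need. It assumes one complex128 amplitude (16 bytes) per basis state; the qubit counts are illustrative, not tied to any particular simulator.

```python
# How much memory does a full state vector take as the system grows?
# Assumption: one 16-byte complex128 amplitude per basis state.
BYTES_PER_AMPLITUDE = 16

for n_qubits in (20, 30, 40, 50):
    n_states = 2 ** n_qubits                     # doubles with every qubit
    memory_gib = n_states * BYTES_PER_AMPLITUDE / 2**30
    print(f"{n_qubits:2d} qubits -> {n_states:,} amplitudes, "
          f"{memory_gib:,.3f} GiB")
```

At 30 qubits you already need 16 GiB; at 50 qubits it's about 16 million GiB, far beyond any classical machine, which is exactly why the approximations become unavoidable.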
The approximations don't just introduce small errors, either. To simulate a quantum system classically, you have to make drastic assumptions that change the nature of the system itself. That's very different from, say, an approximation that adds a small amount of noise you can estimate and bound: here, the approximation can radically change the system's behavior, in ways you might not understand or be able to quantify.
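As a toy illustration of this point (my example, not Feynman's): one common drastic assumption is to ignore entanglement and represent the system as a product of single-particle states, as mean-field-style methods do. For a maximally entangled Bell state, no product state can exceed fidelity 1/2 (the square of the largest Schmidt coefficient), so the approximation isn't off by a little noise, it misses half the state outright. The numpy sketch below searches over random product states to show that ceiling.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): maximally entangled two-qubit state.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Product-state ansatz: |a> ⊗ |b>. By the Schmidt decomposition, its
# fidelity with the Bell state can never exceed 0.5. Random search to
# see the ceiling empirically.
rng = np.random.default_rng(0)
best = 0.0
for _ in range(100_000):
    a = rng.normal(size=2) + 1j * rng.normal(size=2)
    b = rng.normal(size=2) + 1j * rng.normal(size=2)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    fidelity = abs(np.vdot(np.kron(a, b), bell)) ** 2
    best = max(best, fidelity)

print(f"best product-state fidelity ≈ {best:.3f}  (theoretical max: 0.5)")
```

The error here isn't a small, estimable perturbation; the ansatz simply cannot express what makes the state quantum in the first place.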