> If you have tens of thousands of variables, you definitely don't want to risk dealing with the simplex method's worst-case behavior.
The risk is extremely low. In fact, the risk is so low that, for decades, it was an open question why the simplex algorithm performed so well in practice on a wide variety of workloads despite having poor worst-case complexity (unlike, for example, pure quicksort with a fixed pivot, which really does hit its bad cases in practice).
Spielman and Teng resolved that question in 2001 with smoothed analysis, showing that even very small random perturbations of a contrived worst-case input give polynomial expected running time: http://arxiv.org/pdf/cs/0111050.pdf
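To make the "good in practice" point concrete, here's a minimal sketch of solving a small LP with SciPy's `linprog` using HiGHS's dual-simplex backend (the problem data and solver choice are illustrative assumptions, not from the thread; real workloads with tens of thousands of variables solve routinely with the same call):

```python
from scipy.optimize import linprog

# Illustrative toy LP: maximize x + 2y
# subject to x + y <= 4, x <= 2, x >= 0, y >= 0.
# linprog minimizes, so we negate the objective.
c = [-1.0, -2.0]
A_ub = [[1.0, 1.0],   # x + y <= 4
        [1.0, 0.0]]   # x     <= 2
b_ub = [4.0, 2.0]

# "highs-ds" selects HiGHS's dual simplex; variable bounds default to [0, inf).
res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs-ds")
print(res.x, res.fun)  # optimum at x=0, y=4, objective value -8
```

Simplex implementations like this finish in a number of pivots that is typically near-linear in the problem size, which is exactly the empirical behavior smoothed analysis explains.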