You'd think so, but 3.5-turbo was multilingual from the get-go and benefited massively from it. If you want to position yourself as a global leader, then excluding the 95% of the world who aren't native English speakers seems like a bad idea.
Constant infighting and most of the competent people leaving will do that to a company.
I mean more on a model-performance level, though. It's been shown that knowledge a model learns in one language can be output in any other language it knows. Skipping non-English data means leaving quality human data on the table. Besides, translation is one of the few tasks that language models are by far the best at when trained properly, so why not do something you can sell as a headline feature?