Is it, though? The major selling point of coding LLMs is that you can use natural language to describe what you want. If minor changes in wording - changes that would make no difference to a human - can produce drastically worse results, that feels problematic for real-world scenarios.