Yes, the number of languages will grow, but their adoption will be much slower and harder to achieve than it is now (and it's already incredibly difficult).
You might have written the DSLs, but the LLMs are unaware of them and will hallucinate when asked to generate code in those DSLs.
For the past few weeks I've been slowly getting back into Common Lisp. Even though there's plenty of CL code on the net, its volume is dwarfed by Python or JS. As a result, both GitHub Copilot and ChatGPT (4o) have an accuracy of maybe 5%. I'm not kidding: they're unable to generate even very simple snippets correctly, hallucinating packages and functions.
It's of course (I think?) possible to make a GPT specialized for Lisp, but if the generic model performs poorly, it'll probably make people wary and keep them away from the language. So, unless you're ready to fine-tune a model for your language and somehow distribute it to your users, you'll see adoption rates dropping (from already minuscule ones!)