
APL may save a lot of tokens for computational tasks (see the sketch below), but probably not for more general work such as backend development?

Also, I guess LLMs don't have much APL code in their training data, which might be a big problem.

LLMs are still very good at popular languages, so moving to APL for general tasks is probably a bad choice.
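
To make the token-count claim concrete, here is a minimal sketch (mine, not from the thread) comparing how a tokenizer splits an APL one-liner versus an equivalent Python line. It assumes OpenAI's tiktoken library and the cl100k_base encoding; the two snippets are illustrative, not a benchmark:

    # Minimal sketch: count tokens for two equivalent one-liners.
    # Assumes `pip install tiktoken`; cl100k_base is an arbitrary choice of encoding.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    snippets = {
        "APL":    "(+/÷≢)x",          # mean of vector x, tacit style
        "Python": "sum(x) / len(x)",  # same computation
    }

    for lang, src in snippets.items():
        print(lang, "->", len(enc.encode(src)), "tokens")

One caveat: APL's non-ASCII glyphs are often split into several byte-level tokens by BPE tokenizers, so terse source does not automatically mean fewer tokens.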



As for the depth of APL in the training set: yes, it's an issue. Still, it's worth seeing how well MoonBit [1, 2] works with LLMs despite facing exactly the same problem; their approach integrates the LLM directly into the parser pipeline.

1: https://www.youtube.com/watch?v=SnY0F9w1xdM
2: https://www.moonbitlang.com/blog/moonbit-ai


Hongbo blocked me on Twitter, lol



